
Research On The Interpolation Method For Generating Asymmetric Motion Fields Based On Events

Posted on: 2024-03-22  Degree: Master  Type: Thesis
Country: China  Candidate: Y J Zhang  Full Text: PDF
GTID: 2568307064496964  Subject: Master of Computer Technology

Abstract/Summary:
Video frame interpolation, the synthesis of intermediate frames between existing frames of a video, is widely used for frame-rate conversion, slow-motion generation, and novel view synthesis. Video interpolation generally estimates inter-frame motion via optical flow approximation and then synthesizes intermediate frames by motion compensation. However, because of the brightness-constancy and linear-motion assumptions, traditional video interpolation methods struggle to generate accurate dense optical flow, which greatly degrades the quality of the synthesized intermediate frames.

In recent years, deep learning has developed rapidly, and convolutional neural networks have achieved great success in motion estimation and image synthesis, strongly influencing the field of video interpolation. Current deep-learning-based video frame interpolation methods approximate inter-frame motion by using convolutional neural networks to generate inter-frame optical flow, i.e., the inter-frame motion field. However, the limited information that two boundary frames can provide, and the inability to accurately estimate large displacements and nonlinear motion from the corresponding pixels of those two frames alone, restrict their application in high-speed and complex motion scenes.

Event cameras are bio-inspired vision sensors that respond asynchronously to the brightness change of each pixel and output a stream of events with high temporal resolution and low latency. Exploiting the motion information contained in events, this thesis proposes two event-based algorithms for generating asymmetric motion fields that accurately estimate inter-frame motion. By incorporating events, the three video interpolation methods proposed in this thesis address the main problems of inaccurate estimation of nonlinear motion and large displacements in video interpolation. The main research contents and contributions of this thesis 
are as follows:

(1) A detailed introduction to the background and development of techniques related to video interpolation, a step-by-step extension from image-based motion approximation methods to asymmetric motion approximation methods, and finally a derivation of the principles of two event-based asymmetric motion approximation methods.

(2) A two-stage asymmetric motion estimation network based on event-synthesized frames is proposed. First, a first-stage intermediate frame is synthesized by the synthesis module from the event information and the boundary-frame information; second, the bidirectional optical flow from the intermediate frame to the two boundary frames is estimated by the bilateral motion estimation module; finally, the intermediate frame is generated by motion compensation.

(3) A one-stage lightweight asymmetric motion estimation network based on event and frame fusion is proposed. Taking events and frames together as input, the bidirectional optical flow from the intermediate frame to the two boundary frames is estimated directly by the composite bilateral motion estimation module; the intermediate frames are then generated by motion compensation.

(4) By combining the above two algorithms, an event-based dual-stream video interpolation algorithm is proposed that dynamically fuses the results of multiple stages using dynamic convolution to generate high-quality intermediate frames.

(5) Validation on datasets commonly used in the field of video frame interpolation: the Vimeo90K and UCF101 datasets, the X4K1000FPS dataset of large-resolution images, and the HS-ERGB event-image dataset. The experiments show that the proposed method achieves the best visual quality and effectively alleviates the motion blur caused by nonlinear motion. On the Vimeo90K and UCF101 video interpolation datasets, the proposed method achieves PSNR improvements of 0.5 dB and 0.24 dB, respectively.
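To make the event input concrete: an event camera emits a stream of (x, y, t, polarity) tuples, and event-based networks commonly accumulate such a stream into a spatio-temporal voxel grid before feeding it to a CNN. The following is a minimal NumPy sketch of that common preprocessing step, not the thesis implementation; the function name and bilinear temporal weighting are assumptions.

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate an event stream into a (num_bins, H, W) voxel grid.

    `events` is an (N, 4) array of (x, y, t, polarity) rows with
    polarity in {-1, +1}.  Each event's polarity is spread over the two
    nearest temporal bins with bilinear weights, a common choice in
    event-based vision pipelines (an assumption here, not the thesis').
    """
    grid = np.zeros((num_bins, height, width), dtype=np.float32)
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    t = events[:, 2].astype(np.float64)
    p = events[:, 3].astype(np.float32)
    # Normalize timestamps to the range [0, num_bins - 1].
    t = (t - t.min()) / max(float(t.max() - t.min()), 1e-9) * (num_bins - 1)
    t0 = np.floor(t).astype(int)
    frac = (t - t0).astype(np.float32)
    t1 = np.minimum(t0 + 1, num_bins - 1)
    # Unbuffered in-place accumulation handles repeated pixel indices.
    np.add.at(grid, (t0, y, x), p * (1.0 - frac))
    np.add.at(grid, (t1, y, x), p * frac)
    return grid
```

The voxel grid preserves coarse timing information inside the inter-frame interval, which is exactly what a frame pair alone cannot provide for nonlinear motion.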
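Both networks in contributions (2) and (3) end with the same motion-compensation step: given the flow estimated from the intermediate frame to a boundary frame, each intermediate pixel is fetched from the boundary frame by backward warping with bilinear sampling. A plain NumPy sketch of that step, under the assumption of a single-channel frame and an (H, W, 2) flow field; this is illustrative, not the thesis code:

```python
import numpy as np

def backward_warp(frame, flow):
    """Backward-warp `frame` (H, W) using per-pixel flow (H, W, 2).

    For each target pixel (x, y), sample the source frame at
    (x + u, y + v) with bilinear interpolation, clipping sample
    coordinates to the image border.
    """
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    sx = np.clip(xs + flow[..., 0], 0, w - 1)
    sy = np.clip(ys + flow[..., 1], 0, h - 1)
    x0 = np.floor(sx).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    y0 = np.floor(sy).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    wx = sx - x0; wy = sy - y0
    # Bilinear blend of the four neighbouring source pixels.
    top = frame[y0, x0] * (1 - wx) + frame[y0, x1] * wx
    bot = frame[y1, x0] * (1 - wx) + frame[y1, x1] * wx
    return top * (1 - wy) + bot * wy
```

An asymmetric motion field simply means the two flows (intermediate-to-previous and intermediate-to-next) are estimated independently rather than assumed to be mirror images, so each boundary frame is warped with its own flow before blending.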
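Contribution (4) fuses candidate intermediate frames from the two branches with per-pixel learned weights. The sketch below illustrates only the fusion idea with a softmax over predicted score maps; the actual thesis uses dynamic convolution with learned kernels, so treat the function and its inputs as assumptions for illustration.

```python
import numpy as np

def fuse_candidates(frames, logits):
    """Per-pixel soft fusion of candidate intermediate frames.

    `frames`: (K, H, W) candidate frames from the K branches/stages;
    `logits`: (K, H, W) predicted per-pixel fusion scores.
    A softmax over the K axis yields blending weights, so each output
    pixel is a convex combination of the candidates at that location.
    """
    e = np.exp(logits - logits.max(axis=0, keepdims=True))  # stable softmax
    weights = e / e.sum(axis=0, keepdims=True)
    return (weights * frames).sum(axis=0)
```

Per-pixel weighting lets the fused result prefer the synthesis branch where flow is unreliable (e.g. occlusions) and the warping branch elsewhere, which is the motivation for a dynamic rather than fixed fusion.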
Keywords/Search Tags: Event Camera, Video Frame Interpolation, Optical Flow Estimation