
Recursive Network Based On Multiscale Pooling Attention And Hierarchical Feature Fusion For Image Super-Resolution

Posted on: 2021-01-13  Degree: Master  Type: Thesis
Country: China  Candidate: B Zheng  Full Text: PDF
GTID: 2428330611466440  Subject: Signal and Information Processing
Abstract/Summary:
Super-resolution (SR) reconstruction algorithms recover high-resolution (HR) images from their low-resolution (LR) counterparts, yielding higher-quality signals without upgrading the imaging system. Since selective attention plays a significant role in the human visual system, a super-resolution network augmented with attention can achieve better performance. Our research focuses on attention mechanisms combined with a lightweight recursive network and a deep convolutional network for SR reconstruction. The main work of this dissertation is as follows:

1. We propose a multi-scale pooling channel attention mechanism and construct a lightweight recursive network (RN) based on non-globally shared parameters. First, instead of the global pooling used in conventional channel attention, a multi-scale pooling module learns structural features in the spatial domain for channel-wise attention, so that channel and spatial pooling features jointly determine the channel weights and distinguish salient features from redundant ones. Second, to filter features differently at different layers, the parameters of the channel attention module are not shared, whereas the RN module shares parameters and uses residual and dense connections: the residual connection accelerates convergence, and the dense connection reuses the feature maps of each layer. In addition, a bottleneck layer keeps the dense network lightweight (a sketch of this design follows the abstract). Experiments show that, at a similar parameter count, this multi-scale pooling attention lightweight RN outperforms seven other lightweight models in PSNR/SSIM on standard datasets.

2. We propose a new attention weight assignment method that balances local layered-feature fusion against global importance. The local part applies an attention module that mixes features of the current layer with those of several adjacent layers, transferring attention weights across related layers (the fusion step is sketched below). The global part addresses the inconsistency between local layered saliency and global structured saliency by picking out missing low-frequency and high-frequency information to adjust the attention weights. Overall, this local-global attention method is a three-dimensional weight adjustment scheme: feature maps are reweighted channel-wise, important spatial regions are aggregated, and hierarchical features are fused. Experimental results show that it outperforms other attention-based and state-of-the-art networks in PSNR/SSIM on standard datasets.
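To make the first contribution concrete, below is a minimal PyTorch sketch (the framework choice is ours; the abstract names none) of a multi-scale pooling channel attention module and a recursive block that shares its convolution weights across recursions while keeping a separate, non-shared attention module per recursion and a 1x1 bottleneck over the dense concatenation. The pooling scales, reduction ratio, channel width, and recursion count are illustrative assumptions, not values from the thesis.

```python
import torch
import torch.nn as nn


class MultiScalePoolingChannelAttention(nn.Module):
    """Channel attention driven by features pooled at several spatial scales."""

    def __init__(self, channels, pool_sizes=(1, 2, 4), reduction=16):
        super().__init__()
        # Replace the single global pool of SE-style attention with a set of
        # adaptive pools, so coarse spatial structure also shapes the weights.
        self.pools = nn.ModuleList(nn.AdaptiveAvgPool2d(s) for s in pool_sizes)
        desc_len = channels * sum(s * s for s in pool_sizes)
        self.fc = nn.Sequential(
            nn.Linear(desc_len, max(channels // reduction, 4)),
            nn.ReLU(inplace=True),
            nn.Linear(max(channels // reduction, 4), channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c = x.shape[:2]
        # One descriptor vector per image, concatenated over pooling scales.
        desc = torch.cat([p(x).flatten(1) for p in self.pools], dim=1)
        w = self.fc(desc).view(b, c, 1, 1)  # per-channel weights in (0, 1)
        return x * w


class RecursiveBlock(nn.Module):
    """Shared conv weights across recursions; attention is not shared."""

    def __init__(self, channels=64, recursions=4):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, 3, padding=1)  # shared params
        self.relu = nn.ReLU(inplace=True)
        # Non-globally shared attention: a separate module per recursion.
        self.attn = nn.ModuleList(
            MultiScalePoolingChannelAttention(channels) for _ in range(recursions)
        )
        # 1x1 bottlenecks keep the densely concatenated features lightweight.
        self.bottleneck = nn.ModuleList(
            nn.Conv2d(channels * (i + 2), channels, 1) for i in range(recursions)
        )

    def forward(self, x):
        feats = [x]
        h = x
        for attn, neck in zip(self.attn, self.bottleneck):
            h = attn(self.relu(self.conv(h)))
            feats.append(h)                        # dense connection
            h = neck(torch.cat(feats, dim=1)) + x  # bottleneck + residual
        return h
```

For example, `RecursiveBlock(64)(torch.randn(1, 64, 48, 48))` returns a tensor of the same shape; only `self.conv` is reused across recursions, while each recursion carries its own attention and bottleneck parameters.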
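For the second contribution, the sketch below illustrates only the local half of the scheme: features of the current layer are fused with those of adjacent layers before being reweighted, so attention weights computed at one layer can influence related layers. It reuses the attention module from the previous sketch; the global saliency-consistency adjustment described in the abstract is omitted, and all names and sizes are hypothetical.

```python
class HierarchicalFusionAttention(nn.Module):
    """Local layered-feature fusion: mix adjacent layers, then reweight."""

    def __init__(self, channels, num_layers=3):
        super().__init__()
        # 1x1 conv fuses the concatenated hierarchy back to `channels` maps.
        self.fuse = nn.Conv2d(channels * num_layers, channels, 1)
        self.attn = MultiScalePoolingChannelAttention(channels)

    def forward(self, layer_feats):
        # layer_feats: feature maps of the current layer and its neighbours,
        # all with the same spatial size and channel count.
        fused = self.fuse(torch.cat(layer_feats, dim=1))
        return self.attn(fused)
```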
Keywords/Search Tags: deep super-resolution network, multi-scale pooling channel attention mechanism, non-globally shared parameters, hierarchical feature fusion, mixed attention