
Research On Lightweight Image Super-resolution Network Based On Residual Channel Attention

Posted on: 2022-06-29    Degree: Master    Type: Thesis
Country: China    Candidate: Z Q Liu    Full Text: PDF
GTID: 2518306602966119    Subject: Communication and Information System
Abstract/Summary:
In recent years, image super-resolution reconstruction has advanced rapidly alongside deep learning, and learning-based methods now surpass traditional algorithms on public benchmark datasets. However, as convolutional neural networks have grown deeper and wider, the computational cost and parameter count of a forward pass have increased accordingly, which makes such models difficult to deploy in practical engineering. To address the high complexity of single image super-resolution (SISR) networks based on convolutional neural networks, this thesis studies the design of lightweight SISR network models.

This thesis focuses on single image super-resolution and designs a network model based on nested residual connections by combining the residual network structure with a channel attention module. The residual connections in the model are then optimized and suitable sub-modules are selected, yielding a lightweight network with lower computation and fewer parameters. A channel pruning algorithm is then applied to further reduce the computation and parameter count. Finally, to compensate for the accuracy loss caused by shrinking the model, a knowledge distillation algorithm is used in which a larger model with more computation and parameters supervises the lightweight super-resolution network, improving its reconstruction accuracy. The contents and innovations of this thesis are as follows:

(1) The research status at home and abroad is surveyed, the characteristics of single image super-resolution and convolutional neural network compression are analyzed, and the shortcomings of existing super-resolution network structures in practical applications, as well as the problems to be solved, are discussed.

(2) Based on the classical residual structure and attention module, a cyclically nested image super-resolution model is constructed. By quantitatively comparing the super-resolution reconstruction metrics of models with different numbers of substructures, the weighted residual channel attention super-resolution network (RNAN m×n) is obtained. The computational cost of the RNAN 3×3 model is 138.3G and its parameter count is 968.8K. (A minimal sketch of a residual channel attention block is given below.)

(3) For the network structure designed in (2), the importance of neurons or weight connections is evaluated and the unimportant ones are removed; the pruned network is then fine-tuned to obtain new weight parameters, yielding a pruned lightweight super-resolution network. After pruning, the computation and the parameter count of the network are reduced by 22.19% and 23.77%, respectively. (A channel-pruning sketch follows below.)

(4) For the lightweight super-resolution network obtained above, a knowledge distillation algorithm is designed in which the student model is supervised by the reconstruction results of a teacher model. By comparing the results before and after distillation, the effectiveness of knowledge distillation for the super-resolution task is analyzed. The PSNR of the super-resolution network after distillation is improved by 0.26 dB. (A distillation-loss sketch follows below.)
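The sketch below illustrates the kind of residual channel attention block described in contribution (2). It is a minimal PyTorch-style example under assumed design choices (RCAN-style squeeze-and-excitation attention, a `reduction` ratio of 16, class names such as `ResidualChannelAttentionBlock`); it is not the thesis's exact implementation.

```python
# Minimal sketch (assumed implementation, not the thesis's exact code):
# a residual block whose output is re-weighted per channel by a
# squeeze-and-excitation style channel attention module.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze: global average pooling
        self.fc = nn.Sequential(                     # excitation: per-channel weights
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(self.pool(x))             # re-weight each channel

class ResidualChannelAttentionBlock(nn.Module):
    def __init__(self, channels: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            ChannelAttention(channels),
        )

    def forward(self, x):
        return x + self.body(x)                      # residual (skip) connection

# Usage example:
# y = ResidualChannelAttentionBlock(64)(torch.randn(1, 64, 48, 48))
```

In a nested design such as the RNAN m×n described above, several of these blocks would be grouped and wrapped in further residual connections; the grouping itself is not shown here.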
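The following is a minimal sketch of the channel-pruning step in contribution (3). The importance criterion used here (the L1 norm of each output filter) and the `keep_ratio` value are illustrative assumptions; the thesis's exact importance measure and pruning schedule are not specified in this abstract.

```python
# Minimal channel-pruning sketch (assumed criterion: L1 norm of each output
# filter; keep_ratio is an illustrative hyperparameter).
import torch
import torch.nn as nn

def prune_conv_channels(conv: nn.Conv2d, keep_ratio: float = 0.78) -> nn.Conv2d:
    """Return a new Conv2d keeping only the highest-importance output channels."""
    importance = conv.weight.detach().abs().sum(dim=(1, 2, 3))   # L1 norm per filter
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    keep_idx = torch.argsort(importance, descending=True)[:n_keep]

    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = conv.weight.data[keep_idx].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep_idx].clone()
    return pruned
    # Downstream layers must be adjusted to the reduced channel count, and the
    # pruned network is fine-tuned afterwards to recover reconstruction accuracy.
```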
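Contribution (4) uses the teacher's reconstruction to supervise the lightweight student. The sketch below assumes a simple combination of an L1 loss to the ground-truth high-resolution image and an L1 loss to the teacher's output, weighted by a hyperparameter `alpha`; these loss terms and the weighting are assumptions for illustration, not the thesis's stated formulation.

```python
# Minimal distillation-loss sketch (assumed losses: L1 to the ground-truth HR
# image plus L1 to the teacher's reconstruction; alpha is illustrative).
import torch
import torch.nn.functional as F

def distillation_loss(student_sr: torch.Tensor,
                      teacher_sr: torch.Tensor,
                      hr_target: torch.Tensor,
                      alpha: float = 0.5) -> torch.Tensor:
    reconstruction = F.l1_loss(student_sr, hr_target)        # supervised term
    imitation = F.l1_loss(student_sr, teacher_sr.detach())   # teacher-guided term
    return reconstruction + alpha * imitation

# Training step (sketch): the frozen teacher produces teacher_sr from the same
# low-resolution input; the student is updated by minimizing distillation_loss.
```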
Keywords/Search Tags: SISR, Residual blocks, Attention blocks, Knowledge Distillation, Network Pruning