Alzheimer's Disease (AD) is a neurodegenerative brain disease that is clinically manifested as memory impairment and loss of motor and language abilities. Based on clinical symptoms, subjects are divided into normal control (NC), mild cognitive impairment (MCI), and AD. MCI is an intermediate state between NC and AD and the prodromal stage of AD; its clinical symptoms are not obvious, so it is difficult to detect in the initial stage of the disease. Under normal circumstances, MCI converts to AD within about five years. AD is a chronic disease that threatens the health of the elderly; it is usually in a late stage by the time it is discovered, and current medicine cannot cure it, so early detection and early treatment are the only options. Based on residual neural networks, this thesis uses brain imaging omics (radiomics) data for AD-related diagnostic research. The specific contributions are as follows:

(1) An Enhanced Residual Neural Network (EResNet) that fuses a Residual Neural Network (ResNet) with Selective Kernel Networks (SKNet) and ShuffleNet is proposed for AD diagnosis. The model replaces the 3×3 convolution kernel in the ResNet residual block with an SKNet unit, so that the network can adaptively adjust its receptive field according to the input image information and thereby improve performance. To reduce the number of model parameters, group convolutions replace all ordinary convolutions. Finally, Channel Shuffle is added to the residual block so that information can be exchanged between different feature subgroups, which enriches the feature hierarchy and improves the discriminative ability of the network, while residual learning avoids network degradation (an illustrative sketch of such a block is given after this abstract). Experimental results on gray matter (GM) data show that the model achieves better classification performance with far fewer parameters than ResNet-50.

(2) A Multi-Scale Convolutional Neural Network (MSCNet) model is proposed for AD diagnosis. An ordinary Convolutional Neural Network (CNN) extracts image features through layer-by-layer abstraction, and the observed features change with the receptive field. However, brain structure is complex and the cause of AD is unknown; if each layer of the network extracts only single-scale features, it cannot capture both the local and the global information of the image. To solve this problem, a new multi-scale structure is proposed to enhance the feature representation ability of the model. In addition, an improved channel attention mechanism is proposed to model the interdependence between channels and adaptively recalibrate the channel-wise feature responses, further improving performance (a sketch of the multi-scale block with channel attention also follows this abstract). Experimental results on white matter (WM) and gray matter data show that the proposed MSCNet achieves better performance with fewer parameters and multiplication operations and significantly improves classification accuracy. The experiments also verify that white matter is more discriminative than gray matter in the early diagnosis of AD. Finally, the proposed MSCNet model is combined with a GUI to build an auxiliary diagnostic system for AD, which achieves good application results.
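As an illustration of the EResNet residual block described in contribution (1), the following is a minimal PyTorch sketch that combines a selective-kernel convolution, group convolutions, and a channel shuffle with a residual connection. It is not the thesis's implementation; the channel count, group count, kernel sizes, and reduction ratio are assumptions chosen for the example.

```python
# Minimal sketch (assumed sizes, not the thesis code) of an EResNet-style block:
# the 3x3 convolution is replaced by a selective-kernel unit, all convolutions
# are grouped, and a channel shuffle mixes information across groups before the
# residual addition.
import torch
import torch.nn as nn
import torch.nn.functional as F


def channel_shuffle(x, groups):
    # ShuffleNet-style shuffle: rearrange channels so later grouped
    # convolutions see features from every group.
    n, c, h, w = x.size()
    x = x.view(n, groups, c // groups, h, w).transpose(1, 2).contiguous()
    return x.view(n, c, h, w)


class SKConv(nn.Module):
    # Selective-kernel convolution: two grouped branches with different
    # receptive fields, fused by a learned soft attention over the branches.
    def __init__(self, channels, groups=8, reduction=4):
        super().__init__()
        self.branch3 = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, groups=groups, bias=False),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True))
        self.branch5 = nn.Sequential(  # 5x5 receptive field via dilated 3x3
            nn.Conv2d(channels, channels, 3, padding=2, dilation=2, groups=groups, bias=False),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True))
        hidden = max(channels // reduction, 8)
        self.fc = nn.Sequential(nn.Linear(channels, hidden), nn.ReLU(inplace=True))
        self.select = nn.Linear(hidden, channels * 2)

    def forward(self, x):
        u3, u5 = self.branch3(x), self.branch5(x)
        s = (u3 + u5).mean(dim=(2, 3))                  # global channel descriptor
        a = self.select(self.fc(s)).view(-1, 2, u3.size(1)).softmax(dim=1)
        a3 = a[:, 0].unsqueeze(-1).unsqueeze(-1)
        a5 = a[:, 1].unsqueeze(-1).unsqueeze(-1)
        return a3 * u3 + a5 * u5                        # adaptive receptive field


class EResBlock(nn.Module):
    # Residual block: grouped 1x1 -> SKConv -> channel shuffle -> grouped 1x1.
    def __init__(self, channels, groups=8):
        super().__init__()
        self.groups = groups
        self.reduce = nn.Sequential(
            nn.Conv2d(channels, channels, 1, groups=groups, bias=False),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True))
        self.sk = SKConv(channels, groups=groups)
        self.expand = nn.Sequential(
            nn.Conv2d(channels, channels, 1, groups=groups, bias=False),
            nn.BatchNorm2d(channels))

    def forward(self, x):
        out = self.reduce(x)
        out = self.sk(out)
        out = channel_shuffle(out, self.groups)         # exchange info across groups
        out = self.expand(out)
        return F.relu(out + x)                          # residual learning


if __name__ == "__main__":
    block = EResBlock(channels=64)
    print(block(torch.randn(2, 64, 32, 32)).shape)      # torch.Size([2, 64, 32, 32])
```

The selective-kernel unit realizes the adaptive receptive field, the grouped 1×1 convolutions keep the parameter count low, and the shuffle lets the grouped convolutions exchange information across feature subgroups.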
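Similarly, the following minimal sketch (again an illustration, not the paper's MSCNet) shows the two ideas behind contribution (2): a block whose parallel 1×1/3×3/5×5 branches extract features at several scales, followed by an SE-style channel attention that adaptively recalibrates the channel-wise responses. The channel counts, kernel sizes, and reduction ratio are assumptions.

```python
# Minimal sketch of a multi-scale block with channel attention (assumed sizes).
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    # SE-style attention: a global descriptor per channel gates the feature map
    # so that informative channels are emphasised.
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid())

    def forward(self, x):
        return x * self.gate(x)                         # recalibrate channel responses


class MultiScaleBlock(nn.Module):
    # Parallel 1x1 / 3x3 / 5x5 branches capture local and more global structure;
    # their concatenation is refined by channel attention.
    def __init__(self, in_channels, branch_channels=32):
        super().__init__()
        def branch(k):
            return nn.Sequential(
                nn.Conv2d(in_channels, branch_channels, k, padding=k // 2, bias=False),
                nn.BatchNorm2d(branch_channels), nn.ReLU(inplace=True))
        self.b1, self.b3, self.b5 = branch(1), branch(3), branch(5)
        self.attn = ChannelAttention(branch_channels * 3)

    def forward(self, x):
        out = torch.cat([self.b1(x), self.b3(x), self.b5(x)], dim=1)
        return self.attn(out)


if __name__ == "__main__":
    block = MultiScaleBlock(in_channels=1)              # e.g. a single-channel GM/WM slice
    print(block(torch.randn(2, 1, 91, 109)).shape)      # torch.Size([2, 96, 91, 109])
```

Concatenating branches with different kernel sizes lets a single layer combine local and global cues, and the attention module adaptively reweights the resulting channels, which is the role the improved channel attention plays in MSCNet.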