Breast cancer has one of the highest incidence and mortality rates among women. There are many ways to detect and diagnose breast cancer, and biopsy based on histopathological images is the primary detection method. Computer-aided detection and diagnosis systems based on artificial intelligence have become a research hotspot: they can improve the accuracy of breast cancer diagnosis, reduce diagnostic time, and help avoid misdiagnosis and missed diagnosis. Auxiliary classification of breast pathological images has greatly improved the detection rate of breast cancer. Deep learning overcomes the shortcoming of traditional classification methods, which require lesion features to be extracted manually, and can autonomously learn the characteristics of breast cancer foci. This paper studies deep learning models for breast cancer image classification that combine visual attention mechanisms and transfer learning. The main work and innovations of this paper are as follows:

1. The sample size of breast histopathological image datasets is small, which leads to insufficient training, makes it difficult for image-level classification to focus on the lesion region, and keeps the classification performance below the high standard required for clinical use. This paper therefore proposes a dual-model fusion classification model based on an attention mechanism and deep transfer learning. Deep transfer learning transfers general features to the breast cancer image classification task: pretrained VGG16 and ResNet50 networks are used as feature extractors for shallow features, and feature-transfer experiments on breast cancer images are carried out through model fine-tuning, which improves both training efficiency and classification performance. Spatial attention and channel attention are then introduced to extract regional information about breast lesions and enhance the feature description. Finally, the class-probability prediction vectors of the two classifiers are fused by a soft-voting algorithm to obtain the final classification result. The model reaches a binary-classification accuracy of 99.45% and an eight-class accuracy of 99.11%, and it outperforms several existing breast cancer image classification algorithms in terms of precision and recall.
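For concreteness, the following PyTorch sketch illustrates the dual-model idea under simplifying assumptions: two ImageNet-pretrained backbones (VGG16 and ResNet50) serve as frozen feature extractors, a lightweight channel-plus-spatial attention block refines each feature map, and the two classifiers' class-probability vectors are combined by soft voting. The attention design, layer sizes, and equal-weight averaging are illustrative assumptions, not the exact configuration of the proposed model.

# Minimal sketch of the dual-model fusion idea: two ImageNet-pretrained
# backbones as frozen feature extractors, a light channel + spatial attention
# block on each feature map, and soft voting over the two probability vectors.
# Layer sizes and the attention design are illustrative assumptions.
import torch
import torch.nn as nn
import torchvision.models as models

class ChannelSpatialAttention(nn.Module):
    """Channel attention (squeeze-excite style) followed by spatial attention."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.channel_fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.channel_fc(x.mean(dim=(2, 3))).view(b, c, 1, 1)  # channel weights
        x = x * w
        s = torch.cat([x.mean(dim=1, keepdim=True),
                       x.amax(dim=1, keepdim=True)], dim=1)       # spatial descriptor
        return x * torch.sigmoid(self.spatial_conv(s))

class Branch(nn.Module):
    """Frozen pretrained backbone + attention + small classifier head."""
    def __init__(self, backbone, out_channels, num_classes):
        super().__init__()
        self.backbone = backbone
        for p in self.backbone.parameters():
            p.requires_grad = False                                # transfer shallow features
        self.attn = ChannelSpatialAttention(out_channels)
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(out_channels, num_classes))

    def forward(self, x):
        return self.head(self.attn(self.backbone(x)))

def build_branches(num_classes=8):
    vgg = models.vgg16(weights="IMAGENET1K_V1").features          # 512-channel feature maps
    res = nn.Sequential(*list(models.resnet50(weights="IMAGENET1K_V1").children())[:-2])  # 2048-channel maps
    return Branch(vgg, 512, num_classes), Branch(res, 2048, num_classes)

@torch.no_grad()
def soft_vote(branch_a, branch_b, images):
    """Average the two class-probability vectors and take the argmax."""
    p = (torch.softmax(branch_a(images), dim=1) +
         torch.softmax(branch_b(images), dim=1)) / 2
    return p.argmax(dim=1)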
2. A single transfer between dissimilar domains can cause negative transfer, fails to fully extract the high-level semantic information of breast tissue, and yields unsatisfactory results on the eight-subtype classification task. This paper therefore proposes a secondary-transfer fusion classification model combining a residual visual attention mechanism with spatial pyramid pooling. The central idea is to use the breast histopathology dataset BACH as an intermediate domain so that the network acquires prior knowledge of breast images; the pretrained model obtained after the first transfer is then transferred again to the target domain. This avoids the negative transfer caused by the large gap between the source domain and the target domain, and makes full use of both the general features of public datasets and the high-level features of breast pathology. Building on the residual module, a channel attention sub-module and a spatial attention sub-module with dilated convolution are introduced; this module eases model training and learns more complex high-frequency information. A pyramid pooling module is added at the end of the network to capture global information under different receptive fields, so that the network fully learns multi-scale features across different magnifications. The experimental results show that the secondary-transfer model with residual attention and pyramid pooling achieves strong classification performance: the binary-classification accuracy reaches 99.65% and the eight-class accuracy reaches 95.68%.
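As a rough illustration of the components described above, the sketch below shows a pyramid pooling head that aggregates the final feature map at several grid sizes, together with a two-stage fine-tuning routine that mirrors the secondary-transfer schedule (intermediate BACH domain first, then the target domain). The pool sizes, learning rates, and the train_one_stage helper are assumptions introduced for illustration, not details taken from the paper.

# Sketch of (a) a pyramid pooling head that aggregates the final feature map
# at several grid sizes and (b) the two-stage ("secondary") transfer schedule:
# ImageNet weights -> fine-tune on the intermediate BACH domain -> fine-tune
# again on the target breast-pathology set. Values below are illustrative.
import torch
import torch.nn as nn

class PyramidPoolingHead(nn.Module):
    """Pool the feature map at several scales, concatenate, then classify."""
    def __init__(self, in_channels, num_classes, pool_sizes=(1, 2, 4)):
        super().__init__()
        self.pools = nn.ModuleList([nn.AdaptiveAvgPool2d(s) for s in pool_sizes])
        flat = in_channels * sum(s * s for s in pool_sizes)
        self.classifier = nn.Linear(flat, num_classes)

    def forward(self, feat):                        # feat: (B, C, H, W)
        multi_scale = [p(feat).flatten(1) for p in self.pools]
        return self.classifier(torch.cat(multi_scale, dim=1))

def secondary_transfer(model, bach_loader, target_loader, train_one_stage):
    """Two-stage fine-tuning: intermediate domain first, then the target domain.

    train_one_stage(model, loader, lr) is a hypothetical helper that runs a
    standard supervised fine-tuning loop and returns the updated model.
    """
    model = train_one_stage(model, bach_loader, lr=1e-4)    # stage 1: BACH (intermediate domain)
    model = train_one_stage(model, target_loader, lr=1e-5)  # stage 2: target breast-pathology data
    return model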