With the rapid development of deep learning and the surging demand for intelligent security, face recognition technology is advancing quickly. Face recognition is an active research topic in image processing. Traditional approaches rely on hand-crafted feature extraction and offer limited accuracy and efficiency, which has constrained their development. Face recognition based on convolutional neural networks extracts features automatically through the network structure; these features have stronger representational power, so recognition accuracy improves. In practical applications, face recognition matches a face image to an identity by comparing the extracted features. Building on convolutional neural networks, this paper introduces attention mechanisms and transfer learning to study face recognition. The main work is as follows:

(1) A face recognition method based on transfer learning and an improved loss function is proposed. The method uses MobileNetV1 as the backbone network and, given the different effects of face-recognition loss functions, evaluates Angular Softmax, AM-Softmax, and ArcFace as replacements for the model's original loss. The experiments confirm that improving the loss function clearly benefits recognition performance, with ArcFace giving the best results. Finally, a model pre-trained on ImageNet is loaded for transfer-learning training; this shortens training time, accelerates convergence, and also improves the model's performance.

(2) A face recognition method based on transfer learning and an attention mechanism is proposed. The method uses MobileNetV2 as the backbone network and improves it in three steps: first, the ReLU activation is replaced with LeakyReLU; second, the loss function is replaced with AdaCos, an adaptively scaled cosine loss; third, an attention mechanism is introduced into the network, with one mixed attention module inserted after the first convolution and another added between the last inverted residual block and the average pooling layer. Ablation experiments show that each improvement raises the model's performance, with the introduction of the attention mechanism contributing the most. Finally, the model is trained on the self-made dataset True-Face using transfer learning; the results show that transfer learning not only greatly accelerates convergence but also improves the final model. Minimal code sketches of the main components described above are given below.
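For item (1), the ArcFace loss adds an angular margin to the target-class logit before the softmax. The following is a minimal PyTorch sketch of such a head; the scale s = 64 and margin m = 0.5 are commonly used defaults, not values taken from this work, and the class name is hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ArcFaceHead(nn.Module):
    """Additive angular margin softmax (ArcFace), minimal sketch."""
    def __init__(self, embedding_dim, num_classes, s=64.0, m=0.50):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(num_classes, embedding_dim))
        nn.init.xavier_uniform_(self.weight)
        self.s = s  # feature scale
        self.m = m  # additive angular margin (radians)

    def forward(self, embeddings, labels):
        # cosine similarity between L2-normalized features and class weights
        cosine = F.linear(F.normalize(embeddings), F.normalize(self.weight))
        theta = torch.acos(cosine.clamp(-1.0 + 1e-7, 1.0 - 1e-7))
        # add the angular margin only to the target-class logit
        target_logit = torch.cos(theta + self.m)
        one_hot = F.one_hot(labels, num_classes=cosine.size(1)).float()
        logits = self.s * (one_hot * target_logit + (1.0 - one_hot) * cosine)
        return F.cross_entropy(logits, labels)
```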
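Both methods initialize the backbone from ImageNet pre-trained weights before fine-tuning on face data. The sketch below illustrates this kind of initialization using torchvision's MobileNetV2 (torchvision does not ship MobileNetV1, and the 512-dimensional embedding is an assumption, not a detail from this work).

```python
import torch.nn as nn
from torchvision import models

def build_backbone(embedding_dim=512):
    # load ImageNet pre-trained MobileNetV2 as the starting point (transfer learning)
    backbone = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.IMAGENET1K_V1)
    # replace the 1000-class ImageNet classifier with a face-embedding layer
    backbone.classifier = nn.Sequential(
        nn.Dropout(0.2),
        nn.Linear(backbone.last_channel, embedding_dim),
    )
    return backbone
```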
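For item (2), AdaCos removes the fixed scale hyperparameter by adapting the scale from batch statistics. The sketch below follows the published AdaCos formulation (median target angle and average non-target logit mass) rather than any implementation detail stated in this abstract.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaCosHead(nn.Module):
    """Adaptively scaled cosine loss (AdaCos), minimal sketch."""
    def __init__(self, embedding_dim, num_classes):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(num_classes, embedding_dim))
        nn.init.xavier_uniform_(self.weight)
        # initial fixed scale from the AdaCos paper: sqrt(2) * log(C - 1)
        self.register_buffer("s", torch.tensor(math.sqrt(2.0) * math.log(num_classes - 1)))

    def forward(self, embeddings, labels):
        cosine = F.linear(F.normalize(embeddings), F.normalize(self.weight))
        theta = torch.acos(cosine.clamp(-1.0 + 1e-7, 1.0 - 1e-7))
        one_hot = F.one_hot(labels, num_classes=cosine.size(1)).bool()
        with torch.no_grad():
            # average non-target logit mass in the batch
            b_avg = torch.exp(self.s * cosine).masked_fill(one_hot, 0).sum(dim=1).mean()
            # median angle to the target class
            theta_med = theta[one_hot].median()
            # adapt the scale so gradients stay well-conditioned
            self.s = torch.log(b_avg) / torch.cos(torch.clamp(theta_med, max=math.pi / 4))
        return F.cross_entropy(self.s * cosine, labels)
```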
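The mixed attention module in item (2) combines channel and spatial attention. The abstract does not name a specific design, so the sketch below assumes a CBAM-style module; the reduction ratio and kernel size are common defaults, not values from this work.

```python
import torch
import torch.nn as nn

class MixedAttention(nn.Module):
    """Channel attention followed by spatial attention (CBAM-style sketch)."""
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        # channel attention: shared MLP over avg- and max-pooled descriptors
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        # spatial attention: conv over channel-wise avg and max maps
        self.spatial = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))
        mx = self.mlp(x.amax(dim=(2, 3)))
        x = x * torch.sigmoid(avg + mx).view(b, c, 1, 1)   # channel attention
        s = torch.cat([x.mean(dim=1, keepdim=True),
                       x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial(s))          # spatial attention
```

In terms of torchvision's MobileNetV2, one such module could follow the stem convolution (features[0]) and another could sit between the last inverted residual block and the global average pooling, matching the placements described in item (2).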