The purpose of emotion state recognition is to give computers the ability to analyze and understand human emotions and intentions, and to analyze human mental activity at a deeper level, so that the technology can be applied in fields such as entertainment, education, and intelligent healthcare. According to expression categories, emotions can be divided into seven types: happy, surprised, sad, angry, disgusted, scared, and contempt. Based on these seven emotions, and relying on the research group's video action recognition and intelligent analysis project, this paper studies the recognition of emotional states from the perspective of dynamic facial expression similarity. An expression intensity assessment model is trained with convolutional neural networks to estimate the intensity of facial expressions, and on this basis a study of the process of emotional change based on dynamic expression similarity is proposed. The main work of this paper is as follows:

(1) Expression intensity estimation based on convolutional neural networks. To study differences in expression intensity and to track changes in emotional intensity through the dynamic changes of expression intensity, this paper constructs an expression intensity estimation model using a Siamese network. The problem of estimating expression intensity is first converted into a ranking problem based on the sequential relationship between the frames of an expression sequence; the data are then annotated according to this ranking relationship, the ranking problem is converted into a classification problem based on the annotations, and expression sequences are extracted from the video according to the estimated expression intensity values.
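To make the ranking-to-classification step concrete, the following is a minimal sketch under assumed conventions, not the model used in this paper: the frames of each training sequence are assumed to be ordered from neutral to apex, every ordered frame pair is labeled by which frame shows the stronger expression, and a small weight-sharing (Siamese) CNN classifies the pair. The layer sizes, the 64x64 grayscale input, and the helper names are all illustrative.

    import itertools
    import torch
    import torch.nn as nn

    def make_ranked_pairs(sequence):
        """sequence: frames ordered from neutral (weak) to apex (strong).
        Returns (first, second, label) triples where label = 1 means the
        second frame shows the stronger expression, label = 0 the weaker."""
        pairs = []
        for i, j in itertools.combinations(range(len(sequence)), 2):
            pairs.append((sequence[i], sequence[j], 1))  # later frame is stronger
            pairs.append((sequence[j], sequence[i], 0))  # reversed pair
        return pairs

    class SiameseIntensityNet(nn.Module):
        """Two weight-sharing CNN branches; the difference of their embeddings
        is classified to decide which input shows the stronger expression."""
        def __init__(self):
            super().__init__()
            self.branch = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Flatten(),
                nn.Linear(32 * 16 * 16, 64),
            )
            self.classifier = nn.Linear(64, 2)  # class 0: second weaker, class 1: second stronger

        def forward(self, x1, x2):
            e1, e2 = self.branch(x1), self.branch(x2)  # shared weights
            return self.classifier(e1 - e2)

    # Toy usage: one ordered sequence of five 64x64 grayscale frames.
    frames = [torch.randn(1, 64, 64) for _ in range(5)]
    pairs = make_ranked_pairs(frames)
    model = SiameseIntensityNet()
    x1 = torch.stack([p[0] for p in pairs])
    x2 = torch.stack([p[1] for p in pairs])
    labels = torch.tensor([p[2] for p in pairs])
    loss = nn.CrossEntropyLoss()(model(x1, x2), labels)

At inference time, comparing a frame against a fixed neutral reference frame in the same way yields a relative intensity score; the architecture and labeling scheme actually used in this paper may differ from this sketch.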
(2) Study of emotional state change based on dynamic expression similarity. Building on the study of expression intensity, dynamic expression similarity is used to analyze the state of emotional change. The implementation consists of four basic steps: expression feature extraction, expression feature dimensionality reduction, expression intensity estimation, and dynamic face similarity matching. Deep learning is used to extract the features of expression sequences, and KPCA is then applied to reduce the dimensionality of the extracted high-dimensional expression features: the original non-linear expression features are mapped into a high-dimensional (even infinite-dimensional) space in which the feature data become linearly separable, and principal component analysis is performed in the transformed space to obtain low-dimensional expression features, which effectively reduces the complexity of the dynamic expression similarity calculation. The convolutional neural network-based approach of Chapter 3 is then adopted for expression intensity estimation, the curve of emotional intensity change is obtained from the expression intensity values, and the process of emotional intensity change is judged by comparing intensity curves. Finally, the feature similarity of dynamic expressions is calculated with a dynamic face matching algorithm based on cosine similarity, and the process of emotional change is studied and the emotion identified by combining this feature similarity with the similarity of the intensity curves.

(3) Development of an emotion recognition system. Based on the first two research points and combined with front-end and back-end development knowledge, an emotion recognition system was designed and implemented. The system can not only recognize emotions in locally uploaded videos but also capture videos of dynamic expression changes in real time through the camera for real-time emotion recognition. It can quickly recognize emotional states and is highly practical for emotion recognition.
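As an illustration of the system's real-time path, the sketch below is a minimal example rather than the implemented system: it reads frames from the default camera with OpenCV and hands fixed-length windows of frames to a hypothetical recognize_emotion() helper, which stands in for the full pipeline (feature extraction, KPCA reduction, intensity estimation, and similarity matching). The window length, helper name, and quit key are illustrative assumptions.

    import cv2

    def recognize_emotion(frames):
        """Hypothetical placeholder for the trained pipeline; a real implementation
        would return one of the seven emotion labels for the frame window."""
        return "neutral"

    cap = cv2.VideoCapture(0)      # default camera
    window = []                    # buffer of recent frames
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            window.append(frame)
            if len(window) == 30:  # roughly one second of video at 30 fps
                print(recognize_emotion(window))
                window.clear()
            cv2.imshow("camera", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to stop
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()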