
Emotion Recognition Based On Deep Neural Networks

Posted on: 2022-07-01
Degree: Master
Type: Thesis
Country: China
Candidate: J L Zhu
Full Text: PDF
GTID: 2518306524490704
Subject: Master of Engineering
Abstract/Summary:
Emotion recognition is important for understanding human emotion and has been widely used in human-computer interaction, distance education, and health care. Signal acquisition has become convenient with the development of equipment; however, how to effectively extract emotional characteristics and improve accuracy remains an urgent challenge. In this thesis, emotion recognition models based on text, electroencephalogram (EEG), and multimodal fusion are proposed to improve performance in different scenarios. Additionally, a hypertension management system based on emotion recognition is designed for elderly healthcare. The specific research contents and conclusions are as follows:

1. In sentiment analysis, word embeddings often lack context information, and existing models commonly ignore clause characteristics. We propose an entity-dependent sentiment analysis method based on clause features. The model first segments text into clauses and uses ELMo to obtain word embeddings. Word-level and clause-level attention mechanisms are then applied in a BiLSTM to extract sentence features, and a softmax layer performs aspect-level sentiment classification. The model was validated on the SemEval-2014 Task 4 dataset, reaching accuracies of 74.32% on Laptop and 79.73% on Restaurant.

2. The loss of spatial features across adjacent channels remains salient in EEG-based emotion recognition. We propose a novel model named the spatial-temporal core block convolutional neural network (STCB-CNN). The spatial-temporal core block extracts characteristics of the EEG signal in both the temporal and spatial dimensions, and the resulting features are fed into a fully connected layer for classification. The model was validated on the DEAP dataset, reaching accuracies of 87.56% on arousal and 88.61% on valence, which demonstrates its effectiveness.

3. Multimodal fusion for emotion recognition has ignored the specificity of individual emotional expression, and fusing different modalities is difficult. To solve these problems, we propose a novel multimodal fusion method called the bimodal deep autoencoder model (BDAE), which combines EEG, EMG, EOG, GSR, and text data. We first extract deep features and statistical features of the signals, then integrate demographic features into the BDAE for feature extraction, and finally use a support vector machine to classify human emotion. The model was validated on the DEAP dataset, reaching accuracies of 90.21% on arousal and 91.82% on valence. Additionally, a 4-class experiment on DEAP reached 88.20% accuracy.

4. We designed and implemented a hypertension management system based on emotion recognition. The system is developed on the Spring Boot and MyBatis frameworks with a MySQL database. It realizes seven modules: data collection, data display, emotion recognition, emotion intervention, patient management, statistical analysis, and system management. The platform provides real-time risk monitoring and emotion management for hypertension patients, which can effectively reduce patient risk.
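The attention mechanism in research content 1 reduces a variable-length sequence of BiLSTM hidden states to one fixed-size sentence vector by scoring each timestep, normalising the scores with softmax, and taking the weighted sum. A minimal NumPy sketch of this pooling step, with random stand-ins for the hidden states and a hypothetical learned scoring vector `w` (the thesis's actual word- and clause-level parameters are not given in the abstract):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(hidden, w):
    """Score each timestep's hidden state, softmax-normalise the scores,
    and return the weighted sum as a fixed-size sentence vector."""
    scores = hidden @ w               # (T,) one score per timestep
    alpha = softmax(scores)           # attention weights, sum to 1
    return alpha @ hidden, alpha      # (d,) sentence vector, (T,) weights

rng = np.random.default_rng(0)
T, d = 6, 8                           # 6 timesteps, hidden size 8
hidden = rng.standard_normal((T, d))  # stand-in for BiLSTM outputs
w = rng.standard_normal(d)            # hypothetical learned scoring vector

sent_vec, alpha = attention_pool(hidden, w)
print(sent_vec.shape)                 # (8,)
```

Clause-level attention repeats the same pooling one level up, over clause vectors instead of word vectors.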
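The spatial-temporal core block of content 2 is not specified in detail in this abstract; the sketch below only illustrates the general idea of processing multichannel EEG in two separate stages, a temporal convolution per channel followed by a spatial mixing across channels. The smoothing kernel and random channel-mixing matrix are illustrative assumptions, not the thesis's learned filters:

```python
import numpy as np

def temporal_conv(eeg, kernel):
    """Convolve each channel's time series with a shared temporal kernel."""
    return np.stack([np.convolve(ch, kernel, mode="valid") for ch in eeg])

def spatial_mix(feat, mix):
    """Linearly mix channels at every timestep (a 1x1 'spatial' convolution)."""
    return mix @ feat

rng = np.random.default_rng(1)
eeg = rng.standard_normal((32, 128))   # 32 channels x 128 time samples
k = np.array([0.25, 0.5, 0.25])        # assumed temporal smoothing kernel
mix = rng.standard_normal((8, 32))     # project 32 channels -> 8 spatial maps

feat = spatial_mix(temporal_conv(eeg, k), mix)
print(feat.shape)                      # (8, 126)
```

In the full model these two stages would use learned kernels and be followed by a fully connected classification layer, as the abstract describes.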
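Content 3's final step, classifying fused multimodal features with a support vector machine, can be illustrated with synthetic data. The modality dimensions, the concatenation-based fusion, the synthetic labels, and the use of scikit-learn's `SVC` below are illustrative assumptions; the thesis's actual BDAE features and training setup are not reproduced here:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
n = 200
# Stand-ins for per-trial feature vectors from each modality.
eeg_feat = rng.standard_normal((n, 16))
periph_feat = rng.standard_normal((n, 8))   # e.g. EMG/EOG/GSR statistics
text_feat = rng.standard_normal((n, 4))

# Feature-level fusion: concatenate the modality features per trial.
fused = np.hstack([eeg_feat, periph_feat, text_feat])

# Synthetic binary "arousal" labels driven by one EEG feature,
# so the toy problem is learnable.
y = (eeg_feat[:, 0] > 0).astype(int)

clf = SVC(kernel="rbf").fit(fused[:150], y[:150])
acc = clf.score(fused[150:], y[150:])
print(fused.shape, acc)
```

A real pipeline would replace the random arrays with BDAE-extracted deep and statistical features and tune the SVM on a validation split.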
Keywords/Search Tags:Emotion Recognition, Text, EEG, Multimodal Fusion, Deep Learning