Research on EEG-Based Emotion Recognition Using Deep Learning Method

Posted on: 2021-12-07 | Degree: Doctor | Type: Dissertation | Country: China
Candidate: J X Chen | Full Text: PDF | GTID: 1520307100474624 | Subject: Computer Science and Technology

Abstract/Summary:

Emotion plays a crucial role in human life. The study of emotion recognition by computer is becoming a popular research topic in the fields of affective computing and pattern recognition. As a typical physiological signal, the electroencephalogram (EEG) has been widely applied to emotion recognition in recent years. EEG-based emotion recognition poses two major technical challenges: how to extract more discriminative emotional features from EEG signals, and how to develop more effective computational models for emotion recognition. Traditional EEG emotion recognition methods mostly apply shallow machine learning classifiers on top of manual feature engineering. Such methods struggle to jointly extract high-level spatial-temporal features from raw EEG signals; especially when training on large-scale datasets, they consume considerable time and memory, so neither recognition speed nor accuracy meets the needs of online applications. Although deep learning methods can self-learn high-level abstract emotional features end-to-end from large-scale raw EEG data, their classification performance is still unsatisfactory due to the low signal-to-noise ratio and instability of the EEG signal itself. Therefore, guided by related theory from neurophysiology and psychology, this dissertation studies EEG-based emotion recognition with deep learning methods to improve recognition accuracy and robustness. The detailed research work is as follows:

1) A simple and effective combined feature representation method is proposed, which concatenates the temporal and frequency-domain EEG
features together and contains more emotion-correlated context information from the time and frequency domains. Deep convolutional neural networks (CNNs) with a variety of convolution kernels, similar to those used in computer vision, are constructed to extract high-level discriminative emotional EEG features over global and local space and time, and then perform emotion classification. Shallow machine learning models, including bagging trees, support vector machines, linear discriminant analysis, and Bayesian linear discriminant analysis, together with three deep CNN models, are used to carry out binary emotion classification experiments on EEG features of the DEAP dataset in the valence and arousal dimensions. The results show that the proposed deep CNN models consistently deliver the best and most stable performance on the combined temporal-frequency EEG features in binary valence and arousal classification.

2) A hierarchical bidirectional gated recurrent unit (GRU) model with attention is proposed for emotion classification from continuous EEG signals. The structure of the model mirrors the hierarchical structure of EEG signals, and the attention mechanism is applied at two levels: EEG samples and epochs. By paying different levels of attention to content of different importance, the model learns a more significant feature representation of the EEG sequence, highlighting the contribution of important samples and epochs to its emotional category. The proposed model is evaluated by binary valence and arousal classification experiments on EEG frequency features of the DEAP dataset in a cross-subject scenario. The experimental results indicate that the classification accuracy of the proposed model in valence and arousal is 7.43% and 7.6% higher than that of the best baseline shallow classifier, and 6.7% and 7.52% higher than that of the best reference deep neural network, respectively. This demonstrates that the model can learn more significant long-term dependencies in an EEG sequence
according to the context weights, and can therefore overcome the temporal nonstationarity of EEG signals and become more robust.

3) A new EEG data representation is proposed, which transforms the 1D chain-like EEG vector sequence into a 2D mesh-like matrix sequence. The mesh structure of the matrix at each time point corresponds to the spatial distribution of the EEG electrodes, which better represents the spatial correlation of EEG signals among physically adjacent electrodes. A sliding window then divides the 2D matrix sequence into segments containing an equal number of time points, and each segment is treated as one EEG sample integrating temporal and spatial context information. Both cascaded and parallel hybrid convolutional-recurrent neural networks are proposed to predict the emotional category of each EEG sample. In these two hybrid networks, a CNN learns the spatial correlation between physically adjacent EEG signals from the converted 2D mesh-like matrix representation, and an LSTM learns the temporal dependency between time points in the sequence. Extensive within-subject binary valence and arousal classification experiments are carried out on the DEAP dataset to evaluate the proposed method. The experimental results demonstrate that the classification accuracies of both hybrid networks on 2D EEG meshes exceed 93%, outperforming the most recent baseline methods and other deep learning models.

4) Inspired by neuroscience findings that different brain regions respond to different emotions, a novel EEG-based emotion recognition method denoted R2G-ST-BiLSTM is proposed. The method consists of spatial and temporal bidirectional long short-term memory (BiLSTM) network models with a regional-to-global hierarchical feature learning process that extracts discriminative spatial-temporal EEG features. To learn the spatial features, a BiLSTM network is applied to capture the intrinsic spatial correlations of EEG electrodes within each brain region
and between different brain regions, respectively. Considering that different human emotions are associated with different brain regions, a region-attention layer is introduced into the spatial BiLSTM network to learn a set of weights that strengthen or weaken the contributions of the individual brain regions. Based on the spatial feature sequences, another BiLSTM model is adopted to learn both regional and global spatial-temporal features, which are then fed into a classifier layer to learn discriminative emotional features; here a domain discriminator, working cooperatively with the emotion classifier, is applied to reduce the domain shift between training and testing data. Finally, both within-subject and cross-subject EEG emotion recognition experiments are carried out on the DEAP database to evaluate the proposed method, and the experimental results show that it achieves state-of-the-art performance in positive, neutral, and negative emotion recognition.

Keywords/Search Tags: EEG, Emotion recognition, Deep learning, Spatial-temporal feature, Convolutional neural network, Recurrent neural network, Hybrid neural network, Bidirectional LSTM, GRU, Attention, Adversarial training
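As an illustration of the attention mechanism described in contribution 2, the following is a minimal NumPy sketch of attention pooling over a sequence of hidden states. The random arrays stand in for learned GRU hidden states and attention parameters, and names such as `attention_pool` are illustrative, not taken from the dissertation; the actual model applies this kind of pooling at both the sample and epoch level.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w, b, u):
    """Score each hidden state against a learned context vector u,
    then return the attention-weighted sum of the states.

    H: (T, d) hidden states; w: (d, d); b: (d,); u: (d,)."""
    scores = np.tanh(H @ w + b) @ u   # (T,) one score per time step
    alpha = softmax(scores)           # attention weights, sum to 1
    return alpha @ H, alpha           # pooled (d,), weights (T,)

rng = np.random.default_rng(1)
T, d = 5, 8
H = rng.standard_normal((T, d))       # stand-in for BiGRU outputs
w = rng.standard_normal((d, d))       # stand-in learned projection
b = rng.standard_normal(d)
u = rng.standard_normal(d)

pooled, alpha = attention_pool(H, w, b, u)
print(pooled.shape, np.isclose(alpha.sum(), 1.0))  # (8,) True
```

Steps with larger weights in `alpha` contribute more to the pooled representation, which is how the model emphasizes emotionally important samples and epochs.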
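The 1D-to-2D mesh transformation of contribution 3 can be sketched as below. The 2x2 layout and four channel names are toy stand-ins for the full DEAP electrode cap (32 channels on a larger sparse mesh), and the function names are hypothetical; only the idea of placing each channel at its scalp position and windowing the result is from the abstract.

```python
import numpy as np

# Hypothetical tiny electrode layout: channel name -> (row, col) on the mesh.
LAYOUT = {"Fp1": (0, 0), "Fp2": (0, 1), "O1": (1, 0), "O2": (1, 1)}
CHANNELS = ["Fp1", "Fp2", "O1", "O2"]

def to_mesh(sample_1d):
    """Map one 1D vector of channel readings onto the 2D electrode mesh."""
    mesh = np.zeros((2, 2))
    for ch, value in zip(CHANNELS, sample_1d):
        mesh[LAYOUT[ch]] = value
    return mesh

def segment(signal, win):
    """Slide a non-overlapping window of `win` time points over a
    (time, channels) array; return (n_segments, win, H, W) mesh samples."""
    n_seg = signal.shape[0] // win
    return np.stack([
        np.stack([to_mesh(signal[s * win + t]) for t in range(win)])
        for s in range(n_seg)
    ])

rng = np.random.default_rng(0)
eeg = rng.standard_normal((128, 4))   # 128 time points, 4 channels
samples = segment(eeg, win=32)
print(samples.shape)                  # (4, 32, 2, 2)
```

Each resulting sample is a short movie of 2D frames, so a CNN can exploit spatial adjacency within a frame while an LSTM models the ordering of frames.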