
Multimodal Emotion Recognition Research Based On EEG,Peripheral Physiological Signals And Facial Expressions

Posted on: 2024-01-08
Degree: Master
Type: Thesis
Country: China
Candidate: J Li
Full Text: PDF
GTID: 2530307136988019
Subject: Signal and Information Processing

Abstract/Summary:
With the continuous development of human-computer interaction technology, researchers have found that emotion recognition plays an important role in human-computer interaction. Existing emotion recognition methods are mostly based on facial expressions, speech, and gestures, signals that are subjective in nature and can be deliberately masked. In contrast, methods based on physiological signals, such as EEG and ECG signals, are harder to disguise and offer greater advantages in recognition rate. This thesis explores emotion recognition methods based on EEG signals; the main work is summarized as follows:

(1) EEG signals record the temporal and spatial features of neuronal activity through multiple electrodes. Their data complexity is high, which places stringent demands on emotion recognition methods. This thesis therefore proposes De Dc-LSTM, an EEG emotion recognition network based on multidimensional feature extraction, to extract and classify rich emotional information from EEG signals. The network learns features in three dimensions: the channel domain, the spatial domain, and the temporal domain. In the channel domain, the network assigns different weights to different EEG channels and adaptively adjusts each channel's contribution to classification accuracy. In the spatial domain, the network combines the spatial weight information of the input signal with the weight information of the convolutional kernel, applies different attention-weight calculation mechanisms to give the kernel dynamic characteristics, and then extracts the spatial features of the EEG signals. In the temporal domain, a Long Short-Term Memory (LSTM) network learns the time-series features of the EEG signals. Experiments on the DEAP and SEED datasets show that the network structure learns EEG signal features comprehensively and obtains a higher recognition rate.
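The thesis does not give the channel-domain equations, but the idea of re-weighting EEG channels by their contribution can be illustrated with a minimal squeeze-and-excitation-style sketch. Everything here (the bottleneck size, the sigmoid gate, the fixed projection matrices `w1` and `w2`) is an assumption for illustration, not the actual De Dc-LSTM design; in a real network the projections would be learned jointly with the rest of the model.

```python
import numpy as np

def channel_attention(eeg, w1, w2):
    """Illustrative channel-domain attention (assumed SE-style gating).

    eeg -- (channels, time) array of EEG samples
    w1, w2 -- projection matrices; hypothetical stand-ins for learned weights
    """
    squeeze = eeg.mean(axis=1)                # (channels,) summary per channel
    hidden = np.maximum(0.0, w1 @ squeeze)    # ReLU bottleneck
    logits = w2 @ hidden
    weights = 1.0 / (1.0 + np.exp(-logits))   # sigmoid gate, each weight in (0, 1)
    return eeg * weights[:, None], weights    # re-weight every channel

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 128))            # 32 channels, 128 time steps
w1 = rng.standard_normal((8, 32)) * 0.1       # bottleneck down to 8 units
w2 = rng.standard_normal((32, 8)) * 0.1
out, w = channel_attention(x, w1, w2)
print(out.shape)                              # re-weighted signal, same shape as input
```

The key property is that the gate leaves the signal shape unchanged while scaling informative channels up and uninformative ones down, which matches the abstract's description of adaptively adjusting each channel's contribution.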
(2) To address the diverse ways in which emotion is expressed, and to better integrate EEG, peripheral physiological signal, and facial expression features so as to improve the accuracy of multimodal emotion recognition, this thesis introduces the ideas of attention mechanisms and segmented transformation and proposes NMSNet, a multi-stage attention network for multimodal emotion recognition. The input layer of the network consists of extracted unimodal features: a convolutional neural network extracts the emotional features of the EEG and peripheral physiological signals, and an LSTM network extracts the facial expression features. Within NMSNet, a NAM residual attention module selects the unimodal features and passes them to a multi-head mutual-attention module for feature fusion. The three groups of features obtained from pairwise fusion of the unimodal features are then concatenated, and the concatenated features are finally filtered by a segmented wavelet attention module, which redistributes the feature weights. NMSNet effectively solves the fusion problem of multimodal features and models the associations between different modalities, further improving the accuracy of emotion recognition. Experiments on the DEAP and MAHNOB-HCI datasets show that the network achieves a higher recognition rate than unimodal emotion recognition methods and shows significant superiority over existing multimodal methods. The network also shows excellent generalization ability, which provides strong support for practical applications.
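The multi-head mutual-attention fusion step can be sketched as cross-attention, where one modality's features act as queries against another modality's features. This is a minimal sketch under assumptions: the thesis does not publish NMSNet's internals, so the identity projections, head count, and feature dimensions below are hypothetical simplifications of a standard multi-head attention layer.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_feats, key_feats, n_heads=4):
    """Illustrative multi-head cross-attention between two modalities.

    query_feats -- (Lq, d) features of the querying modality
    key_feats   -- (Lk, d) features of the attended modality
    Q/K/V projections are omitted (identity assumed) for brevity.
    """
    Lq, d = query_feats.shape
    dh = d // n_heads                          # per-head dimension
    out = np.zeros_like(query_feats)
    for h in range(n_heads):
        q = query_feats[:, h * dh:(h + 1) * dh]
        k = key_feats[:, h * dh:(h + 1) * dh]
        v = key_feats[:, h * dh:(h + 1) * dh]
        attn = softmax(q @ k.T / np.sqrt(dh))  # (Lq, Lk) attention weights
        out[:, h * dh:(h + 1) * dh] = attn @ v
    return out

rng = np.random.default_rng(1)
eeg_feats = rng.standard_normal((10, 64))      # hypothetical EEG feature tokens
face_feats = rng.standard_normal((6, 64))      # hypothetical facial-expression tokens
fused = cross_attention(eeg_feats, face_feats)
print(fused.shape)                             # one fused representation per EEG token
```

Running this fusion in both directions for each pair of modalities and concatenating the results would yield the three groups of fused features the abstract describes, before the final weight-redistribution stage.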
Keywords/Search Tags: EEG signal, peripheral physiological signal, attention mechanism, multimodal fusion