
EEG Emotion Recognition Methods Based on Temporal-Frequency-Spatial Multidimensional Fusion

Posted on: 2024-06-19
Degree: Master
Type: Thesis
Country: China
Candidate: F C Cui
GTID: 2530307136993599
Subject: Master of Electronic Information (Professional Degree)
Abstract/Summary:
Emotion is the physiological and psychological response of human beings to external stimuli. As an objective physiological signal, EEG is easy to acquire and difficult to fake, and emotion recognition based on EEG signals has become a research hotspot in computer science, neuroscience, and psychology. Mainstream emotion recognition methods are currently based on deep learning, such as convolutional neural networks, recurrent neural networks, and graph neural networks. Deep neural networks have strong feature representation ability, can process large-scale EEG data, and automatically extract fine-grained emotional information while modeling the spatial structure and temporal dynamics of EEG signals, achieving high recognition accuracy. However, further work is needed on the neural mechanisms of brain networks and on the fusion of multimodal features. Based on deep learning, this thesis fuses temporal, frequency, and spatial features to mine the emotional information in EEG signals, and proposes three emotion recognition methods.

(1) The first model, DE-CNN-BiLSTM, is an EEG emotion classification method that integrates the complexity, spatial, and temporal features of EEG emotion signals. Traditional EEG emotion classification models usually extract features in the time domain or the frequency domain. This model instead extracts features with differential entropy, which better characterizes the complexity of EEG signals and yields more comprehensive and detailed emotional information. The convolutional layers of a convolutional neural network model the spatial structure of the signal and capture its spatial features, a bidirectional LSTM learns the forward and backward dynamics of the signal to extract its temporal features, and a softmax classifier distinguishes the emotion categories.
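To make the DE-CNN-BiLSTM pipeline concrete, the sketch below shows how differential-entropy features might be computed from band-filtered EEG windows and passed through a small CNN + BiLSTM classifier. This is a minimal illustration rather than the thesis's exact architecture: the frequency bands, the 9x9 electrode grid, the layer sizes, and the three-class output are assumptions made for the example.

```python
# Minimal sketch (not the author's exact model): differential-entropy (DE) features
# fed to a CNN + BiLSTM classifier. Grid layout, bands and sizes are illustrative.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import butter, filtfilt

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def differential_entropy(eeg, fs=200):
    """eeg: (channels, samples) for one window -> DE per (band, channel).
    Under a Gaussian assumption, DE = 0.5 * ln(2 * pi * e * sigma^2)."""
    feats = []
    for lo, hi in BANDS.values():
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, eeg, axis=-1)
        feats.append(0.5 * np.log(2 * np.pi * np.e * filtered.var(axis=-1)))
    return np.stack(feats)  # (bands, channels); mapping channels to a 2D grid is assumed

class DECNNBiLSTM(nn.Module):
    """CNN models the spatial layout of DE features; a BiLSTM models their evolution
    across consecutive windows; the softmax (inside the loss) gives the emotion class."""
    def __init__(self, n_bands=5, hidden=64, n_classes=3):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(n_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1))
        self.bilstm = nn.LSTM(64, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                 # x: (batch, windows, bands, H, W)
        b, t = x.shape[:2]
        spatial = self.cnn(x.flatten(0, 1)).flatten(1)     # (b*t, 64) spatial features
        temporal, _ = self.bilstm(spatial.view(b, t, -1))   # (b, t, 2*hidden)
        return self.head(temporal[:, -1])                   # class logits
```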
(2) The second model, PTE-GAT, is an emotion classification model based on the structural features of brain network graphs. Whereas the input of model (1) is an image-like pixel matrix, the input of this model is a non-Euclidean graph structure in which the connections between nodes can adapt, so the graph better expresses the characteristics of EEG signals. The model uses phase transfer entropy (PTE) to measure the interaction between electrodes and the direction of information transfer, and uses the PTE values as edge weights to construct a directed, weighted emotional brain network. A graph attention network aggregates the features of neighboring nodes, adaptively learns connection weights between nodes through the attention mechanism, and adds them to the PTE values to better capture the interactions between nodes. The effectiveness of the algorithm is verified by cross-trial experiments on the SEED emotion dataset, where the average accuracy over 15 subjects reaches 87.11%.

(3) The third model, PTE-GAT-BiLSTM, adds a temporal feature extraction module, BiLSTM, to model (2) in order to analyze the temporal dynamics of the brain network. This model not only extracts the spatial structural features of brain network nodes but also analyzes the temporal evolution of the brain network; it considers both the spatial and temporal characteristics of the brain network and links the past and future temporal features of each node to represent information. Its average emotion recognition accuracy on the SEED dataset reaches 90.63%, which significantly improves recognition accuracy and shows good generalization performance.
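As a rough illustration of the PTE-GAT-BiLSTM idea, the sketch below adds learned graph-attention scores to a precomputed phase-transfer-entropy adjacency matrix, applies the resulting attention to each time window's electrode graph, and runs a BiLSTM over the per-window graph embeddings. It is a simplified, hypothetical implementation: the PTE matrix is assumed to be computed elsewhere, a single attention head is used, and the node count (62 electrodes as in SEED), hidden sizes, and class count are illustrative.

```python
# Simplified sketch of the PTE-GAT-BiLSTM idea; not the thesis's exact implementation.
import torch
import torch.nn as nn

class PTEGATLayer(nn.Module):
    """Single-head graph attention over EEG electrodes. Learned attention scores are
    added to the directed, weighted PTE adjacency before normalisation."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)
        self.act = nn.LeakyReLU(0.2)

    def forward(self, h, pte):            # h: (batch, nodes, in_dim); pte: (nodes, nodes)
        z = self.W(h)                                   # (b, n, out)
        n = z.size(1)
        zi = z.unsqueeze(2).expand(-1, -1, n, -1)       # target node i
        zj = z.unsqueeze(1).expand(-1, n, -1, -1)       # source node j
        e = self.act(self.a(torch.cat([zi, zj], dim=-1))).squeeze(-1)  # (b, n, n)
        # Assumed convention: entry (i, j) scores the edge j -> i; softmax over sources.
        alpha = torch.softmax(e + pte, dim=-1)          # fuse learned scores with PTE
        return torch.relu(alpha @ z)                    # aggregate neighbour features

class PTEGATBiLSTM(nn.Module):
    """GAT over each time window's brain graph, then a BiLSTM across windows."""
    def __init__(self, in_dim, gat_dim=32, hidden=64, n_nodes=62, n_classes=3):
        super().__init__()
        self.gat = PTEGATLayer(in_dim, gat_dim)
        self.bilstm = nn.LSTM(n_nodes * gat_dim, hidden,
                              batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x, pte):            # x: (batch, windows, nodes, in_dim)
        b, t, n, _ = x.shape
        g = self.gat(x.flatten(0, 1), pte)              # (b*t, n, gat_dim)
        g = g.reshape(b, t, n * g.size(-1))             # per-window graph embedding
        temporal, _ = self.bilstm(g)                    # past and future context per window
        return self.head(temporal[:, -1])               # class logits
```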
Keywords/Search Tags:EEG emotion classification, convolutional neural network, bidirectional long short-term memory network, phase transfer entropy, graph attention network