Emotion is one of the most important characteristics of human beings. In-depth study of emotion classification helps advance artificial intelligence and human-computer interaction, and offers potential convenience for the diagnosis and treatment of psychological disorders in psychiatry. Virtual reality (VR) is a superior emotion-inducing medium because its stereoscopic, realistic environment gives users an immersive and intuitive experience. At the same time, in emotion classification research, physiological signals such as EEG and eye movement are widely used because of their authenticity and directness. Therefore, this thesis focuses on emotion classification based on forehead EEG and eye movement under VR visual evocation.

A survey of related work reveals the following problems:
1) Physiological signal datasets with emotion labels collected under VR visual evocation are almost nonexistent, which hinders algorithm research.
2) In EEG-based emotion classification, research on algorithms that use only two frontal EEG channels is very scarce compared with traditional multi-channel methods, which makes practical operation and application inconvenient. Moreover, owing to the particular VR setting, the collected EEG signals contain heavy noise from head movement and equipment current, and denoising algorithms for VR-based EEG are lacking.
3) The relationship between eye-movement position data and emotion remains highly uncertain.

To address these problems, this thesis carries out the following work:
1) A VR image library containing 36 images of three different emotions is created in accordance with international standards, and an automatic, interactive VR visual-evocation experimental environment is built. With 24 subjects participating in the experiment, EEG and eye-movement datasets with emotion labels are created.
2) For emotion classification based on forehead EEG, an algorithm using temporal, spatial, and frequency-domain features of two-channel forehead EEG is proposed on the DEAP database. The average classification rate reaches 75.18%, comparable to the traditional 32-channel method, while reducing the number of channels to be processed and the feature-extraction computation. On the VR visually evoked EEG dataset, two denoising algorithms based on the denoising autoencoder (DAE) are proposed, which largely remove the noise caused by head motion and VR equipment. With a DAE+GBDT fusion model, an average recognition rate of 76.88% is obtained, verifying the effectiveness and generality of the classification algorithm (a minimal sketch of such a pipeline is given after this list).
3) In the analysis of eye-movement signals, the relationship between eye-movement position signals and emotion is preliminarily explored with an RNN model, and individual differences are found, indicating a direction for further research (a sketch of such a model is also given below).
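The following is a minimal, illustrative sketch of the kind of DAE+GBDT pipeline described in point 2: a small fully connected denoising autoencoder maps noisy two-channel forehead EEG windows to cleaner ones, and a gradient-boosting classifier is trained on simple band-power features of the denoised windows. The window length, network sizes, frequency bands, sampling rate, and synthetic data are assumptions for illustration, not the parameters used in the thesis.

```python
# Hypothetical DAE + GBDT sketch for two-channel forehead EEG (illustrative only).
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import GradientBoostingClassifier

WIN = 256  # samples per window, per channel (assumed)

class DAE(nn.Module):
    """Fully connected denoising autoencoder over flattened 2-channel windows."""
    def __init__(self, n_in=2 * WIN, n_hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_in, n_hidden), nn.ReLU())
        self.decoder = nn.Linear(n_hidden, n_in)

    def forward(self, x):
        return self.decoder(self.encoder(x))

def train_dae(noisy, clean, epochs=50, lr=1e-3):
    """noisy, clean: float32 arrays of shape (n_windows, 2 * WIN)."""
    model, loss_fn = DAE(), nn.MSELoss()
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    x, y = torch.from_numpy(noisy), torch.from_numpy(clean)
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)   # reconstruct clean windows from noisy ones
        loss.backward()
        opt.step()
    return model

def band_power_features(windows, fs=128):
    """Crude per-channel band-power features (theta/alpha/beta) via FFT."""
    feats = []
    for w in windows.reshape(len(windows), 2, WIN):
        row = []
        for ch in w:
            spec = np.abs(np.fft.rfft(ch)) ** 2
            freqs = np.fft.rfftfreq(WIN, d=1.0 / fs)
            for lo, hi in [(4, 8), (8, 13), (13, 30)]:
                row.append(spec[(freqs >= lo) & (freqs < hi)].sum())
        feats.append(row)
    return np.asarray(feats)

if __name__ == "__main__":
    # Tiny synthetic stand-in for the real dataset, just to show the pipeline end to end.
    rng = np.random.default_rng(0)
    clean = rng.standard_normal((64, 2 * WIN)).astype(np.float32)
    noisy = (clean + 0.5 * rng.standard_normal(clean.shape)).astype(np.float32)
    labels = rng.integers(0, 3, size=64)  # three assumed emotion classes
    dae = train_dae(noisy, clean, epochs=10)
    denoised = dae(torch.from_numpy(noisy)).detach().numpy()
    clf = GradientBoostingClassifier().fit(band_power_features(denoised), labels)
    print("train accuracy:", clf.score(band_power_features(denoised), labels))
```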
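Similarly, a minimal sketch of the kind of RNN mentioned in point 3, mapping a sequence of (x, y) gaze positions to an emotion class. The single-layer GRU, hidden size, sequence length, and three-class head are assumptions; the actual model and preprocessing in the thesis may differ. Given the individual differences reported, such a model would plausibly be trained and evaluated per subject.

```python
# Hypothetical RNN sketch for eye-movement position sequences (illustrative only).
import torch
import torch.nn as nn

class GazeRNN(nn.Module):
    """GRU over (x, y) gaze-position sequences; last hidden state -> emotion logits."""
    def __init__(self, n_classes=3, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(input_size=2, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, seq):              # seq: (batch, time, 2)
        _, h = self.rnn(seq)             # h: (1, batch, hidden)
        return self.head(h.squeeze(0))   # logits: (batch, n_classes)

# Usage with dummy data (batch of 8 sequences, 300 time steps each):
model = GazeRNN()
logits = model(torch.randn(8, 300, 2))
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 3, (8,)))
```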
The work in this thesis lays a solid foundation for creating physiological signal databases with emotion labels under VR-induced states, and provides a practical scheme for denoising VR-based EEG. The classification study offers a strong reference for emotion classification research based on two-channel forehead EEG, which benefits the development of wearable emotion-detection devices. At the same time, it constitutes a positive and useful exploration of emotion classification based on eye-movement data.