
Research On Emotion Recognition Of Eye Movement Signals And Electrodermal Activity Signals For Online Learning Scenarios

Posted on: 2024-07-29    Degree: Master    Type: Thesis
Country: China    Candidate: Z Li    Full Text: PDF
GTID: 2557307139458504    Subject: Computer technology
Abstract/Summary:
To address the problem that, in online learning scenarios, teachers cannot understand students' emotional states in a timely manner during the learning process because instruction is asynchronous and lacks face-to-face communication, which may lead to poor learning outcomes or even course dropout, this paper conducts a bimodal emotion recognition study using eye movement signals and electrodermal activity signals, which are more objective indicators of human emotion than speech and facial expressions. Specifically, the study covers the following aspects:

(1) A scientifically sound data collection experiment was designed based on relevant theories of psychology and physiology, and 68 college students were recruited as subjects to collect their eye movement and electrodermal activity signals. On this basis, a realistic multimodal emotion dataset for online learning scenarios was established, comprising more than 1000 minutes of facial video, eye movement signal data, and electrodermal activity signal data; the physiological signal data in the dataset were preprocessed for further research.

(2) Building on manually extracted shallow features of the physiological signals in the self-built dataset, and using convolutional neural networks and gated recurrent units to extract deep features, a multi-channel dual-layer fusion feature extraction model was proposed. The deep and shallow features extracted from the three channels were fused into a multi-channel dual-layer fusion feature, which was fed to three machine learning classifiers (K-nearest neighbors, decision tree, and random forest) to classify four learning emotions: happiness, boredom, confusion, and interest. The random forest achieved the highest classification accuracy of the three classifiers, 90.67%, performing best on happiness and worst on interest. In addition, recognition accuracy with the multi-channel dual-layer fusion feature extraction model was 18.39% and 8.36% higher than with single-channel and dual-channel features, respectively, a significant improvement.

(3) Following the design ideas of the multi-channel dual-layer fusion feature extraction model, a three-branch parallel convolutional neural network emotion classification model was proposed, taking the multi-channel dual-layer fusion features extracted above as input to classify the four learning emotions; the model was also tested on public datasets. The three-branch parallel convolutional neural network model achieved an accuracy of 92.73% and, compared with traditional machine learning and classical deep learning classifiers, delivers better classification performance with fewer computing resources. It also achieved 79.3% and 82.2% classification accuracy on the public datasets, indicating good generalization ability.

This work enriches the physiological-signal emotion datasets available at home and abroad and provides reference value for the fields of affective computing and online education, with both theoretical and practical significance.
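For concreteness, the following is a minimal sketch of the kind of multi-channel dual-layer fusion pipeline described in part (2): a CNN followed by a GRU extracts a deep feature per channel, that deep feature is concatenated with handcrafted shallow features, the three channels are fused into one vector, and a random forest classifies the result. The abstract does not disclose the actual architecture, so the framework (PyTorch/scikit-learn), module names, layer sizes, and feature dimensions below are assumptions for illustration, not the author's implementation.

```python
# Illustrative sketch only: channel contents, layer sizes, and feature
# dimensions are assumptions; the thesis abstract does not specify them.
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier

class ChannelDeepExtractor(nn.Module):
    """CNN + GRU deep-feature extractor for one physiological-signal channel."""
    def __init__(self, in_channels=1, deep_dim=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.gru = nn.GRU(input_size=64, hidden_size=deep_dim, batch_first=True)

    def forward(self, x):                       # x: (batch, 1, time)
        feats = self.cnn(x)                     # (batch, 64, time')
        _, h = self.gru(feats.transpose(1, 2))  # last hidden state: (1, batch, deep_dim)
        return h.squeeze(0)                     # (batch, deep_dim) deep feature

def fuse_dual_layer(signals, shallow_feats, extractors):
    """Concatenate deep (CNN+GRU) and shallow (handcrafted) features per channel,
    then concatenate across the three channels into one fusion vector."""
    per_channel = []
    for sig, shallow, extractor in zip(signals, shallow_feats, extractors):
        with torch.no_grad():
            deep = extractor(sig)                          # (batch, deep_dim)
        per_channel.append(torch.cat([deep, shallow], dim=1))
    return torch.cat(per_channel, dim=1).numpy()           # multi-channel dual-layer feature

# Toy usage with random arrays standing in for preprocessed signal windows.
batch, time_steps, shallow_dim = 16, 512, 10
extractors = [ChannelDeepExtractor() for _ in range(3)]
signals = [torch.randn(batch, 1, time_steps) for _ in range(3)]
shallow = [torch.randn(batch, shallow_dim) for _ in range(3)]
X = fuse_dual_layer(signals, shallow, extractors)
y = np.random.randint(0, 4, size=batch)   # happiness, boredom, confusion, interest
clf = RandomForestClassifier(n_estimators=100).fit(X, y)
```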
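Part (3)'s three-branch parallel convolutional neural network can likewise be pictured as three convolutional branches applied in parallel to the fused feature vector and merged before a small classification head. The abstract gives only the model's name and its accuracy, so the branch configuration below (kernel sizes 3/5/7, branch width, pooling, and the fully connected head) is an assumed sketch rather than the thesis architecture.

```python
# Illustrative sketch only: branch configuration and head sizes are assumptions.
import torch
import torch.nn as nn

class ThreeBranchParallelCNN(nn.Module):
    """Three parallel 1-D convolutional branches over the fused feature vector,
    concatenated and mapped to the four learning emotions."""
    def __init__(self, num_classes=4):
        super().__init__()
        def branch(kernel_size):
            return nn.Sequential(
                nn.Conv1d(1, 16, kernel_size, padding=kernel_size // 2),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(8),
            )
        # Assumed difference between branches: three receptive-field sizes.
        self.branches = nn.ModuleList([branch(k) for k in (3, 5, 7)])
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 16 * 8, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x):                          # x: (batch, feature_dim) fused features
        x = x.unsqueeze(1)                         # (batch, 1, feature_dim) for Conv1d
        outs = [b(x) for b in self.branches]       # each branch: (batch, 16, 8)
        return self.head(torch.cat(outs, dim=1))   # logits over the 4 emotions

model = ThreeBranchParallelCNN()
logits = model(torch.randn(2, 222))  # 222 = 3 * (64 deep + 10 shallow), continuing the sketch above
```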
Keywords/Search Tags: emotion recognition, eye movement signal, electrodermal activity signal, feature extraction, deep learning