
Emotional Analysis And Prediction Method Based On ECG Signals

Posted on: 2022-03-16
Degree: Master
Type: Thesis
Country: China
Candidate: Y Zhu
Full Text: PDF
GTID: 2505306551453494
Subject: Master of Engineering
Abstract/Summary:
Affective computing is a vital means by which machines perceive, understand, and express human emotions, and it is a key technology for advanced human-computer interaction. As human-computer interaction has developed, emotional state recognition has attracted wide attention: it advances human-computer interaction and deepens the understanding of human emotions, and an accurate grasp of a user's emotional state endows machines with emotional attributes, making "artificial intelligence" more anthropomorphic. At present, it remains a great challenge for machines to understand complex emotions without human intervention. Human emotions are generated and expressed in a multimodal way, and they can be identified mainly from external behaviour and from physiological responses. Physiological responses are more authentic than external expressions of emotion, and experiments have shown that recognizing emotional states from Electrocardiogram (ECG) signals is feasible and efficient. Because ECG signals are easy to collect and widely used, studying methods that identify emotions from ECG signals is of practical significance. Researchers have already made considerable progress in ECG-based emotion recognition, but existing methods still have limitations in feature extraction, feature fusion, cross-modal fusion, and related sub-problems. Aiming at the practical needs of ECG emotion recognition and the limitations of existing work, this thesis studies ECG emotion recognition methods based on machine learning and deep learning.

This thesis proposes a multimodal emotion recognition method based on ECG signals and facial videos. The method first introduces a global emotional feature extraction algorithm tailored to the different feature dimensions of ECG signals. To integrate ECG signals and facial videos and better recognize authentic emotion, an Emo Tri-Net network suited to measuring the distance between different facial expressions is designed, and the emotional keyframes of the facial video are extracted; the key segments of the ECG signal are then obtained from these keyframes. Different feature extraction methods are applied to the key segments, including hand-crafted features in three dimensions, a convolutional neural network, and a semi-supervised learning method. Finally, machine learning methods classify the emotional state from the extracted features. All experiments are conducted on large-scale public datasets. The results show that the ECG features extracted by the proposed method are strongly correlated with emotional state, and that combining them with the ECG key-segment recognition technique effectively improves emotion recognition accuracy.
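The pipeline outlined in the abstract (facial-video keyframes → aligned ECG key segments → hand-crafted features → machine-learning classifier) can be illustrated with a minimal sketch. The sampling rate, window length, feature set, and all function and variable names below are illustrative assumptions, not the thesis's actual implementation; the Emo Tri-Net keyframe model, the three-dimensional manual features, and the CNN and semi-supervised components are not reproduced here, and the keyframe times are assumed to be supplied by the facial-expression model.

```python
# Hypothetical sketch: slice ECG key segments around facial keyframes,
# compute simple time-domain features, and classify with an SVM.
import numpy as np
from scipy.signal import find_peaks
from sklearn.svm import SVC

FS = 256          # assumed ECG sampling rate (Hz)
WIN = 5 * FS      # assumed 5-second key segment centred on each keyframe

def key_segment(ecg, keyframe_time):
    """Cut the ECG key segment centred on a facial-video keyframe (seconds)."""
    center = int(keyframe_time * FS)
    start = max(0, center - WIN // 2)
    return ecg[start:start + WIN]

def time_domain_features(segment):
    """Toy hand-crafted features: R-peak based heart-rate-variability stats."""
    peaks, _ = find_peaks(segment, distance=int(0.4 * FS),
                          height=np.mean(segment) + np.std(segment))
    rr = np.diff(peaks) / FS                       # RR intervals in seconds
    if len(rr) < 2:
        return np.zeros(3)
    return np.array([rr.mean(),                    # mean RR interval
                     rr.std(),                     # SDNN
                     np.sqrt(np.mean(np.diff(rr) ** 2))])  # RMSSD

def build_dataset(ecg_records, keyframe_times, labels):
    """Stack one feature vector per (ECG record, keyframe) pair."""
    X = [time_domain_features(key_segment(e, t))
         for e, t in zip(ecg_records, keyframe_times)]
    return np.vstack(X), np.asarray(labels)

# Usage, given keyframe times from the facial-expression model:
# X, y = build_dataset(ecg_records, keyframe_times, labels)
# clf = SVC(kernel="rbf").fit(X, y)
```

In the thesis the hand-crafted features are only one of several extraction routes; a CNN or semi-supervised encoder would replace `time_domain_features` with learned representations over the same key segments before the final classifier.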
Keywords/Search Tags: ECG Signals, Affective Computing, Emotional State Recognition, Facial Expression Recognition, Multimodal