
A Study On Modality-general Representations Of Emotions Based On Activity Patterns Evoked By Visual And Auditory Modalities

Posted on: 2019-11-28
Degree: Master
Type: Thesis
Country: China
Candidate: L J Cao
Full Text: PDF
GTID: 2404330626452098
Subject: Computer technology
Abstract/Summary:
Affective computing aims to give computers the ability to observe emotions, understand emotions, and generate various emotional expressions, much as the human brain does, ultimately enabling computers to interact with humans in an intimate, natural, and vivid way. However, our current understanding of human emotion is very limited. Exploring the neural mechanisms of emotion perception through cognitive neuroscience is therefore of great significance to the development of affective computing.

Emotions can be perceived from the face, the body, and the whole person, yet previous studies of abstract emotion representations focused only on facial and bodily emotions. It remained unclear whether specific brain regions represent emotions at an abstract level regardless of all three sensory cues. In the first study, functional magnetic resonance imaging (fMRI) data were collected while participants classified the emotions (angry, fearful, and happy) expressed in videos of faces, bodies, and whole persons. Whole-brain representational similarity analysis (RSA) revealed an emotion-specific but stimulus-category-independent neural representation in the left postcentral gyrus, the left inferior parietal lobule (IPL), and the right superior temporal sulcus (STS). Further cluster-based multivoxel pattern analysis (MVPA) showed that, in cross-modal classification, only the left postcentral gyrus could distinguish the three emotions, as well as positive versus negative valence. This study suggests that abstract representations of the three emotions extend from face and body stimuli to whole-person stimuli in the left postcentral gyrus.

Emotions can also be perceived through the auditory modality, and emotions in both the visual and auditory modalities can be expressed by different types of stimuli (face, body, voice, or music). The second study therefore tested for modality-general representations of valence that are shared not only across the visual and auditory modalities but also across different stimulus types within each modality. Valence was decomposed into three levels: positive, neutral, and negative. fMRI data were collected while participants classified the valence expressed by silent videos of faces and bodies, as well as by voices and music clips. Pattern correlation analysis revealed modality-general representations of valence in the bilateral postcentral gyrus, the middle temporal gyrus (MTG), and the middle frontal gyrus (MFG). Further cross-modal MVPA showed that only the left postcentral gyrus and the MTG could distinguish positive, neutral, and negative valence. A univariate analysis confirmed that these two regions showed no valence-specific activation differences across modalities. This study confirms that the two clusters carry valence information, extending previous findings to modality-general representations of valence that are independent of stimulus type within the visual and auditory modalities.
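To make the cross-modal classification logic concrete, here is a minimal sketch in Python with scikit-learn: a classifier is trained on multivoxel patterns evoked by one stimulus category and tested on patterns from another, so above-chance generalization implies a category-independent emotion code. All data here are synthetic, and the voxel counts, trial counts, and category names are illustrative assumptions, not values from the thesis.

```python
# Sketch of cross-category MVPA: train on one stimulus category,
# test on another. Synthetic data only; ROI size and trial counts
# are hypothetical.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_trials, n_voxels = 90, 200           # hypothetical ROI dimensions
emotions = np.repeat([0, 1, 2], 30)    # angry / fearful / happy

def simulate_patterns(category_shift):
    """Synthetic multivoxel patterns sharing one emotion signal."""
    signal = np.outer(emotions, np.ones(n_voxels)) * 0.5
    noise = rng.normal(size=(n_trials, n_voxels))
    return signal + category_shift + noise

face_patterns = simulate_patterns(0.0)  # e.g., face trials
body_patterns = simulate_patterns(0.3)  # e.g., body trials

# Train on one category, test on the other, in both directions,
# then average the two generalization accuracies.
clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(face_patterns, emotions)
acc_fb = clf.score(body_patterns, emotions)
clf.fit(body_patterns, emotions)
acc_bf = clf.score(face_patterns, emotions)
print(f"cross-category accuracy: {(acc_fb + acc_bf) / 2:.2f}")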
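The pattern correlation analysis of the second study can be sketched in the same spirit: if a region carries a modality-general valence code, its condition-mean pattern for a given valence in one modality should correlate more strongly with the same valence in the other modality than with different valences. The arrays below are synthetic stand-ins, not data from the study.

```python
# Sketch of pattern correlation analysis across modalities.
# Synthetic condition-mean patterns; sizes are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_voxels = 200
valence_codes = rng.normal(size=(3, n_voxels))  # positive / neutral / negative

def condition_means(modality_shift):
    """Condition-mean patterns sharing the valence code across modalities."""
    return valence_codes + modality_shift + 0.8 * rng.normal(size=(3, n_voxels))

visual = condition_means(0.0)
auditory = condition_means(0.5)

# 3x3 cross-modal correlation matrix: rows = visual conditions,
# columns = auditory conditions.
corr = np.corrcoef(visual, auditory)[:3, 3:]
within = np.mean(np.diag(corr))                  # same valence
between = np.mean(corr[~np.eye(3, dtype=bool)])  # different valence
print(f"within-valence r = {within:.2f}, between-valence r = {between:.2f}")
```

A within-valence correlation reliably exceeding the between-valence correlation is the signature of a modality-general valence representation in a region.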
Keywords/Search Tags: Visual and auditory perception, Emotion, Valence, Postcentral gyrus, Functional magnetic resonance imaging, Representational similarity analysis, Pattern correlation analysis