The non-verbal expression and perception of emotions rely on a multichannel mode of communication. Emotional information can be perceived from several sources simultaneously, for example facial expression, body posture and movement, voice prosody, and even physiological signals. Both emotional body language and facial expression, as integrated parts of the whole body, contribute to conveying the emotional state of an individual. With ongoing research into emotional body language and a deepening understanding of face perception and body perception, research on the integration of emotional body language and facial expression is booming. Until now, such research has focused primarily on behavioral data and an early component of event-related potentials (the P1), while the underlying neural mechanisms remain unknown.

Therefore, this project aimed to design an electrophysiological experiment on emotional perception, and applied event-related potentials (ERPs) and brain mapping methods to explore the electrophysiological mechanisms involved in the integration of emotional body language and facial expression in human observers.

The main experimental work included:

(1) Design of the emotional perception experiments. A combination of four facial expressions and two bodily expressions was used to create face-body compound stimuli, and with these stimuli a perception experiment with a forced-choice facial-expression judgment task was designed.

(2) Verification of the behavioral results. The influence of emotional body language on the perception of facial expressions was demonstrated through behavioral response accuracy and reaction times (illustrated in the first sketch below).

(3) Exploration of the electrophysiological mechanism. The influence of different emotional body expressions on the perception of facial expressions was compared through analysis of ERP components (N1, N170, P2, LPC) and brain mapping of ERP difference waves (illustrated in the second sketch below).

The key conclusions from the experiments include the following five points:

1) The behavioral results showed that judgments of facial expressions were biased by the accompanying emotional body language, indicating that the perception of facial expressions is influenced by emotional body language. Moreover, this influence was a function of the ambiguity of the facial expression: the more ambiguous the face, the stronger the influence of the body language.

2) The N170 evoked by facial expressions on happy bodies had a significantly larger amplitude and shorter latency than that evoked by facial expressions on sad bodies, indicating that observers were more sensitive to happy bodies. Hence, the N170, usually regarded as specific to structural encoding, may also be related to affective processing.

3) The same pattern was found for the P2 as for the N170, indicating that the P2 is a marker component of the integration of emotional body language and facial expression.

4) The brain mapping of ERP difference waves showed that the integration of emotional body language and facial expression occurred in the 100-300 ms time window over the fronto-central region. This result was consistent with our expectation and with previous studies.

5) Finally, an "emotion congruence effect" was observed in the 300-800 ms time window. This result verifies previous conclusions to a certain extent.

In summary, this project is a new exploration of facial and bodily emotion cognition research, and provides some new insights into the underlying neural mechanisms.
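
As a concrete illustration of the behavioral analysis in (2), the following is a minimal sketch of how accuracy and reaction times could be summarized per face-body congruence condition. The trial log, its column names (face, body, response, rt_ms), and the values are illustrative assumptions, not the study's actual data format.

```python
import pandas as pd

# Hypothetical trial log: columns and values are assumptions for illustration.
trials = pd.DataFrame({
    "face": ["happy", "happy", "sad", "sad", "fearful", "fearful"],
    "body": ["happy", "sad", "happy", "sad", "happy", "sad"],
    "response": ["happy", "happy", "sad", "sad", "happy", "fearful"],
    "rt_ms": [512, 598, 543, 529, 610, 655],
})
trials["congruent"] = trials["face"] == trials["body"]
trials["correct"] = trials["response"] == trials["face"]

# Accuracy and mean reaction time per congruence condition: a body-induced
# bias would show up as lower accuracy / slower RTs on incongruent trials.
summary = trials.groupby("congruent").agg(
    accuracy=("correct", "mean"), mean_rt_ms=("rt_ms", "mean"))
print(summary)
```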
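
For the ERP analysis in (3), the following is a minimal sketch of how a difference wave between the two body conditions could be computed, an N170 peak extracted, and scalp maps of the 100-300 ms window plotted, assuming the MNE-Python library. The synthetic noise data, channel selection, and condition labels are assumptions; the study's actual preprocessing and analysis pipeline is not specified here.

```python
import numpy as np
import mne

# Synthetic stand-in data: random noise instead of recorded EEG, so the
# pipeline runs end to end without the original recordings.
rng = np.random.default_rng(0)
sfreq, tmin, n_times = 500.0, -0.1, 451          # -100..800 ms at 500 Hz
ch_names = ["Fz", "Cz", "Pz", "P7", "P8", "PO7", "PO8", "Oz"]
info = mne.create_info(ch_names, sfreq, ch_types="eeg")
info.set_montage("standard_1020")                # electrode positions for topomaps

n_per_cond = 40                                  # assumed trials per body condition
data = rng.normal(0.0, 1e-6, size=(2 * n_per_cond, len(ch_names), n_times))
events = np.column_stack([
    np.arange(2 * n_per_cond) * n_times,         # nominal sample indices
    np.zeros(2 * n_per_cond, dtype=int),
    np.repeat([1, 2], n_per_cond),               # 1 = happy body, 2 = sad body
])
event_id = {"happy_body": 1, "sad_body": 2}
epochs = mne.EpochsArray(data, info, events, tmin=tmin, event_id=event_id)

# Condition averages and the happy-minus-sad difference wave.
ev_happy = epochs["happy_body"].average()
ev_sad = epochs["sad_body"].average()
diff = mne.combine_evoked([ev_happy, ev_sad], weights=[1, -1])

# N170: negative peak over occipito-temporal sites around 130-200 ms.
ch, lat, amp = diff.copy().pick(["P7", "P8", "PO7", "PO8"]).get_peak(
    tmin=0.13, tmax=0.20, mode="neg", return_amplitude=True)
print(f"N170 difference peak: {amp * 1e6:.2f} uV at {lat * 1e3:.0f} ms on {ch}")

# Scalp maps in the 100-300 ms window where integration was reported.
diff.plot_topomap(times=[0.10, 0.17, 0.25], ch_type="eeg")
```

On real data, the N170 peak would be read from the condition averages as well as the difference wave, and the fronto-central 100-300 ms effect would be assessed statistically rather than by inspection alone.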