
Research On The Neural Mechanism Of Audiovisual Multisensory Speech Integration

Posted on: 2022-02-15    Degree: Master    Type: Thesis
Country: China    Candidate: X J Huang    Full Text: PDF
GTID: 2480306524491764    Subject: Master of Engineering
Abstract/Summary:
Integration of multisensory speech information is crucial for human communication. In recent years, researchers have focused on audiovisual integration; however, the neural mechanism underlying audiovisual speech integration is not completely clear. With the development of neuroimaging technology, especially the wide application of magnetic resonance imaging in brain science, there is now a reliable way to investigate audiovisual speech integration. In this study, we combined behavioral measures and magnetic resonance images to explore the neural mechanism underlying audiovisual speech integration at both the structural and the functional level.

First, we investigated the interactions among the superior temporal gyrus (multisensory region), precentral gyrus (motor cortex), middle temporal gyrus (auditory region), and fusiform gyrus (visual region) during audiovisual speech integration; all regions were selected from the left hemisphere. A dynamic causal modeling (DCM) analysis was applied to task functional magnetic resonance imaging (task-fMRI) data from 63 normal adults. Bayesian model selection favored a winning model that included bidirectional endogenous connections among the four regions, demonstrating that both the motor cortex and the superior temporal gyrus play an important role in audiovisual speech perception. Moreover, we found that the motor cortex modulated auditory and visual inputs, whereas the superior temporal gyrus integrated audiovisual information during audiovisual speech perception. In particular, the motor cortex promoted auditory inputs and inhibited visual inputs, so that more auditory information but less visual information was integrated in the superior temporal gyrus. In addition, the connectivity between the motor cortex and the superior temporal gyrus was correlated with individual behavioral performance in audiovisual speech integration.

Second, we implemented a voxel-based morphometry (VBM) analysis on structural magnetic resonance imaging data to detect individual morphological differences. Interestingly, we found less gray matter volume in the left precentral gyrus in strong McGurk perceivers than in weak perceivers, and the gray matter volume of the precentral gyrus was correlated with individual McGurk susceptibility. To further examine group differences in resting-state functional connectivity, we subsequently performed a functional connectivity analysis seeded in the left precentral gyrus. The results showed stronger connectivity between the precentral gyrus and visual regions in strong McGurk perceivers than in weak perceivers, and this connectivity was significantly correlated with individual McGurk susceptibility. These findings confirm that the motor cortex is a critical site for audiovisual speech integration.

In conclusion, this study reveals distinct functional roles of the motor cortex and the superior temporal gyrus during audiovisual speech perception: modulation in the motor cortex and integration in the superior temporal gyrus. These findings highlight the importance of the motor cortex and advance the foundational understanding of the neural mechanism underlying audiovisual speech perception.
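The abstract does not reproduce the model equations, but for orientation, a standard DCM for fMRI (as implemented in SPM) describes neuronal dynamics with the bilinear state equation below, where A holds the endogenous connections between the regions, B^(j) the modulation of those connections by experimental input u_j (here, the audiovisual speech conditions), and C the direct driving inputs. This is the generic form, not the specific parameterization estimated in the thesis.

\dot{z} = \Big( A + \sum_{j} u_j B^{(j)} \Big) z + C u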
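As an illustration of the seed-based resting-state analysis described above, the sketch below computes functional connectivity from a left precentral gyrus seed and relates it to per-subject McGurk susceptibility. It is only a minimal outline under stated assumptions: the file names, seed coordinate, region-of-interest choice, and susceptibility scores are placeholders, it assumes a recent version of nilearn, and the thesis's actual preprocessing and statistics are not given in the abstract.

# Minimal sketch of seed-based resting-state functional connectivity;
# all inputs below are hypothetical placeholders, not data from the thesis.
import numpy as np
from scipy import stats
from nilearn.maskers import NiftiSpheresMasker, NiftiMasker  # assumes nilearn >= 0.9

# Placeholder inputs: preprocessed resting-state runs and behavioral scores.
rest_imgs = ["sub-01_rest.nii.gz", "sub-02_rest.nii.gz", "sub-03_rest.nii.gz"]
mcgurk_susceptibility = np.array([0.72, 0.35, 0.58])

# Seed sphere in the left precentral gyrus (illustrative MNI coordinate).
seed_masker = NiftiSpheresMasker(seeds=[(-44, -8, 46)], radius=6, standardize=True)
brain_masker = NiftiMasker(standardize=True)

seed_fc_per_subject = []
for img in rest_imgs:
    seed_ts = seed_masker.fit_transform(img)    # shape (n_timepoints, 1)
    brain_ts = brain_masker.fit_transform(img)  # shape (n_timepoints, n_voxels)
    # Voxel-wise Pearson correlation with the seed (both signals are z-scored).
    corr = np.dot(brain_ts.T, seed_ts).ravel() / seed_ts.shape[0]
    fc_map = np.arctanh(np.clip(corr, -0.999, 0.999))  # Fisher z-transform
    # The thesis would average FC within independently defined visual regions;
    # a whole-brain mean stands in for that step here.
    seed_fc_per_subject.append(fc_map.mean())

# Relate seed-based connectivity to individual McGurk susceptibility.
r, p = stats.pearsonr(seed_fc_per_subject, mcgurk_susceptibility)
print(f"seed FC vs. McGurk susceptibility: r = {r:.2f}, p = {p:.3f}")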
Keywords/Search Tags: audiovisual multisensory speech integration, dynamic causal modeling, voxel-based morphometry, resting-state functional connectivity