
Recalibration In Virtual Sound Localization Via Audiovisual Interactive Training

Posted on: 2016-09-03    Degree: Master    Type: Thesis
Country: China    Candidate: J Zhang    Full Text: PDF
GTID: 2308330479494322    Subject: Acoustics
Abstract/Summary:
In virtual auditory display, the listener's own head-related transfer function (individual HRTF) should be used for signal processing to achieve the best localization of virtual sound. However, measuring or calculating individual HRTFs is complex and time-consuming, so practical virtual auditory display applications tend to use non-individual HRTFs. Studies have shown that virtual auditory display with non-individual HRTFs can cause localization distortion. On the other hand, visual and auditory perception influence each other in the brain, and visual perception can affect the perception of sound localization. Therefore, to improve the accuracy of sound localization in virtual auditory display with non-individual HRTFs, this study uses audiovisual interactive training to recalibrate the localization distortion (including reduced localization accuracy, increased front-back confusion, and so on) introduced by non-individual HRTFs, and thereby to improve sound localization in existing virtual auditory display applications.

First, this study designed and implemented a virtual auditory display training platform based on audiovisual interactive training. The platform consists of a personal computer, an external sound card, a microcontroller, a spatial coordinate frame, LED lights, headphones, and other hardware. Matlab software on the PC controls the sound card and the microcontroller so that the visual signal and the sound signal are presented in the proper order.

Based on this platform, we carried out localization training experiments in the horizontal plane and the median plane. The horizontal-plane results show that after three days of audiovisual training, participants significantly improved their azimuth localization accuracy and reduced front-back confusion in virtual auditory display with non-individual HRTFs. In the median plane, three days of audiovisual training effectively decreased front-back confusion but did not significantly improve elevation localization accuracy. Furthermore, in cooperation with the Department of Otolaryngology of Sun Yat-Sen Memorial Hospital and the Institute of Hearing and Speech-Language Science of Sun Yat-Sen University, we measured subjects' event-related potentials (ERP) before and after audiovisual training, as a preliminary investigation of the changes in brain-wave characteristics caused by the training. In addition, by comparing the spectra of subjects' individual and non-individual HRTFs, we discuss why the improvement in elevation localization accuracy was not obvious after three days of audiovisual training, and we point out that elevation recalibration requires longer training.

This research helps clarify how audiovisual interactive training affects human sound localization, and it provides a virtual auditory display training platform based on audiovisual interactive training. The platform can be embedded directly into existing virtual sound display products to improve sound localization.
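As a rough illustration of the signal processing the abstract refers to, the following minimal Matlab sketch renders a virtual source over headphones by convolving a mono stimulus with a left/right head-related impulse response pair from a non-individual HRTF set. The file name hrir_az30_el0.mat, the field names hrir_l and hrir_r, the sampling rate, and the white-noise stimulus are illustrative assumptions, not details taken from the thesis.

    % Minimal sketch: binaural rendering with a (non-individual) HRTF pair.
    fs = 44100;                        % assumed sampling rate (Hz)
    x  = randn(fs, 1);                 % 1 s of white noise as the test stimulus

    % Load a head-related impulse response pair for one target direction
    % (hypothetical file; assumed to contain the variables hrir_l and hrir_r).
    s = load('hrir_az30_el0.mat');

    % Binaural synthesis: convolve the mono stimulus with the left/right HRIRs.
    yL = conv(x, s.hrir_l);
    yR = conv(x, s.hrir_r);

    % Play over headphones; the listener should hear the source at the direction
    % encoded by the HRIR pair, subject to the localization errors (azimuth error,
    % front-back confusion) that the audiovisual training is meant to recalibrate.
    soundsc([yL yR], fs);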
Keywords/Search Tags:head related transfer function, audiovisual interactive training, sound localization, recalibration