
Multimodal Perception-Driven Human-Robot Interaction Technology And Application

Posted on: 2020-12-29  Degree: Master  Type: Thesis
Country: China  Candidate: A J Zhu  Full Text: PDF
GTID: 2428330599976478  Subject: Computer technology
Abstract/Summary:
The sensing channels and interaction modes of existing human-robot interaction systems are simple and limited: they cannot meet the requirements of complex and varied interaction, and they degrade the user experience. With the popularization of service robots, efficient human-robot collaboration requires robots with strong perceptual capabilities and flexible interaction capabilities. This thesis therefore studies human-robot interaction technology for a motion-aware rehabilitation training robot. It integrates multimodal data, such as EEG and eye-movement data, to improve the robot's recognition of the user's intention and physiological state and to adjust the robot's interaction modes dynamically. The proposed method improves the rehabilitation training effect and the user experience. The main research work covers the following aspects:

(1) Robot-based EEG perception and interaction technology. First, three robot-based stimulation modes were designed that combine limb movement and voice to guide the user through motor-imagery exercises. Time-frequency analysis showed that the robot's multi-mode stimulation influences the event-related desynchronization (ERD) of the EEG signal: compared with the traditional picture and video stimulation modes, the average ERD value increased by 96.87% in the μ-band and 147.28% in the β-band. In EEG classification, the average recognition rate of the three robot-based stimulation modes reached 82.32%, which was 7.89% and 4.32% higher than that of the picture and video stimulation modes, respectively. The robot's multi-mode stimulation therefore elicits more pronounced ERD characteristics and improves the quality of motor-imagery EEG signals. Furthermore, to alleviate user fatigue during training, a fatigue detection method based on the EEG signal was proposed: the characteristic frequency bands of the EEG signal are extracted to compute the user's fatigue value. A cross-stimulation model was then designed, and the experiments showed that the user's fatigue degree was reduced by 15.93%.

(2) Robot-based eye-movement perception and interaction technology. To ensure the quality of motor-imagery training, users need to stay focused. An eye-tracking method based on pupil-center corneal reflection (PCCR) was adopted: by extracting the user's invalid-fixation ratio, total saccade length, and average saccade velocity, the system computes the user's concentration value in real time during robot-assisted training. The experiments showed significant differences in concentration values between users with different levels of concentration. When the system detects that the user's concentration falls below a predefined threshold, the robot prompts the user to refocus with an audio notification.
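The abstract does not give the exact formula for the concentration value. The sketch below shows one plausible way to combine the three gaze features it names (invalid-fixation ratio, total saccade length, average saccade velocity) into a single score; the weights, normalization bounds, and alert threshold are illustrative assumptions, not the thesis's actual parameters.

```python
# Illustrative sketch only: combines the three gaze features mentioned in the
# abstract into one concentration score. Weights, normalization bounds, and
# the alert threshold are assumptions, not the thesis's actual values.
from dataclasses import dataclass

@dataclass
class GazeWindow:
    invalid_fixation_ratio: float   # fraction of fixations off the task area (0..1)
    total_saccade_length: float     # summed saccade amplitude in the window (deg)
    avg_saccade_velocity: float     # mean saccade velocity in the window (deg/s)

def concentration_score(w: GazeWindow,
                        max_saccade_length: float = 200.0,
                        max_saccade_velocity: float = 400.0) -> float:
    """Return a 0..1 score; higher means more focused (hypothetical weighting)."""
    # Normalize each feature to 0..1, where larger values indicate distraction.
    distraction = (
        0.4 * min(w.invalid_fixation_ratio, 1.0) +
        0.3 * min(w.total_saccade_length / max_saccade_length, 1.0) +
        0.3 * min(w.avg_saccade_velocity / max_saccade_velocity, 1.0)
    )
    return 1.0 - distraction

def should_prompt_user(score: float, threshold: float = 0.6) -> bool:
    """Trigger the robot's audio reminder when concentration drops below threshold."""
    return score < threshold
```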
(3) Human-robot interaction application based on multimodal perception. To validate the proposed methods, a prototype system for robot-assisted rehabilitation training was designed and developed. The robot guides the user through motor-imagery exercises with limb movements and audio, while the EEG headset and eye tracker collect data to detect the user's concentration and fatigue in real time and provide dynamic interactive feedback, so as to improve the training outcome. The user study showed that after motor-imagery training with the prototype system, the average accuracy of EEG signal classification reached 82.64%, and the ERD values of the EEG signals in the μ-band and β-band increased by 109.3% and 124%, respectively, which validates that the system has a good training effect.
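For context on the reported ERD percentages, the sketch below shows the standard band-power definition of ERD (relative power decrease during motor imagery versus a rest baseline). The filter order, sampling rate, and band edges (μ ≈ 8-13 Hz, β ≈ 13-30 Hz) are illustrative assumptions; the abstract does not specify the thesis's actual signal-processing parameters.

```python
# Illustrative sketch of the classical band-power ERD computation; parameters
# are assumptions and need not match the thesis's processing pipeline.
import numpy as np
from scipy.signal import butter, filtfilt

def band_power(x: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Mean power of a single-channel signal x in the [lo, hi] Hz band."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return float(np.mean(filtfilt(b, a, x) ** 2))

def erd_percent(baseline: np.ndarray, task: np.ndarray, fs: float,
                band: tuple[float, float]) -> float:
    """ERD% = (baseline power - task power) / baseline power * 100."""
    p_ref = band_power(baseline, fs, *band)
    p_task = band_power(task, fs, *band)
    return (p_ref - p_task) / p_ref * 100.0

# Example usage with assumed band edges and sampling rate:
# fs = 250.0
# erd_mu = erd_percent(rest_segment, imagery_segment, fs, (8.0, 13.0))
# erd_beta = erd_percent(rest_segment, imagery_segment, fs, (13.0, 30.0))
```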
Keywords/Search Tags: brain-computer interaction, eye tracking, human-robot interaction, motor imagery, multimodal perception