
Research On Facial Behavior Modeling Method And Application For Depression Recognition

Posted on: 2023-07-17    Degree: Doctor    Type: Dissertation
Country: China    Candidate: M Q Yang    Full Text: PDF
GTID: 1524306782975459    Subject: Computer application technology
Abstract/Summary:
Depression is a common affective disorder. At present, the screening and diagnosis of depression rely mainly on patients' subjective reports and clinical interviews with psychiatrists, which are subjective and highly dependent on the professional ability and clinical experience of doctors. Professional mental health resources are scarce and unevenly distributed, so many patients are diagnosed late or misdiagnosed and miss the better opportunities for intervention. Because the causes and manifestations of depression are complex, patients' behavioral characteristics vary greatly, and the behavioral characteristics that can be clearly established as biomarkers of depression remain to be explored. Non-verbal behaviors of the facial region, such as eye movements and expressions, carry rich emotional information, are an important channel through which people communicate information and emotion, and provide many clues for the diagnosis and analysis of depression. Facial behavior data also have the advantages of intuitive presentation and easy acquisition. However, eye-movement behavior patterns are difficult to uncover in eye-movement research, and it is difficult to extract features from expression data that effectively represent depression. To address these problems, this thesis explores fixation identification algorithms that enrich the expression of eye-movement behavior patterns, and investigates features that effectively reflect the weak facial expression changes elicited by passive emotional stimuli. The main work and innovations of this thesis are as follows:

(1) To describe eye-movement fixation patterns effectively, this thesis proposes I-SO, a fixation identification algorithm based on spatiotemporal clustering. The algorithm considers the temporal autocorrelation of eye-movement data and introduces the local reachable distance to describe the range of gaze variation within a local time window, thereby identifying fixation events accurately. It further exploits the hierarchical property of the local reachable distance to build an FS-BTree structure that expresses fixation behavior at different spatial resolutions. Fixation behavior features derived from the FS-BTree structure are used for classification modeling, and the experimental results show that these features reflect the differences between subjects' fixation patterns more effectively.

(2) To address the insufficient representation of facial expression changes under the passive stimulus paradigm by existing features, this thesis proposes TOFS, a facial motion feature extraction method based on sparse optical flow. TOFS characterizes facial changes under emotional stimuli through the Main Directional Mean Optical flow (MDMO) and temporal statistical features of key facial regions of interest. Comparison experiments with other classical features and different classification algorithms show that the features extracted by TOFS effectively reflect the facial changes associated with depression and yield better depression recognition performance. In addition, the facial optical flow of depressed subjects showed consistent temporal fluctuations that corresponded to the episodes of the negative emotional stimulation videos, a finding of significance for studying the facial changes of depressed groups under emotional stimulation.
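As an illustration of the MDMO computation described above, the following is a minimal Python sketch: it tracks corner points inside each facial region of interest with sparse (Lucas-Kanade) optical flow, bins the flow vectors by direction, and takes the mean vector of the dominant direction bin as that region's MDMO feature. The function name, ROI layout, tracking parameters, and number of direction bins are illustrative assumptions, not the thesis's exact TOFS pipeline, which additionally aggregates temporal statistics over the flow sequence.

```python
import cv2
import numpy as np

def mdmo_features(prev_gray, next_gray, rois, n_bins=8):
    """Per-ROI Main Directional Mean Optical flow (MDMO) between two
    consecutive uint8 grayscale frames.  `rois` maps a region name to an
    (x, y, w, h) rectangle; the regions and parameters are illustrative."""
    features = []
    for (x, y, w, h) in rois.values():
        mask = np.zeros_like(prev_gray)
        mask[y:y + h, x:x + w] = 255
        # Sparse optical flow: track corner points inside the ROI (Lucas-Kanade).
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                      qualityLevel=0.01, minDistance=3, mask=mask)
        if pts is None:
            features.extend([0.0, 0.0])
            continue
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, pts, None)
        flow = (nxt - pts).reshape(-1, 2)[status.ravel() == 1]
        if len(flow) == 0:
            features.extend([0.0, 0.0])
            continue
        # Bin flow vectors by direction and keep the dominant (main) direction bin.
        angles = np.arctan2(flow[:, 1], flow[:, 0])                 # [-pi, pi]
        bins = ((angles + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
        main_bin = np.bincount(bins, minlength=n_bins).argmax()
        main_flow = flow[bins == main_bin].mean(axis=0)             # ROI's MDMO vector
        features.extend(main_flow.tolist())
    return np.asarray(features)
```

Applied to every consecutive frame pair, a function like this yields a time series of per-ROI MDMO vectors, over which temporal statistics can then be computed in the spirit of TOFS.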
(3) To make full use of the temporal and spatial correlation between the two modalities, eye movement and expression, and to improve depression recognition performance, this thesis proposes a fusion modeling method for eye movement and expression based on sparse canonical correlation analysis, and studies and compares fusion modeling with features of different temporal granularities. To express the temporal properties, the thesis exploits the inherent window structure of the experimental paradigm and the temporal sequence of the facial optical flow. The experimental results show that fusing the FS-BTree-based fixation behavior features with the MDMO features significantly improves depression recognition accuracy.

(4) To acquire eye movements and expressions synchronously, this thesis designs and develops a wearable eye tracker and a multi-modal synchronous data acquisition system, which allow synchronous acquisition of high-precision eye-movement and expression data. The wearable eye tracker avoids occluding key facial areas, allows subjects to wear glasses, and outperforms mainstream portable wearable eye trackers on the market in some performance parameters. The proposed real-time synchronization scheme does not require hardware synchronization support from the acquisition sensors: it achieves microsecond-level time synchronization purely through software and can be applied to other multi-modal acquisition scenarios that require high-precision time synchronization.

In summary, this thesis proposes the I-SO fixation identification algorithm and the FS-BTree fixation behavior representation, which extract eye-movement behavior features at different spatial resolutions and effectively describe eye-movement behavior patterns. It proposes the TOFS feature extraction method, which addresses the insufficient representation of facial expression changes under the passive stimulation paradigm by existing features and achieves better depression recognition performance. It proposes a fusion modeling method for eye movement and expression based on sparse canonical correlation analysis, which exploits the temporal correlation between the two modalities to obtain higher classification accuracy. In addition, a wearable eye tracker and a hard real-time synchronous acquisition system are developed and used for experimental data acquisition and for the application test of the depression recognition prototype system. The research results of this thesis provide a methodological basis and technical support for universal depression recognition based on facial behavior.
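To make the fusion step in (3) concrete, the following is a minimal sketch of sparse canonical correlation analysis, assuming a soft-thresholded power-method formulation over the cross-covariance of the two feature matrices. The function names, penalty values, single canonical component, and the final concatenation of canonical variates are illustrative assumptions rather than the thesis's exact fusion procedure.

```python
import numpy as np

def soft_threshold(a, lam):
    """Element-wise soft-thresholding operator."""
    return np.sign(a) * np.maximum(np.abs(a) - lam, 0.0)

def sparse_cca(X, Y, lam_x=0.1, lam_y=0.1, n_iter=100):
    """Estimate one pair of sparse canonical weight vectors (u, v) for
    feature matrices X (e.g. eye-movement features) and Y (e.g. expression
    features) with a soft-thresholded power method -- a simplified stand-in
    for the sparse CCA formulation used in the thesis."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    C = Xc.T @ Yc                                  # cross-covariance (p x q)
    v = np.random.default_rng(0).standard_normal(C.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        u = soft_threshold(C @ v, lam_x)           # sparse update for the X weights
        u /= np.linalg.norm(u) + 1e-12
        v = soft_threshold(C.T @ u, lam_y)         # sparse update for the Y weights
        v /= np.linalg.norm(v) + 1e-12
    return u, v

# Hypothetical usage: project both modalities onto the learned weights and
# concatenate the canonical variates as the fused representation.
# X_eye:  (n_subjects, p) FS-BTree fixation behavior features
# X_face: (n_subjects, q) MDMO expression features
# u, v = sparse_cca(X_eye, X_face)
# fused = np.column_stack([X_eye @ u, X_face @ v])
```

The fused canonical variates can then be passed to a conventional classifier; the thesis further compares fusion at different temporal granularities, which this sketch does not attempt to reproduce.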
Keywords/Search Tags:Affective Computing, Depression Recognition, Facial Behavior, Eye Movements, Expressions