
Facial Expression Recognition Based On Muscle Movement

Posted on: 2013-02-04
Degree: Doctor
Type: Dissertation
Country: China
Candidate: J Z Liu
Full Text: PDF
GTID: 1228330392952430
Subject: Computer application technology
Abstract/Summary:
Affective computing is an important field that computer science will focus on in the future. Its goal is to make computers understand human feelings and emotions and to interact with people in an emotionally aware way. The simplest and most direct entry point for this research is the analysis of human facial expressions. Facial expressions are formed by facial muscle movement, so the study of expressions should begin with the specific movements of the facial muscles and concentrate on the movement itself. We therefore propose a method for studying human facial expressions based on facial muscle movement. Our work builds on Ekman's FACS (Facial Action Coding System). Psychological studies have demonstrated that human facial expressions correspond to fixed patterns of muscle movement that do not depend on age, gender, race, education, or other factors. In this dissertation, we first briefly introduce a number of Action Units (AUs) associated with the basic facial expressions. Based on these Action Units, we carried out the following research:

1. We propose an approach that recognizes facial feature points quickly and automatically. Much current research relies on manually marked facial feature points, but hand labeling injects the researcher's subjective judgment of the current expression, while existing automatic methods are too slow to meet the requirements of a real-time analysis system. Our approach first screens an ROI (region of interest) to select all candidate facial feature points with rich texture information, and then filters these candidates to retain the true feature points (a minimal sketch follows this list). Experiments show that the method achieves a high recognition rate and meets the needs of a real-time system.

2. We propose an approach that recognizes Action Units based on motion templates. Most AUs either lack obvious feature points or have feature points that are difficult to identify and track. Motion templates describe the movement itself and its history, which lets us accurately identify the movement of the AUs of interest (see the MHI sketch after this list). Using a Boosting algorithm, we trained several classifiers to identify the AUs needed for expression recognition, and these classifiers performed very well.

3. We propose a new method for identifying head movement. Traditional methods identify head movement by recognizing and tracking the eye region, but locating and tracking the eyes reliably requires heavy computation or special hardware. We instead locate and track the nostrils, which are easier to identify and track than the eyes. Our approach recognizes not only nodding and head shaking but also bowing and turning the face aside.

4. We designed a real-time system for identifying facial expressions and attempted to grade the intensity of the recognized expression "happy".
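As an illustration of the screening idea in item 1, the following is a minimal sketch, assuming OpenCV's Shi-Tomasi corner measure as a stand-in for the texture-richness criterion; the actual selection measure, filtering rules, and region boxes are not specified in the abstract and are assumptions here.

```python
# Hypothetical sketch of step 1: screen a face ROI for texture-rich candidate
# points, then filter them down to plausible facial feature points.
import cv2
import numpy as np

def candidate_feature_points(gray_face_roi, max_candidates=200):
    """Return (x, y) candidates with strong local texture inside the ROI."""
    corners = cv2.goodFeaturesToTrack(
        gray_face_roi,
        maxCorners=max_candidates,
        qualityLevel=0.01,   # keep weak corners too; filtering comes later
        minDistance=5,       # avoid clusters of near-duplicate points
        blockSize=7,
    )
    return [] if corners is None else corners.reshape(-1, 2)

def filter_by_region(points, regions):
    """Keep only candidates inside expected regions (eyes, brows, mouth).
    `regions` is a list of (x, y, w, h) boxes; a simple stand-in filter."""
    kept = []
    for x, y in points:
        if any(rx <= x <= rx + rw and ry <= y <= ry + rh
               for rx, ry, rw, rh in regions):
            kept.append((float(x), float(y)))
    return kept
```

In a real-time loop, such candidates would typically be tracked across frames (for example with optical flow) rather than re-detected in every frame.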
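The motion-template idea in item 2 rests on the motion history image (MHI), in which each pixel records how recently motion occurred there. Below is a minimal NumPy sketch of the standard MHI update; the frame-differencing threshold and decay duration are illustrative values, not the dissertation's settings.

```python
import numpy as np

def update_mhi(mhi, prev_gray, curr_gray, timestamp, duration=1.0, diff_thresh=30):
    """Update a motion history image (float array) in place.

    Pixels that moved (frame difference above threshold) are stamped with the
    current time; unmoving pixels are cleared once older than `duration`.
    AU classifiers (e.g. boosted ones, as in the abstract) can then be trained
    on features extracted from the MHI.
    """
    diff = np.abs(curr_gray.astype(np.int16) - prev_gray.astype(np.int16))
    silhouette = diff > diff_thresh
    mhi[silhouette] = timestamp                           # moving pixels get the newest stamp
    mhi[~silhouette & (mhi < timestamp - duration)] = 0   # stale history fades out
    return mhi
```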
In this dissertation we developed an effective way to identify facial muscle movements accurately. The identified movements are fed into a BP (backpropagation) neural network to recognize expressions, and the network's output classifies expressions accurately. We also adopt fuzzy theory and measure the degree of a recognized "happy" expression by analyzing the MHI (motion history image), achieving satisfactory results.
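A minimal sketch of this final stage follows, assuming a vector of detected AU activations as input. Scikit-learn's MLPClassifier stands in for the BP network, and the "happy" intensity is a simple ramp membership over MHI activity in the mouth region; the feature layout, region box, and membership breakpoints are illustrative assumptions, not the dissertation's exact design.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_expression_net(au_vectors, labels):
    """Train a backpropagation-based multilayer perceptron.

    au_vectors: one row of AU activations per sample (illustrative layout);
    labels: expression names such as "happy", "sad", ...
    """
    net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    net.fit(au_vectors, labels)
    return net

def happy_degree(mhi, mouth_box, low=0.05, high=0.4):
    """Fuzzy membership for the intensity of a recognized 'happy' expression.

    Uses the fraction of recently moving pixels in the mouth region of the MHI;
    the [low, high] breakpoints are placeholder values.
    """
    x, y, w, h = mouth_box
    activity = np.count_nonzero(mhi[y:y + h, x:x + w]) / float(w * h)
    return float(np.clip((activity - low) / (high - low), 0.0, 1.0))
```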
Keywords/Search Tags: affective computing, facial expression, head movement, feature point, motion template, artificial neural networks, fuzzy theory