
Affect measurement by man and machine

Posted on: 2010-05-13
Degree: Ph.D.
Type: Dissertation
University: Memphis State University
Candidate: D'Mello, Sidney K.
Subject: Psychology
Abstract/Summary:
The field of affective computing aspires to narrow the communicative gap between the highly expressive human and the socially challenged computer by developing computational systems that recognize and respond to the affective states of the user. This dissertation addresses one of the fundamental goals of affective computing by developing computational mechanisms that automatically detect users' affective states (or emotions) during naturalistic interactions with computer interfaces.

This dissertation describes systems that discriminate among affective states (e.g., boredom, flow/engagement, confusion, frustration, delight, surprise, and neutral) by monitoring conversational cues, gross body language, and facial features. Training and validation data for the affect detectors were obtained from a study in which 28 learners completed a tutorial session with an intelligent tutoring system, after which their affective states were judged by the learners themselves, by untrained peers, and by two trained judges.

Affect detectors were developed that automatically classified learners' affective states in real time by monitoring lexical and semantic features of the tutorial dialogue. For situations in which the affect categories are ambiguous, a feature selection algorithm was developed to select the features that were most diagnostic of the affective states and that were expected to generalize beyond individual differences.

Two algorithms to detect affect from gross body language were developed. The first tracked the average pressure exerted, along with the magnitude and direction of changes in pressure, during emotional experiences (see the sketch below). The second monitored the spatial and temporal properties of naturally occurring pockets of pressure. A hierarchical classification algorithm motivated by the pandemonium model was also developed and evaluated.

Feature-level and decision-level multimodal affect detectors that combined conversational cues, gross body language, and facial features were developed and evaluated.
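As a rough illustration of the first body-language algorithm described above, the minimal Python sketch below computes the three quantities the text names: the average pressure exerted and the magnitude and direction of frame-to-frame pressure changes. The function name, the (T, H, W) pressure-map layout, and the choice of summary statistics are assumptions made for illustration, not the dissertation's actual implementation.

```python
import numpy as np

def pressure_features(frames):
    """Summary features from a sequence of seat-pressure maps.

    frames: array of shape (T, H, W), one pressure map per time step.
    Hypothetical sketch of the first body-language algorithm: average
    exerted pressure plus magnitude and direction of pressure changes.
    """
    total = frames.reshape(len(frames), -1).sum(axis=1)  # net pressure per frame
    deltas = np.diff(total)                              # frame-to-frame changes
    return {
        "mean_pressure": float(total.mean()),
        "change_magnitude": float(np.abs(deltas).mean()),
        # Average sign of the changes: positive suggests mounting pressure
        # (e.g., leaning into the seat), negative suggests easing off.
        "change_direction": float(np.sign(deltas).mean()),
    }
```

A detector in this spirit would compute such features over each emotional episode and hand them to a standard classifier.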
A naive additive algorithm was considered for feature-level fusion, while a spreading activation network architecture with differential weighting was used to model decision-level fusion (both schemes are sketched below).

The results on human judgments of emotions indicated that (a) boredom, flow/engagement, confusion, and frustration were the major affective states that learners experienced, (b) the affect detection accuracies of the human judges were low, and (c) trained judges provided more accurate judgments than untrained peers.

Machine learning analyses that evaluated each sensory channel independently revealed that (a) monitoring gross body language was most effective for detecting boredom (74%) and flow/engagement (83%), (b) confusion (76%) and delight (90%) were best detected by monitoring facial features, (c) frustration was best detected by examining dialogue features in the tutoring context (78%), and (d) detection accuracies reached 80% when particular emotions were aligned with their optimal sensors.

Classification results on combinations of sensory channels indicated that (a) the face was the most diagnostic channel for voluntary affect judgments, while conversational cues were superior for mandatory judgments, (b) combining channels yielded superadditive effects for some states, but additive, redundant, and inhibitory effects for others, (c) multichannel models reduced the discrepancies among the single-channel models, and (d) decision-level fusion yielded accuracy scores equivalent to those of feature-level fusion.

This dissertation concludes that conversational cues and gross body language are serious alternatives to existing affect detection methods that focus on facial features and acoustic-prosodic cues. Limitations, solutions, improvements, and extensions of this research are discussed. Finally, a case study describing an application of the affect detectors developed in this research is presented.
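To make the two fusion schemes concrete, here is a hedged Python sketch. Feature-level fusion simply pools the per-channel feature vectors into one vector for a single classifier; for decision-level fusion, a differentially weighted average of per-channel class posteriors stands in for the spreading activation network, whose internals the abstract does not specify. All function names, weights, and probabilities are illustrative assumptions.

```python
import numpy as np

def feature_level_fusion(dialogue, body, face):
    """Feature-level fusion: pool per-channel feature vectors into one
    vector that a single downstream classifier consumes."""
    return np.concatenate([dialogue, body, face])

def decision_level_fusion(channel_probs, weights):
    """Decision-level fusion: combine each channel's class posteriors with
    differential weights (a weighted average standing in for the
    spreading-activation network described in the text)."""
    fused = np.average(channel_probs, axis=0, weights=weights)
    return fused / fused.sum()  # guard against rounding drift

# Illustrative example: three channels voting over four affective states
# (boredom, flow/engagement, confusion, frustration).
channel_probs = np.array([
    [0.60, 0.20, 0.10, 0.10],  # dialogue features
    [0.30, 0.40, 0.20, 0.10],  # gross body language
    [0.50, 0.30, 0.10, 0.10],  # facial features
])
weights = [0.5, 0.2, 0.3]      # hypothetical differential channel weights
print(decision_level_fusion(channel_probs, weights))
```

Differential weighting lets more diagnostic channels (per the single-channel results above) dominate the fused decision for the states they detect best.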
Keywords/Search Tags: affect, gross body language, facial features, conversational cues