
Realistic interaction with social robots via facial expressions and neck-eye coordination

Posted on: 2016-06-23
Degree: M.S.
Type: Thesis
University: The University of Texas at Arlington
Candidate: Das, Sumit Kumar
Full Text: PDF
GTID: 2478390017977306
Subject: Robotics
Abstract/Summary:
In this thesis, we report on research concerning various aspects of interaction between an android head and its environment. The research aims to build a reusable framework for interaction through facial expressions and vision that can be applied to a wide range of humanoid robotic heads. The framework contains specially programmed modules that enable the android head to interact effectively with its environment via facial expressions and coordinated neck-eye motion.

The object detection and tracking module enables the android head to detect and track objects and users through coordinated motion of the neck and eyes. The module can use multiple cameras to track a target, which widens the coverage of the tracking area. We present the work conducted in implementing and evaluating several controller algorithms that imitate human-like tracking in an android head.

The facial expression learning and imitation module comprises different methods for generating facial expressions on the android head. This module uses batch training of neural networks and can automatically recalibrate the android's facial expressions, otherwise a very tedious and time-consuming manual process. This helps in carrying out realistic conversations with users. During conversation, lip-syncing capability in an android head is of utmost importance for making it human-like. In the thesis, we also discuss the necessity of a proper lip-sync module and how the "McGurk effect" is to be avoided during conversation.
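The coordinated neck-eye tracking described above can be illustrated with a minimal proportional-control sketch: the eyes correct the pixel error quickly within a limited mechanical range, while the neck slowly follows the eye deflection so the eyes re-center, which is one common way to approximate human-like gaze behavior. All names, gains, and limits below are illustrative assumptions, not values or code from the thesis.

```python
import numpy as np

# Hypothetical gains and limits (not from the thesis).
EYE_GAIN = 0.4      # fast proportional gain for the eye joints
NECK_GAIN = 0.1     # slow proportional gain for the neck joints
EYE_LIMIT = 15.0    # maximum eye deflection, in degrees

def tracking_step(target_px, center_px, eye_deg, neck_deg, px_per_deg=20.0):
    """One control cycle: return updated (eye_deg, neck_deg) joint angles.

    target_px / center_px are (x, y) pixel coordinates of the tracked
    target and the image center; px_per_deg converts pixel error into
    an angular error for the joints.
    """
    error_deg = (np.asarray(target_px, dtype=float)
                 - np.asarray(center_px, dtype=float)) / px_per_deg
    # Eyes take the fast correction, clipped to their mechanical range.
    eye_deg = np.clip(eye_deg + EYE_GAIN * error_deg, -EYE_LIMIT, EYE_LIMIT)
    # Neck drifts toward the eye deflection so the eyes can re-center.
    neck_deg = neck_deg + NECK_GAIN * eye_deg
    return eye_deg, neck_deg
```

Iterating this step per camera frame moves both joint sets toward the target; with multiple cameras, the same loop would simply be fed the pixel error from whichever camera currently sees the target.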
Keywords/Search Tags: Facial expressions, Android head, Interaction, Module