
Human-Centered Natural Human-Computer Interaction Based On Machine Learning

Posted on: 2014-10-29    Degree: Master    Type: Thesis
Country: China    Candidate: F J Zeng    Full Text: PDF
GTID: 2308330482451983    Subject: Computer application technology
Abstract/Summary:
Traditional HCI is machine-centered: people have to learn and adapt to a series of rules designed for machines. To free users from these rules, natural HCI has emerged; it is human-centered and requires machines to understand and simulate human behavior. Machine learning is a solid choice for achieving this goal. In order of occurrence, the HCI process consists of three important stages: data input and output, manipulation of computers by humans, and feedback on that manipulation by computers. For each of these stages, this thesis proposes a natural HCI approach based on machine learning, as described below.

First, traditional information input and output is accomplished through interfaces or strictly predefined actions, which are burdensome, hard to master, and require a certain degree of skill from users. This thesis therefore designs a natural data exchange approach, Sticky. Online semi-supervised learning allows Sticky to learn user preferences on the fly and to distinguish data-transmission trigger actions from the various other actions users perform with machines in daily life. Experiments show that Sticky achieves high trigger-action recognition accuracy.

Second, existing contact-based manipulation demands considerable human effort and has limitations in application scenarios such as large screens. RemoteControl, a non-contact manipulation approach, is therefore proposed: with two gestures, "grab" and "put", people can manipulate a computer's GUI in three-dimensional space. SVM and random forest (RF) classifiers are used to recognize hand shapes from hand images. Experimental results show that RemoteControl achieves high recognition accuracy for manipulation actions.

Finally, because machines cannot identify who is using them, existing manipulation feedback cannot be adapted to the preferences of different users, and existing identification methods cannot identify users as early as possible from the information at hand. A progressive user identification approach, SkeFace, is therefore proposed. It adopts two non-intrusive biometric features, skeleton and face: whenever a new feature becomes available, a classification is carried out and the confidence values of the different classifiers are compared. Experiments show that SkeFace makes correct user identifications as early as possible.
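To make the RemoteControl stage more concrete, below is a minimal sketch (not the thesis implementation) of training SVM and random forest classifiers to distinguish "grab" from "put" hand shapes, assuming hand images have already been segmented and reduced to fixed-length feature vectors. The scikit-learn calls, random placeholder data, and parameter values are illustrative assumptions.

    # Illustrative sketch: SVM and RF hand-shape classification ("grab" vs. "put").
    # The dataset here is random placeholder data; real features would come from
    # segmented hand images (e.g., normalized pixels or shape descriptors).
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 64))          # 200 samples, 64-dim feature vectors
    y = rng.integers(0, 2, size=200)        # label 0 = "grab", 1 = "put"

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0)

    svm = SVC(kernel="rbf", C=1.0, probability=True).fit(X_train, y_train)
    rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

    print("SVM accuracy:", accuracy_score(y_test, svm.predict(X_test)))
    print("RF  accuracy:", accuracy_score(y_test, rf.predict(X_test)))

Similarly, the progressive identification idea behind SkeFace, comparing classifier confidences as skeleton and face features arrive, can be sketched as follows; the confidence threshold and the per-modality classifiers are assumptions introduced for illustration, not the thesis's exact decision rule.

    def progressive_identify(feature_stream, classifiers, threshold=0.9):
        """feature_stream: (modality, feature_vector) pairs in arrival order,
        e.g. a "skeleton" feature before a "face" feature.
        classifiers: dict mapping modality -> fitted classifier with predict_proba."""
        best_user, best_conf = None, 0.0
        for modality, feature in feature_stream:
            proba = classifiers[modality].predict_proba([feature])[0]
            user, conf = int(proba.argmax()), float(proba.max())
            # keep the most confident prediction seen so far
            if conf > best_conf:
                best_user, best_conf = user, conf
            # identify as early as possible once confidence is high enough
            if best_conf >= threshold:
                break
        return best_user, best_conf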
Keywords/Search Tags: natural human-computer interaction, machine learning, information input and output, non-contact manipulation, personal identification