
Integrating human-computer interaction with planning for a telerobotic system

Posted on: 1998-08-15    Degree: Ph.D    Type: Dissertation
University: University of Delaware    Candidate: Kazi, Zunaid Hamid    Full Text: PDF
GTID: 1468390014477667    Subject: Computer Science
Abstract/Summary:
Many real-world unstructured tasks, such as manipulation of hazardous materials, space and under-sea exploration, and assistive technology for people with disabilities, require human control of telerobots. However, existing interface strategies for telemanipulation have generally been ineffective in circumstances where direct physical control is limited by time delay, lack of sensation, or poor coordination. This dissertation addresses the need for a new telemanipulation technique that reduces the demand for highly coordinated physical control without calling upon capabilities for autonomous operation that exceed the current state of the art in artificial intelligence, machine vision, and robotics.

This dissertation demonstrates that by combining the current state of the art in natural language processing, robotics, computer vision, planning, machine learning, and human-computer interaction, it is possible to build a practical telemanipulative robot without having to solve the major open problems in each of these fields. Difficulties involving full text understanding, autonomous robot-arm control, real-time object recognition in unconstrained environments, planning for all contingencies and levels of problem difficulty, fast supervised and unsupervised learning, and intelligent human-computer interfaces illustrate only some of the open issues. Current solutions to each of these problems, when combined with one another and with the intelligence of the user, can compensate for the inadequacies each solution has individually.

We claim that the symbiosis of the high-level cognitive abilities of the human, such as object recognition, high-level planning, and event-driven reactivity, with the native skills of a robot can yield a human-robot system that functions better than both traditional robotic assistive systems and current autonomous systems. We describe a system that exploits the low-level machine perceptual and motor skills and the excellent AI planning tools that are currently achievable, while allowing the user to concentrate on the problems for which humans are best suited, namely high-level problem solving, object recognition, error handling, and error recovery. In doing so, the cognitive load on the user is decreased, the system becomes more flexible and less fatiguing to operate, and it is ultimately a more effective assistant.

A general-purpose telerobotic manipulation aid for people with disabilities was chosen as a test-bed for the core ideas of this dissertation. This domain is one in which the physical control available to the potential user is less than optimal, and the environment involves known tasks and objects used in an inherently unstructured manner. A system, MUSIIC (Multimodal User Supervised Interface and Intelligent Control), was designed and built that uses knowledge-driven planning integrated with a multimodal (speech and gesture) human-machine interface to operate an assistive robot.
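As a rough illustration of how a multimodal interface of this kind might feed a knowledge-driven planner, the sketch below fuses a parsed spoken command with a gesture-selected object into a symbolic pick-up plan. It is a minimal sketch under assumed names: SpeechCommand, Gesture, ObjectModel, resolve_referent, and plan_pick_up are hypothetical and chosen only for illustration, not taken from the MUSIIC implementation.

```python
# Hypothetical sketch: fusing speech and gesture into a planner goal.
# None of these class or function names come from MUSIIC itself.
from dataclasses import dataclass

@dataclass
class ObjectModel:
    name: str          # e.g. "cup"
    position: tuple    # (x, y, z) in the robot's workspace frame

@dataclass
class SpeechCommand:
    action: str        # e.g. "pick up", parsed from the spoken utterance
    object_word: str   # e.g. "that cup"

@dataclass
class Gesture:
    pointed_at: tuple  # (x, y, z) location indicated by the user

def resolve_referent(command: SpeechCommand, gesture: Gesture,
                     known_objects: list[ObjectModel]) -> ObjectModel:
    """Pick the known object nearest the gestured location whose name matches
    the spoken noun phrase; the user, not the machine, resolves hard ambiguity."""
    candidates = [o for o in known_objects if o.name in command.object_word]
    if not candidates:
        candidates = known_objects  # fall back to spatial proximity alone
    return min(candidates,
               key=lambda o: sum((a - b) ** 2
                                 for a, b in zip(o.position, gesture.pointed_at)))

def plan_pick_up(target: ObjectModel) -> list[str]:
    """A toy stand-in for the knowledge-driven planner: emit primitive robot steps."""
    x, y, z = target.position
    return [
        f"move_arm_above({x}, {y}, {z})",
        "open_gripper()",
        f"lower_to({z})",
        "close_gripper()",
        "lift()",
    ]

if __name__ == "__main__":
    objects = [ObjectModel("cup", (0.4, 0.1, 0.05)),
               ObjectModel("book", (0.2, 0.3, 0.02))]
    speech = SpeechCommand(action="pick up", object_word="that cup")
    point = Gesture(pointed_at=(0.41, 0.12, 0.0))
    target = resolve_referent(speech, point, objects)
    for step in plan_pick_up(target):
        print(step)
```

The division of labor in this toy example mirrors the one described above: the human supplies object recognition and deictic reference through speech and gesture, while the planner handles the routine sequencing of low-level motor steps.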
Keywords/Search Tags: Planning, Robot, System, Assistive, Human-computer