
Research On Multimodal 3D Interaction And Its Application In The System Of City Planning

Posted on: 2010-03-06
Degree: Master
Type: Thesis
Country: China
Candidate: Y Y He
Full Text: PDF
GTID: 2178360272994064
Subject: Computer application technology
Abstract/Summary:
As a new computing paradigm, Virtual Reality (VR) is finding increasingly wide application in fields such as economics, society, and military affairs. Because VR emphasizes real-time interaction, natural and efficient modes of human-computer interaction are among the crucial topics of VR research. However, problems with interaction in virtual environments (VEs), such as unnaturalness, limited means of interface control, and the gap between user actions and system feedback, have become a "bottleneck" restricting the further application of VR. The key to solving this problem is to fully exploit multiple perceptual modalities and the parallelism of control behaviors, expand I/O bandwidth, and improve the naturalness and flexibility of interaction; that is, to rely on multimodal interaction (MMI). Introducing MMI into 3D interaction and studying MMI information integration and display technologies in VEs has gradually become a focus of interaction-technology research.

City planning is one of the areas most urgently in need of VR, but current practice suffers from high modification costs and limited interactivity. In this thesis, after a systematic introduction to 3D interaction, multimodal 3D interaction is discussed and researched in depth, particularly its advantages, supporting technologies, and design methods. With reference to existing task-oriented integration approaches, a hierarchical multimodal integration method based on semantics, taking probability and time constraints into account, is put forward. Against the background of a city-planning application, everyday pen-and-paper operations are introduced into the VE, and a Personal Interaction Panel (PIP) based on multiple metaphors is constructed by combining information from spatial position trackers, voice, pen-based sketches, and gestures.
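The thesis does not give the integration algorithm in the abstract, but the idea of semantic fusion under probability and time constraints can be sketched roughly as follows: events from different modalities carry a recognized meaning, a confidence, and a timestamp; events falling within a time window are grouped, and their confidences are combined into a joint score (all names, the window length, and the independence assumption here are illustrative, not the author's actual design):

```python
from dataclasses import dataclass

@dataclass
class ModalEvent:
    modality: str      # e.g. "voice", "pen", "tracker"
    semantics: str     # hypothesized meaning, e.g. "select", "point"
    confidence: float  # recognition probability in [0, 1]
    timestamp: float   # seconds

def fuse(events, window=1.5):
    """Group events whose timestamps lie within `window` seconds of the
    earliest event, then combine their confidences (independence assumed).
    Returns (joint semantics tuple, joint probability) or None."""
    if not events:
        return None
    events = sorted(events, key=lambda e: e.timestamp)
    t0 = events[0].timestamp
    group = [e for e in events if e.timestamp - t0 <= window]
    score = 1.0
    for e in group:
        score *= e.confidence          # naive joint probability
    return tuple(e.semantics for e in group), score
```

For example, a voice command "select" at t=0.0 and a pen point at t=0.5 would fuse into one interpretation, while a tracker event at t=5.0 would fall outside the window and be left for the next fusion cycle.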
By fully analyzing multiple factors, such as the precise positioning ability of trackers, the descriptive power of voice, and the naturalness of pen-based sketches and gestures, a variety of common and application-oriented interaction techniques are designed and implemented, including multi-dimensional navigation, ray-casting selection and Aperture-Based Selection (ABS) based on trackers and voice, multi-object dynamic grouping and shape/area arrangement (MSAA) based on 2D pen sketches and voice, world-in-miniature (WIM), and 3D widgets. Evaluation showed that these multimodal 3D interaction techniques significantly enhance the naturalness and efficiency of interaction in VEs.
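Ray-casting selection, one of the techniques listed above, shoots a ray from the tracker's position along its pointing direction and selects the nearest intersected object. A minimal sketch, assuming objects are approximated by bounding spheres (the thesis's actual geometry handling is not specified here):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere intersection,
    or None on a miss. `direction` must be a unit vector."""
    oc = [center[i] - origin[i] for i in range(3)]
    # projection of origin->center onto the ray
    t = sum(oc[i] * direction[i] for i in range(3))
    d2 = sum(c * c for c in oc) - t * t   # squared perpendicular distance
    if d2 > radius * radius:
        return None
    dt = math.sqrt(radius * radius - d2)
    hit = t - dt
    return hit if hit >= 0 else None

def pick(origin, direction, objects):
    """objects: list of (name, center, radius); return the nearest hit name."""
    best = None
    for name, center, radius in objects:
        t = ray_sphere_hit(origin, direction, center, radius)
        if t is not None and (best is None or t < best[1]):
            best = (name, t)
    return best[0] if best else None
```

Aperture-based selection (ABS) extends the same idea by widening the ray into a cone, trading pointing precision for ease of selecting small or distant objects.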
Keywords/Search Tags:Virtual Reality, Multimodal 3D interaction, Multimodal integration, PIP, City planning