
Kinect movements for navigating a virtual exercise environment by people with mobility impairments

Posted on: 2016-03-10
Degree: M.B.E
Type: Thesis
University: The University of Alabama at Birmingham
Candidate: Pool, Sean
Full Text: PDF
GTID: 2478390017477960
Subject: Biomedical engineering
Abstract/Summary:
People with physical disabilities exhibit decreased physical activity compared to able-bodied individuals. While physical activity plays an integral role in the health of any individual, people with disabilities have difficulty attaining the recommended quality and quantity of exercise. A recent technique for encouraging physical activity in able-bodied individuals is to combine exercise equipment with a virtual exercise environment: as the user exercises, a video screen portrays the user moving through a virtual recreation of a real-world location. However, little has been done to adapt this technology, specifically the interface between user and environment, for people with mobility impairments. This project aims to develop a universal interface for a virtual exercise environment, using the Microsoft Kinect to allow the user to choose between multiple routes within the virtual environment by performing body movements or speaking voice commands. Furthermore, this project aims to test these interactions with people with mobility impairments and to create population-specific interaction libraries. We conducted a trial at the Lakeshore Foundation to test the virtual environment interface and its associated Kinect interactions. We recruited fifteen individuals post-stroke and fifteen individuals with cerebral palsy (CP) to perform the fifteen interactions. These interactions were evaluated within and between subjects according to success rate, preparation time, performance time, and questionnaire feedback. The success rate analysis used generalized estimating equations with binomial logistic regression, and the movement time analyses used mixed-model testing, pooling successful interactions. All movements were viable for individuals post-stroke. Only hand extend, hand raise, and head nod were viable for individuals with CP. Voice commands were viable for the participants post-stroke but required further refinement for use by the participants with CP. Interaction libraries were developed for each population based on these results. In the future, we aim to expand the number of Kinect interactions and to develop interaction libraries for other groups of people with mobility impairments. Ultimately, this work develops an interface for interaction with a virtual exercise environment for people with mobility impairments.
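For readers unfamiliar with the analyses named above, the following is a minimal sketch of how the two models could be specified, assuming Python with statsmodels as the toolchain and long-format trial data with hypothetical column names (subject_id, group, interaction, success, performance_time); the thesis itself does not specify its software or variable names.

```python
# Hypothetical sketch of the abstract's two analyses: a GEE with a binomial
# (logistic) family for per-trial success, and a linear mixed model for
# performance time among successful trials. Column names, file name, and the
# statsmodels toolchain are assumptions, not details taken from the thesis.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Long-format data: one row per attempted interaction per participant.
# 'group' is e.g. 'stroke' or 'cp'; 'interaction' is e.g. 'hand_raise'.
df = pd.read_csv("kinect_trials.csv")  # hypothetical file

# Success rate: repeated measures within each subject handled by a GEE with
# an exchangeable working correlation and binomial logistic regression.
gee = smf.gee(
    "success ~ C(interaction) * C(group)",
    groups="subject_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()
print(gee.summary())

# Movement time: linear mixed model on successful trials only, with a random
# intercept per subject, corresponding to pooling successful interactions.
ok = df[df["success"] == 1]
lmm = smf.mixedlm(
    "performance_time ~ C(interaction) * C(group)",
    data=ok,
    groups=ok["subject_id"],
).fit()
print(lmm.summary())
```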
Keywords/Search Tags:People with mobility impairments, Virtual exercise environment, Physical activity, Individuals, Kinect, Movements, Interface, Interaction