
Human-Assisted Machine Vision for the Visually Impaired

Posted on: 2015-07-25    Degree: M.S    Type: Thesis
University: University of Maryland, Baltimore County    Candidate: Deshpande, Amol N    Full Text: PDF
GTID: 2478390017494299    Subject: Computer Science
Abstract/Summary:
Approximately 285 million people worldwide are estimated to be visually impaired. In an increasingly complex urban world, indoor and outdoor navigation has become a difficult task for visually impaired individuals, especially those who use wheelchairs or walking canes. They have limited travel choices and rely mostly on the pedestrian environment, so sidewalks and pedestrian crossings are essential to their daily travel. Despite laws that mandate standards for accessible sidewalks, sidewalk accessibility deteriorates over time. Because of these accessibility issues, traveling independently becomes difficult for visually impaired wheelchair and walking cane users, and they seek support from navigation systems.

Real-time navigation systems assist visually impaired pedestrians by notifying them of dangers in their path and guiding them around obstacles. Machine vision based navigation systems lack the a priori contextual information necessary for detecting obstacles in real time, and the use of sensors such as RADAR and LIDAR for real-time obstacle detection increases power requirements. A priori accessibility maps containing geospatial data on the locations of accessibility issues are therefore helpful for notifying visually impaired individuals in real time.

We present WheelNav, a system that uses human-assisted machine vision to develop accurate sidewalk accessibility maps for the navigation of visually impaired individuals. A group of users called Volunteers crowdsources geotagged images and other relevant information about the sidewalk accessibility issues they observe in their city through a smartphone application. A computer vision technique called perspective transformation is then used to identify the accurate positions of sidewalk accessibility issues in the crowdsourced images and build the accessibility map. This process is assisted by human workers called Turkers, who use Amazon Mechanical Turk to provide feedback on estimates of the real-world dimensions of objects in the crowdsourced images.
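A minimal sketch of the perspective-transformation step described above, using OpenCV in Python. The reference-rectangle size, pixel coordinates, and obstacle location below are hypothetical placeholders, not values from the thesis; in WheelNav the real-world dimension estimates would come from Turker feedback rather than being hard-coded.

```python
# Sketch: map an obstacle's pixel position in a crowdsourced image onto the
# ground plane via a perspective transform (homography). Assumes a ground-plane
# reference of known (Turker-estimated) size is visible in the image.
import numpy as np
import cv2

# Pixel corners of the ground-plane reference (e.g., one sidewalk panel),
# ordered top-left, top-right, bottom-right, bottom-left. Illustrative values.
image_corners = np.float32([[412, 518], [869, 521], [1010, 705], [305, 700]])

# Corresponding ground-plane coordinates in meters, assuming the panel is
# roughly 1.5 m x 1.5 m (an assumed dimension, for illustration only).
ground_corners = np.float32([[0.0, 0.0], [1.5, 0.0], [1.5, 1.5], [0.0, 1.5]])

# Homography mapping image pixels to ground-plane meters.
H = cv2.getPerspectiveTransform(image_corners, ground_corners)

# Pixel location of an accessibility issue (e.g., base of a broken curb ramp).
obstacle_px = np.float32([[[640, 655]]])

# Project onto the ground plane to estimate the obstacle's offset from the
# reference; combined with the image's geotag, this yields a map coordinate.
obstacle_m = cv2.perspectiveTransform(obstacle_px, H)
print("Estimated ground-plane offset (m):", obstacle_m.ravel())
```

In practice, the geotag of the crowdsourced image anchors these ground-plane offsets to absolute coordinates for the accessibility map.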
Keywords/Search Tags: Visually impaired, Machine vision, World, Real, Accessibility