Indoor Navigation Control System for Visually Impaired People
Related papers
A Kinect Based Indoor Navigation System for the Blind
2015
Title of thesis: A KINECT BASED INDOOR NAVIGATION SYSTEM FOR THE BLIND Alexander Belov, Guodong Fu, Janani Gururam, Francis Hackenburg, Yael Osman, John Purtilo, Nicholas Rossomando, Ryan Sawyer, Ryan Shih, Emily True, Agnes Varghese, Yolanda Zhang Thesis directed by: Professor Rama Chellappa Chair, Department of Electrical and Computer Engineering Team NAVIGATE aims to create a robust, portable navigational aid for the blind. Our prototype uses depth data from the Microsoft Kinect to perform real-time obstacle avoidance in unfamiliar indoor environments. The device augments the white cane by performing two significant functions: detecting overhanging objects and identifying stairs. Based on interviews with blind individuals, we found that a combined audio and haptic feedback system is best for communicating environmental information. Our prototype uses vibration motors to indicate the presence of an obstacle and an auditory command to alert the user to stairs ahead. Through multiple trials ...
International Journal of Computer and Electrical Engineering, 2014
This paper describes the development of a wearable navigation aid for blind and visually impaired persons to facilitate their movement in unfamiliar indoor environments. It comprises a Kinect unit, a Tablet PC, a microcontroller, IMU sensors, and vibration actuators. It minimizes reliance on audio instructions for avoiding obstacles and instead guides the blind user through gentle vibrations produced in a wearable belt and a light helmet. By differentiating obstacles from the floor, it can detect even relatively small obstacles. It can also guide the blind user to a desired destination (office/room/elevator) within an unfamiliar building with the help of printed 2-D codes, the RGB camera of the Kinect unit, a compass sensor for orienting the user towards the next direction of movement, and synthesized audio instructions. The developed navigation system has been successfully tested by both blindfolded and blind persons.
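The floor-differentiation step described above can be illustrated with a deliberately simplified sketch: compare each depth pixel against the depth the floor would have at that image row, and flag large deviations as obstacles. The `floor_model` here (a per-row expected depth, as if fitted from an obstacle-free calibration frame) and the tolerance value are assumptions for illustration; the actual system presumably fits a floor plane from the tilted Kinect depth image.

```python
import numpy as np

def detect_obstacles(depth, floor_model, tol=0.05):
    """Flag pixels whose depth deviates from the expected floor depth.

    depth       -- (H, W) array of metric depths from the sensor (metres)
    floor_model -- (H,) expected floor depth per image row, e.g. taken
                   from an obstacle-free calibration frame (assumption)
    tol         -- deviation tolerance in metres
    Returns a boolean (H, W) mask: True where an obstacle is likely.
    """
    expected = floor_model[:, None]      # broadcast the per-row expectation
    return np.abs(depth - expected) > tol

# Synthetic example: a receding floor, with a box 0.4 m proud of the
# floor covering rows 2-3, columns 1-2
floor = np.linspace(3.0, 1.0, 5)         # floor depth shrinks down the image
frame = np.tile(floor[:, None], (1, 4))
frame[2:4, 1:3] -= 0.4
mask = detect_obstacles(frame, floor)
print(mask.sum())                        # -> 4 obstacle pixels
```

Because the comparison is against the floor, not an absolute range, even small objects lying on the ground deviate from the expected depth and are detected, which matches the paper's claim.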
A Navigational Aid System for Visually Impaired Using Microsoft Kinect
2014
Numerous attempts have been made to devise systems that make life easier for visually impaired people. This research has focused on a number of issues such as path finding, obstacle detection, face recognition, and sign recognition, to name a few. The aim of this paper is to outline a system, based on the Microsoft Kinect, that provides some of these features in a unified manner. The system is built on a number of open source tools, including OpenCV, OpenKinect, Tesseract and eSpeak. Features incorporated in this aiding tool include object detection and recognition, face detection and recognition, object location determination, optical character recognition and audio feedback. A key goal of this research is to achieve a considerable degree of accuracy while remaining efficient in terms of the hardware resources required. Since the system is an aggregation of multiple components, their accuracies are measured independently from online an...
2011
We present a proof-of-concept of a mobile navigational aid that uses the Microsoft Kinect and optical marker tracking to help visually impaired people find their way inside buildings. The system is the result of a student project and is entirely based on low-cost hardware and software. It provides continuous vibrotactile feedback at the person's waist, to give an impression of the environment and to warn about obstacles. Furthermore, optical markers can be used to tag points of interest within the building to enable synthesized voice instructions for point-to-point navigation.
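A common way to drive such a vibrotactile waist belt (a hedged sketch, not the paper's actual mapping) is to split each depth frame into one vertical sector per motor and let a motor vibrate harder the closer the nearest obstacle in its sector is. Motor count and range limit below are illustrative assumptions.

```python
import numpy as np

def belt_intensities(depth, n_motors=8, max_range=4.0):
    """Map a depth frame to per-motor vibration intensities in [0, 1].

    The frame is split into n_motors vertical sectors; each motor's
    intensity grows linearly as the nearest obstacle in its sector
    approaches, saturating at 1.0 for contact and 0.0 beyond max_range.
    """
    sectors = np.array_split(depth, n_motors, axis=1)
    nearest = np.array([s.min() for s in sectors])
    return np.clip(1.0 - nearest / max_range, 0.0, 1.0)

# Example: empty scene except an obstacle 1 m away, slightly left of centre
depth = np.full((4, 8), 4.0)
depth[:, 2] = 1.0
out = belt_intensities(depth, n_motors=4)
print(out)   # only the sector containing the obstacle vibrates (0.75)
```

Keeping the mapping monotone in distance lets the wearer judge proximity from intensity alone, without audio instructions.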
In this paper, the realization of a new kind of autonomous navigation aid is presented. The prototype, called AudiNect, is mainly developed as an aid for visually impaired people, though a larger range of applications is also possible. The AudiNect prototype is based on the Kinect device for Xbox 360. On the basis of the Kinect output data, proper acoustic feedback is generated, so that useful depth information about the frontal 3-D scene can be easily acquired. To this purpose, a number of basic problems have been analyzed, in relation to visually impaired people's orientation and movement, through both actual experimentation and a careful review of the literature in the field. Quite satisfactory results have been obtained and discussed, based on tests with blindfolded sighted individuals.
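One plausible shape for depth-to-acoustic feedback of this kind (an illustrative assumption, not AudiNect's published mapping) is to encode obstacle distance as pitch and horizontal position as stereo pan. The frequency range and distance limit below are made-up parameters.

```python
def depth_to_tone(distance, pan, f_min=220.0, f_max=880.0, max_range=4.0):
    """Map an obstacle's distance and horizontal position to an audio cue.

    Pitch rises from f_min (at max_range) to f_max (at contact), and the
    left/right channel amplitudes encode direction via pan in [-1, 1]
    (-1 = hard left, +1 = hard right). Returns (frequency, left, right).
    """
    d = min(max(distance, 0.0), max_range)
    freq = f_max - (f_max - f_min) * d / max_range
    left = (1.0 - pan) / 2.0
    right = (1.0 + pan) / 2.0
    return freq, left, right

print(depth_to_tone(0.0, 0.0))   # obstacle dead ahead -> (880.0, 0.5, 0.5)
```

Such a mapping keeps the audio channel free of spoken words, which users can reserve for navigation instructions.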
A Survey on Assistance System for Visually Impaired People for Indoor Navigation
2018
In the past few years, a variety of wearable devices and remote processing systems have been developed to assist visually impaired people with both indoor and outdoor navigation. The main aim of such wearable devices, smartphone applications and remote processing units is to find a plausible path through the real-world environment for secure navigation. Sensing the surrounding environment with computer vision techniques, such as visual information processing based on image processing methods, is one of the challenging tasks. This paper reviews various types of assistive systems developed for visually impaired people for indoor navigation.
Blind Navigation Support System based on Microsoft Kinect
2012
This paper presents a system which extends the use of the traditional white cane by the blind for navigation purposes in indoor environments. Depth data of the scene in front of the user is acquired using the Microsoft Kinect sensor and then mapped into a pattern representation. Using neural networks, the proposed system uses this information to extract relevant features from the scene, enabling the detection of possible obstacles along the way. The results show that the neural network is able to correctly classify the type of pattern presented as input.
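The "pattern representation" step above can be sketched as a coarse occupancy grid: downsample the depth frame into cells and binarize each cell by whether anything in it is closer than a threshold. The grid size and threshold below are illustrative assumptions; the flattened vector is the kind of fixed-size input a small neural network could label (e.g. "clear", "obstacle left", "obstacle ahead").

```python
import numpy as np

def depth_to_pattern(depth, grid=(3, 3), near=1.5):
    """Reduce a depth frame to a coarse binary occupancy pattern.

    Each grid cell becomes 1 if its mean depth is closer than `near`
    metres, else 0; the flattened vector serves as classifier input.
    """
    h, w = depth.shape
    gh, gw = grid
    cells = depth[:h - h % gh, :w - w % gw].reshape(gh, h // gh, gw, w // gw)
    means = cells.mean(axis=(1, 3))
    return (means < near).astype(int).ravel()

# Example: an empty 6x6 scene with something close along the left edge
frame = np.full((6, 6), 3.0)
frame[:, :2] = 1.0
print(depth_to_pattern(frame))   # -> [1 0 0 1 0 0 1 0 0]
```

Binarized cells keep the network input small and tolerant of sensor noise, at the cost of fine spatial detail.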
An Indoor Navigation System for the Visually Impaired
Sensors, 2012
Navigation in indoor environments is highly challenging for the severely visually impaired, particularly in spaces visited for the first time. Several solutions have been proposed to deal with this challenge. Although some of them have been shown to be useful in real scenarios, they involve a significant deployment effort or use artifacts that are not natural for blind users. This paper presents an indoor navigation system that was designed with usability as the quality requirement to be maximized. This solution makes it possible to identify a person's position and to calculate the velocity and direction of their movements. Using this information, the system determines the user's trajectory, locates possible obstacles on that route, and offers navigation information to the user. The solution has been evaluated in two experimental scenarios. Although the results are not yet sufficient to support strong conclusions, they indicate that the system is suitable for guiding visually impaired people through an unknown built environment.
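The velocity-and-direction step described above reduces to elementary kinematics on successive position fixes. A minimal sketch (names and conventions are assumptions, not the paper's implementation):

```python
import math

def motion_from_fixes(p0, p1, dt):
    """Estimate speed and heading from two successive position fixes.

    p0, p1 -- (x, y) positions in metres
    dt     -- elapsed time in seconds
    Returns (speed in m/s, heading in degrees, measured
    counter-clockwise from the +x axis, in [0, 360)).
    """
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    speed = math.hypot(dx, dy) / dt
    heading = math.degrees(math.atan2(dy, dx)) % 360.0
    return speed, heading

# 3-4-5 displacement over 5 s: speed 1.0 m/s, heading ~53.1 degrees
print(motion_from_fixes((0.0, 0.0), (3.0, 4.0), 5.0))
```

Chaining such estimates over time yields the trajectory the system uses to anticipate obstacles along the route.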
Mobile Assistive Application for Blind People in Indoor Navigation
Lecture Notes in Computer Science, 2020
Navigation is an important human task that relies on the human sense of vision. In this context, recent technological developments provide technical assistance to support the visually impaired in their daily tasks and improve their quality of life. In this paper, we present a mobile assistive application called "GuiderMoi" that retrieves information about directions using color targets and identifies the next orientation for the visually impaired user. In order to avoid detection failures and the inaccurate tracking caused by the mobile camera, the proposed method, based on the CamShift algorithm, aims to provide better location and identification of color targets. Tests were conducted in natural indoor scenes. The results, which depend on the distance and the angle of view, determined the values that yield the highest rate of target recognition. This work opens perspectives such as incorporating augmented reality and intelligent navigation based on machine learning and real-time processing.
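The CamShift algorithm mentioned above iterates a mean-shift step over a color "back-projection" image (per-pixel target probability), then adapts the window size and orientation; OpenCV exposes this as cv2.CamShift. The NumPy-only sketch below shows just the mean-shift core on a synthetic back-projection, as a hedged illustration rather than the paper's implementation: the window repeatedly moves to the centroid of the probability mass it contains until it stops shifting.

```python
import numpy as np

def mean_shift(backproj, window, n_iter=20):
    """Mean-shift tracking core: move a fixed-size search window to the
    centroid of the back-projection mass inside it, until convergence.

    backproj -- (H, W) non-negative target-probability image
    window   -- (x, y, w, h) initial search window, inside the image
    """
    x, y, w, h = window
    for _ in range(n_iter):
        roi = backproj[y:y + h, x:x + w]
        total = roi.sum()
        if total == 0:
            break                      # no target mass under the window
        ys, xs = np.mgrid[0:h, 0:w]
        dx = int(round(float((xs * roi).sum() / total) - (w - 1) / 2))
        dy = int(round(float((ys * roi).sum() / total) - (h - 1) / 2))
        if dx == 0 and dy == 0:
            break                      # converged on the centroid
        x = min(max(x + dx, 0), backproj.shape[1] - w)
        y = min(max(y + dy, 0), backproj.shape[0] - h)
    return x, y, w, h

# Synthetic back-projection: a bright 4x4 "color target" at rows 10-13,
# columns 12-15; start the window nearby and let it lock on.
bp = np.zeros((20, 20))
bp[10:14, 12:16] = 1.0
print(mean_shift(bp, (8, 6, 6, 6)))    # -> (11, 9, 6, 6), centred on target
```

Full CamShift additionally resizes and rotates the window from the zeroth and second moments of the mass, which is what makes it robust to the scale changes a hand-held mobile camera produces.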
An indoor navigation system to support the visually impaired
2008
Indoor navigation technology is needed to support seamless mobility for the visually impaired. A small portable personal navigation device that provides current position, useful contextual wayfinding information about the indoor environment and directions to a destination would greatly improve access and independence for people with low vision. This paper describes the construction of such a device, which utilizes a commercial Ultra-Wideband (UWB) asset tracking system to provide real-time location and navigation information. Human trials were conducted to assess the efficacy of the system by comparing target-finding performance between blindfolded subjects using the navigation system for real-time guidance, and blindfolded subjects who only received speech information about their local surroundings but no route guidance (similar to what is available from a long cane or guide dog). A normal-vision control condition was also run. The time and distance traveled were measured in each trial, and a point-back test was performed after goal completion to assess cognitive map development. Statistically significant differences were observed between the three conditions in time and distance traveled, with the navigation system and the visual condition yielding the best results, and the navigation system dramatically outperforming the non-guided condition.