Towards a navigation system for blind people
Related papers
Assistive Technology, 2020
This paper describes the interface and testing of an indoor navigation app, ASSIST, that guides blind and visually impaired (BVI) individuals through an indoor environment with high accuracy while augmenting their understanding of the surrounding environment. ASSIST features personalized interfaces by considering the unique experiences that BVI individuals have in indoor wayfinding and offers multiple levels of multimodal feedback. After an overview of the technical approach and implementation of the first …
Journal of Visual Impairment & Blindness
Introduction: This article describes an evaluation of MagNav, a speech-based, infrastructure-free indoor navigation system. The research was conducted in the Mall of America, the largest shopping mall in the United States, to empirically investigate the impact of memory load on route-guidance performance. Method: Twelve participants who are blind and 12 age-matched sighted controls participated in the study. Route-guidance performance is compared between a system-aided condition, using updated real-time route instructions, and a system-unaided (memory-based) condition, where the same instructions were provided only in advance of route travel. The sighted controls (who navigated with typical visual perception but used the system for route guidance) represent a best-case comparison benchmark for the blind participants who used the system. Results: Results across all three test measures provide compelling behavioral evidence that blind navigators receiving real-time verba...
Accessible smartphones for blind users: A case study for a wayfinding system
While progress on assistive technologies has been made, some blind users still face problems opening and using basic functionalities when interacting with touch interfaces. People with visual impairments may also have problems navigating autonomously, without personal assistance, especially in unknown environments. This paper presents a complete solution to manage the basic functions of a smartphone and to guide users with a wayfinding application. In this way, a blind user could travel from home to work autonomously using an adaptable wayfinding application on a smartphone. The wayfinding application combines text, map, auditory, and tactile feedback to provide information. Eighteen visually impaired users tested the application. Preliminary results from this study show that blind and low-vision users can effectively use the wayfinding application without help. The evaluation also confirms the usefulness of extending the vibration feedback to convey distance information as well as directional information. The validation was successful on both iOS and Android devices.
Proceedings of the …, 1996
This paper presents a new travel aid to increase the independent mobility of blind and elderly travellers. This aid builds on the technologies of geographical information systems (GIS) and the Global Positioning System (GPS). The MoBIC Travel Aid (MoTA) consists of two interrelated components: the MoBIC Pre-journey System (MOPS) to assist users in planning journeys and the MoBIC Outdoor System (MOODS) to execute these plans by providing users with orientation and navigation assistance during journeys.
Auditory guidance with the Navbelt-a computerized travel aid for the blind
IEEE Transactions on Systems, Man and Cybernetics, Part C (Applications and Reviews), 1998
A blind traveler walking through an unfamiliar environment and a mobile robot navigating through a cluttered environment have an important feature in common: both have the kinematic ability to perform the motion, but are dependent on a sensory system to detect and avoid obstacles. This paper describes the use of a mobile robot obstacle avoidance system as a guidance device for blind and visually impaired people. Just as electronic signals are sent to a mobile robot's motor controllers, auditory signals can guide the blind traveler around obstacles, or alternatively, they can provide an "acoustic image" of the surroundings. The concept has been implemented and tested in a new travel aid for the blind, called the Navbelt. The Navbelt introduces two new concepts to electronic travel aids for the blind: it provides information not only about obstacles along the traveled path, but also assists the user in selecting the preferred travel path. In addition, the level of assistance can be automatically adjusted according to changes in the environment and the user's needs and capabilities. Experimental results obtained with the Navbelt simulator and a portable experimental prototype are presented.
Use of an Indoor Navigation System by Sighted and Blind Travelers
ACM Transactions on Accessible Computing (TACCESS), 2020
This article first reviews the pros and cons of current accessible indoor navigation systems and then describes a study using commercial smart devices to navigate routes through a complex building. Our interest was in comparing performance when using real-time narrative descriptions (system-aided condition) vs. a memory-based condition where the same narrative information was only provided to users at the route's origin. We tested two groups of blind and visually impaired (BVI) users, including people above and below 60 years of age, as well as a third sighted control group. Evaluating older BVI participants is important, as the majority of vision loss is age-related, yet navigation performance using access technology is rarely studied with this demographic. Behavioral results demonstrated that access to real-time (system-aided) information led to better navigation accuracy and greater confidence by blind users compared to the information-matched memory condition. Performance for blind participants over 60 years old was nearly identical to that of their younger peers, an important outcome supporting the efficacy of navigational technologies for this fast-growing population. Route completion accuracy and requests for assistance did not reliably differ between blind and sighted participants when using the system, suggesting that access to narrative route information led to functionally equivalent navigation behavior, irrespective of visual status. Survey results revealed strong user support for real-time information and provided important guidance for future interface refinements.

1 INTRODUCTION

Indoor navigation is often more challenging than traveling outdoors, partly because of technical limitations: GPS-based positioning is unreliable within large buildings [1].
Although various technologies have been tested for supporting indoor localization (see Reference [2] for a review), none has yet emerged as a widely used standard analogous to GPS-based outdoor navigation. Beyond technical limitations, the indoor navigation challenge is exacerbated by the nature of indoor spaces: buildings are usually multi-level 3D structures with limited naming conventions for walkable regions and no addressing schemes for specific locations, i.e., nothing like the street names and building addresses that support outdoor travel [3]. As a consequence, it is more difficult to accurately represent the building structure on a real-time navigation map or to provide turn-by-turn verbal route instructions, as is available with outdoor navigation systems. These differences frequently make indoor navigation, especially when finding routes through large buildings, more frustrating and error-prone [4-6]. To help mitigate these indoor navigation challenges, architects and building developers use a host of aids to assist indoor wayfinding, such as maps, signs, directional arrows, alphanumeric room labels, and color-coded cues for distinguishing different spatial regions. Most of these tools for self-orientation and localization are visual in nature. As a result, anybody navigating in large, complex buildings is at a particular disadvantage if they cannot visually access this key wayfinding information, as is the case for blind or visually impaired (BVI) travelers or in situations when vision is not available (e.g., when the power goes out, or in emergency-response scenarios). This is a well-known problem, and the quest for a viable solution has motivated an active research community studying technological approaches to support nonvisual indoor navigation, primarily for use by BVI travelers. Rather than focusing on technical development, our emphasis here was on investigating how use of the navigation system impacted behavioral performance.
Several human factors and user interface (UI) parameters were addressed in the study, including perception- vs. memory-based information access, participant age, visual status, and collaborative navigation techniques. Results from route navigation performance (quantitative evaluation) and system usability evaluations (qualitative feedback) demonstrated that people perform best when they have access to real-time (perceptual) guidance from a navigation system, and that this benefit is similarly manifested for both older and younger BVI participants and between sighted and blind users. In the remainder of this article, we (1) provide a background of accessible indoor navigation systems, (2) give an overview of the system we used to support the study, (3) discuss the relevance of our variables of interest with respect to probing how (and for whom) accessible navigation systems are generally used, (4) describe an in situ study carried out in a large university building using our system, and (5) couch the findings in terms of how they relate to existing research with navigation systems and …
International Association for Development of the Information Society, 2018
Smartphone virtual reality applications for blind people were used to make advance virtual visits to unknown spaces. These applications need to include a set of cognitive and sensitive interfaces that let users draw on their other sensory capabilities to understand information about their environment and that facilitate interaction with the application, so that users can build a mental representation of the unknown space. Strategies were designed to provide continuous and clear information to users so that they could perform exploration activities within a virtual environment generated from a real environment. Nineteen blind people and five visually impaired people participated in the development and tests, carried out in six workshops over twenty-four months. During each workshop, logs were taken of the activities users performed to recognize and locate objects and structures indoors. This information was stored in a database to be analyzed and interpreted in order to make subsequent modifications to the application, until achieving a tool that was sufficiently useful, safe, and accepted by users. The final applications were built with voice patterns, beeps, vibrations, and gestures (called sensitive interfaces), and with a cognitive interface called "Focus of attention" based on proximity and remote exploration. There was a thirty-eight percent improvement when participants chose remote exploration of the virtual environment over proximity exploration. Participants also preferred low- to medium-frequency beeps, fast voice playback for receiving information on objects and structures, and simple gestures for interacting with the smartphone.
In the final experiment, we used a structure sensor coupled to the smartphone for user tracking, and bone-conduction headphones to reproduce spatial sounds. Participants reported that hearing 3D sounds with a personalized response through the bone-conduction earphones was pleasing for locating objects inside the test scenario, and there was a twenty-one percent improvement when participants used beeps instead of voice or musical instruments.
BlindeDroid: An Information Tracking System for Real-time Guiding of Blind People
Procedia Computer Science, 2015
Among the activities affected by visual impairment, navigation plays a fundamental role, since it enables a person to move independently and safely. The heterogeneous environment, easily perceived by sighted people, is hardly accessible to partially sighted people. A challenging task for these people is independent navigation in new spaces, buildings, and environments. The environment is usually signaled and labeled with visual marks and signs that are not appropriate for blind persons. With the purpose of balancing access to services and spaces among all persons, this work proposes an innovative navigation and information system to help blind people navigate within new environments (e.g., a shopping center or public office building). Based on smartphones and wireless sensors deployed in the environment, we propose an information tracking system for real-time guiding of blind people (BlindeDroid). It offers guided navigation, answers questions, and provides objective information about the places, products, and services available around the user.