VES: A Mixed-Reality Development Platform of Navigation Systems for Blind and Visually Impaired

VES: A Mixed-Reality System to Assist Multisensory Spatial Perception and Cognition for Blind and Visually Impaired People

Applied Sciences, 2020

In this paper, the Virtually Enhanced Senses (VES) System is described. It is an ARCore-based, mixed-reality system designed to assist the navigation of blind and visually impaired people. VES operates in indoor and outdoor environments without any previous in-situ installation. It provides users with specific, runtime-configurable stimuli according to their pose, i.e., position and orientation, and the information about the environment recorded in a virtual replica. It implements three output data modalities: wall-tracking assistance, an acoustic compass, and a novel sensory substitution algorithm, Geometry-based Virtual Acoustic Space (GbVAS). The multimodal output of this algorithm takes advantage of the natural human perceptual encoding of spatial data. Preliminary experiments with GbVAS have been conducted with sixteen subjects in three different scenarios, demonstrating basic orientation and mobility skills after six minutes of training.
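The acoustic-compass modality described above can be illustrated with a minimal sketch (the function name and the sine mapping are assumptions for illustration; the paper does not publish its implementation): the signed heading error between the user's yaw and a target bearing is mapped to a stereo pan value.

```python
import math

def acoustic_compass_pan(user_yaw_deg: float, target_bearing_deg: float = 0.0) -> float:
    """Map the heading error between the user's yaw and a target bearing
    (default: north) to a stereo pan value in [-1.0, 1.0].
    -1.0 = fully left, 0.0 = centered, 1.0 = fully right.
    """
    # Signed heading error, wrapped to (-180, 180].
    error = (target_bearing_deg - user_yaw_deg + 180.0) % 360.0 - 180.0
    # A sine mapping keeps the cue smooth and saturates at +/-90 degrees.
    return max(-1.0, min(1.0, math.sin(math.radians(error))))

# Facing north: cue is centered.
print(acoustic_compass_pan(0.0))   # → 0.0
# Facing east (north is to the user's left): cue pans fully left.
print(acoustic_compass_pan(90.0))  # → -1.0
```

The same pose-to-stimulus lookup generalizes to the other VES modalities: the user's pose indexes into the virtual replica, and the selected cue is rendered on the fly.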

NAV-VIR: an audio-tactile virtual environment to assist visually impaired people*

2019 9th International IEEE/EMBS Conference on Neural Engineering (NER), 2019

This paper introduces the NAV-VIR system, a multimodal virtual environment to assist visually impaired people in virtually discovering and exploring unknown areas from the safety of their home. The originality of NAV-VIR resides in (1) an optimized representation of the surrounding topography, the spatial gist, based on human spatial cognition models and the sensorimotor supplementation framework, and (2) a multimodal orientation-aware immersive virtual environment relying on two synergetic interfaces: an interactive force feedback tablet, the F2T, and an immersive HRTF-based 3D audio simulation built on binaural recordings of real environments. This paper presents NAV-VIR's functionalities and its preliminary evaluation through a simple shape and movement perception task.
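HRTF-based binaural rendering, as used by NAV-VIR, builds on interaural cues. A back-of-the-envelope version of one such cue, the interaural time difference (ITD), is the classic Woodworth spherical-head approximation (standard acoustics, not NAV-VIR's actual code; the head radius is an assumed average):

```python
import math

HEAD_RADIUS_M = 0.0875   # assumed average human head radius
SPEED_OF_SOUND = 343.0   # m/s in air at ~20 degrees C

def itd_woodworth(azimuth_deg: float) -> float:
    """Woodworth approximation of the interaural time difference (seconds)
    for a source at the given azimuth (0 = straight ahead, 90 = full right)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source at 90 degrees azimuth yields an ITD of roughly 0.66 ms.
print(round(itd_woodworth(90.0) * 1000, 2))  # → 0.66
```

Full HRTFs additionally encode level and spectral differences, which is why systems like NAV-VIR use measured transfer functions rather than this closed-form cue alone.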

Sensory Navigation Device for Blind People

Journal of Navigation, 2013

This paper presents a new Electronic Travel Aid (ETA), the ‘Acoustic Prototype’, which is especially suited to facilitate the navigation of visually impaired users. The device consists of a set of 3-Dimensional Complementary Metal Oxide Semiconductor (3-D CMOS) image sensors, based on three-dimensional integration and Complementary Metal-Oxide Semiconductor (CMOS) processing techniques, implemented into a pair of glasses; stereo headphones; and a Field-Programmable Gate Array (FPGA) used as the processing unit. The device is intended to be used as a complementary aid for navigation through both known and unknown open environments. The FPGA and the 3D-CMOS image sensor electronics control object detection. Distance measurement is achieved using chip-integrated technology based on the Multiple Short Time Integration method. The processed object-distance information is presented to the user via acoustic sounds through stereophonic headphones. The user interprets the information…

AudiNect: an aid for the autonomous navigation of visually impaired people, based on virtual interface

In this paper, the realization of a new kind of autonomous navigation aid is presented. The prototype, called AudiNect, is mainly developed as an aid for visually impaired people, though a larger range of applications is also possible. The AudiNect prototype is based on the Kinect device for Xbox 360. On the basis of the Kinect output data, proper acoustic feedback is generated, so that useful depth information about the 3D frontal scene can be easily derived and acquired. To this purpose, a number of basic problems have been analyzed in relation to visually impaired people's orientation and movement, through both actual experimentation and a careful review of the literature in the field. Quite satisfactory results have been obtained and discussed, on the basis of tests with blindfolded sighted individuals.

Enabling Independent Navigation for Visually Impaired People through a Wearable Vision-Based Feedback System

This work introduces a wearable system to provide situational awareness for blind and visually impaired people. The system includes a camera, an embedded computer and a haptic device to provide feedback when an obstacle is detected. The system uses techniques from computer vision and motion planning to (1) identify walkable space; (2) plan, step by step, a safe motion trajectory in that space; and (3) recognize and locate certain types of objects, for example the location of an empty chair. These descriptions are communicated to the person wearing the device through vibrations. We present results from user studies with low- and high-level tasks, including walking through a maze without collisions, locating a chair, and walking through a crowded environment while avoiding people.

Preliminary work for vocal and haptic navigation software for blind sailors

International Journal on Disability and Human Development, 2006

This study aims at the conception of haptic and vocal navigation software that permits blind sailors to create and simulate ship itineraries. This question raises the problem of the haptic strategies used by blind people to build their spatial representation when using maps. According to current theories, people without vision are able to construct cognitive maps of their environment, but the lack of sight tends to lead them to build egocentric and sequential mental pictures of space. Nevertheless, exocentric and unified representations are more efficient. Can blind people be helped to construct more effective spatial pictures? Previous work has shown that strategies are the most important factor in spatial performance in large-scale space (Tellevik, 1992; Hill et al., 1993; Thinus-Blanc et al., 1997). In order to encode space in an efficient way, we had our subject use the cardinal-points reference in small-scale space. During our case study, a compass establishes a frame of external cues. In this respect, we support the assumption that training based on a systematic exocentric reference helps blind subjects to build a unified representation of space. At the same time, this training has led the blind sailor to change his haptic strategies for exploring tactile maps and to perform better, which seems to modify his processing of spatial representation. Eventually, we would like to study the transfer between map representation and environment mobility. Our final point concerns using a strategy based on cardinal points together with haptic virtual reality technologies to help blind people improve their spatial cognition.

A Haptic-Audio Simulator Indoor Navigation: To Assist Visually Impaired Environment Exploration

International Journal of Information and Education Technology, 2016


Towards a navigation system for blind people

ACM SIGACCESS Accessibility and Computing, 2012

In this paper, we present an initial study towards an indoor navigation system for blind people. As the system itself is still at an early stage of development, we conducted a Wizard of Oz study using a generic Wizard of Oz system designed for mobile and ubiquitous studies. The goal of the study was to validate a set of audio-based navigation commands in a field study context. Further, we wanted to identify usability issues of the Wizard of Oz tool and ensure the appropriateness of the chosen study setup. Therefore, we used eight human wizards as participants in the study. Their task was to guide two blindfolded actors through a predefined route. Such a setting helped us achieve high ecological validity of the results compared to laboratory testing. We found that the developed study setup is fully mobile and can be used in any mobile context, and that the voice commands chosen for navigation are almost complete and can be used with slight modifications in the follow-up study. Additionally, we identified several usability flaws of the Wizard of Oz tool. After implementing the findings, the tool and the study setup are ready for a follow-up study with blind persons in order to validate the selected voice commands in depth.

A Navigation Aid for Blind People

2011

This paper presents a navigation aid for the blind based on a microcontroller with synthetic speech output. The system consists of two vibrators, two ultrasonic sensors mounted on the user's shoulders, and another one integrated into the cane. It is able to give the blind user information about urban walking routes and to provide real-time information on the distance of overhanging obstacles within 6 m along the travel path ahead. The system senses the surrounding environment via sonar sensors and sends vibro-tactile feedback to the user about the position of the closest obstacles in range. The ultrasonic cane is used to detect any obstacle on the ground. Experimental results show the effectiveness of the proposed system for blind navigation.
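The distance estimate behind such sonar sensors is a simple time-of-flight calculation; a generic sketch (not the paper's firmware, and the function names are illustrative):

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at ~20 degrees C

def echo_distance_m(echo_round_trip_s: float) -> float:
    """Distance to an obstacle from the round-trip time of an ultrasonic
    pulse: the pulse travels to the obstacle and back, hence the /2."""
    return SPEED_OF_SOUND_M_S * echo_round_trip_s / 2.0

def within_range(echo_round_trip_s: float, max_range_m: float = 6.0) -> bool:
    """True if the detected obstacle lies within the aid's 6 m detection range."""
    return echo_distance_m(echo_round_trip_s) <= max_range_m

# A 20 ms round trip corresponds to an obstacle about 3.43 m away.
print(round(echo_distance_m(0.020), 2))  # → 3.43
```

In a real aid, the measured distance would then drive the vibro-tactile intensity or the synthetic speech output.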

A Cost-effective Indoor Vibrotactile Navigation System for the Blind

This paper describes the development of an indoor vibrotactile navigation system for visually impaired people. We aimed at realizing a wearable, low-cost, and effective system able to help blind users in unknown indoor environments that they might visit occasionally, such as hospitals, airports, museums, etc. The designed system implements a Bluetooth (BT) localization service and provides wayfinding cues to the user by means of a wearable device equipped with five motors. The last part of our work describes early results obtained using electroencephalographic (EEG) analysis to evaluate the vibrotactile feedback.
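Wayfinding cues on a five-motor wearable can be produced by quantizing the bearing to the next waypoint into one of five sectors; a hypothetical sketch (the motor layout, sector width, and clamping are assumptions for illustration, not the paper's design):

```python
def motor_for_bearing(bearing_deg: float) -> int:
    """Quantize a signed bearing to the next waypoint (degrees, 0 = straight
    ahead, negative = left) into one of five motors, indexed 0 (far left)
    to 4 (far right). Sectors are 36 degrees wide, clamped at +/-90."""
    clamped = max(-90.0, min(90.0, bearing_deg))
    # Shift [-90, 90] to [0, 180] and divide into five 36-degree sectors.
    return min(4, int((clamped + 90.0) // 36))

print(motor_for_bearing(0.0))    # → 2 (center motor: go straight)
print(motor_for_bearing(-80.0))  # → 0 (leftmost motor: turn hard left)
print(motor_for_bearing(90.0))   # → 4 (rightmost motor: turn hard right)
```

The BT localization service would supply the user's position, from which the bearing to the next waypoint is computed before each cue is issued.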