Increasing Accessibility to the Blind of Virtual Environments, Using a Virtual Mobility Aid Based On the "EyeCane": Feasibility Study
Related papers
3D Virtual Environments for the Rehabilitation of the Blind
Lecture Notes in Computer Science, 2009
The acquisition of orientation and mobility skills is fundamental for blind people to develop an independent life. Activities aimed at reinforcing these skills ordinarily require direct interaction with real spaces, as well as the assistance of an educator or a companion. The objective of this study was to design, implement and evaluate 3D virtual environments for learning orientation and mobility in an unfamiliar environment. The procedure comprised a learning stage, in which the user learned to move about in the virtual environments; an interaction stage, in which he/she traveled through the environments virtually; and a final stage, in which the routes navigated virtually were traveled in the real world. The virtual surroundings were simulated with Unreal Engine, a game engine that allows scenarios to be constructed through a graphic editor. The results show that users were able to complete the established route without difficulty, from which it can be concluded that it is possible to form mental models of real places through virtual interactions guided only by auditory cues.
HOMERE: a Multimodal System for Visually Impaired People to Explore Virtual Environments
2003
The paper describes the HOMERE system: a multimodal system dedicated to visually impaired people to explore and navigate inside virtual environments. The system addresses three main applications: preparation for the visit of an existing site, training for the use of a blind cane, and ludic exploration of virtual worlds. The HOMERE system provides the user with different sensations when navigating inside a virtual world: a force feedback corresponding to the manipulation of a virtual blind cane, a thermal feedback corresponding to the simulation of a virtual sun, and an auditory feedback in spatialized conditions corresponding to the ambient atmosphere and specific events in the simulation. A visual feedback of the scene is also provided to enable sighted people to follow the navigation of the main user. HOMERE has been tested by several visually impaired people who were all confident about the potential of this prototype.
Virtual reality as a means to explore assistive technologies for the visually impaired
PLOS Digital Health
Visual impairment represents a significant health and economic burden affecting 596 million people globally. The incidence of visual impairment is expected to double by 2050 as our population ages. Independent navigation is challenging for persons with visual impairment, as they often rely on non-visual sensory signals to find the optimal route. In this context, electronic travel aids are promising solutions that can be used for obstacle detection and/or route guidance. However, electronic travel aids have limitations such as low uptake and limited training that restrict their widespread use. Here, we present a virtual reality platform for testing, refining, and training with electronic travel aids. We demonstrate its viability with an electronic travel aid developed in-house, consisting of a wearable haptic feedback device. We designed an experiment in which participants donned the electronic travel aid and performed a virtual task while experiencing a simulation of three different visual impa...
A Virtual Map to Support People Who are Blind in Navigation through Real Spaces
Journal of Special Education Technology, 2011
Most of the spatial information needed by sighted people to construct cognitive maps of spaces is gathered through the visual channel. Unfortunately, people who are blind lack the ability to collect the required spatial information in advance. The use of virtual reality as a learning and rehabilitation tool for people with disabilities has been on the rise in recent years. This research is based on the hypothesis that the advance supply of appropriate spatial information (perceptual and conceptual) through compensatory sensorial channels within a virtual environment may assist people who are blind in their anticipatory exploration and cognitive mapping of the unknown space. In this long-term research we developed and tested the BlindAid system, which combines 3D audio with a Phantom® haptic interface to allow the user to explore a virtual map through a hand-held stylus. The main goals of this research were to study the cognitive mapping process of people who are blind when exploring c...
The "EyeCane", a new electronic travel aid for the blind: Technology, behavior & swift learning
Restorative neurology and neuroscience, 2014
Independent mobility is one of the most pressing problems facing people who are blind. We present the EyeCane, a new mobility aid aimed at increasing perception of the environment beyond what is provided by the traditional White Cane for tasks such as distance estimation, navigation and obstacle detection. The "EyeCane" enhances the traditional White Cane by using tactile and auditory output to increase detectable distance and angles. It circumvents the technical pitfalls of other devices, such as weight, short battery life, complex interface schemes, and slow learning curves. It implements multiple beams to enable detection of obstacles at different heights, and narrow beams to provide active sensing that can potentially increase the user's spatial perception of the environment. Participants were tasked with using the EyeCane for several basic tasks with minimal training. Blind and blindfolded-sighted participants were able to use the EyeCane successfully for distance est...
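The distance-to-feedback principle described in this abstract can be illustrated with a minimal sketch. All names, ranges and the inverse-distance pulse-rate mapping below are illustrative assumptions, not the EyeCane's actual design:

```python
# Illustrative sketch: map a sensed obstacle distance to a pulse rate,
# so closer obstacles produce faster tactile/auditory pulses.
# The constants are hypothetical, not taken from the EyeCane.

MAX_RANGE_M = 5.0    # assumed maximum detectable distance
MIN_PULSE_HZ = 1.0   # slowest pulsing: obstacle at maximum range
MAX_PULSE_HZ = 20.0  # fastest pulsing: obstacle at zero distance

def pulse_rate_hz(distance_m: float) -> float:
    """Linearly interpolate pulse rate: nearer obstacle -> faster pulses."""
    if distance_m >= MAX_RANGE_M:
        return 0.0  # nothing in range: no feedback at all
    d = max(distance_m, 0.0)
    frac = 1.0 - d / MAX_RANGE_M  # 0 at max range, 1 at zero distance
    return MIN_PULSE_HZ + frac * (MAX_PULSE_HZ - MIN_PULSE_HZ)

for d in (0.5, 2.5, 6.0):
    print(f"{d} m -> {pulse_rate_hz(d):.1f} Hz")
```

Running several such mappings in parallel, one per beam, would give the multi-height, narrow-beam behavior the abstract describes.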
In-Place Virtual Exploration Using a Virtual Cane: An Initial Study
Companion Proceedings of the 2023 Conference on Interactive Surfaces and Spaces
In this initial study, we addressed the challenge of assisting individuals who are blind or have low vision (BLV) in familiarizing themselves with new environments. Navigating unfamiliar areas presents numerous challenges for BLV individuals. We sought to explore the potential of Virtual Reality (VR) technology to replicate real-world settings, thereby allowing users to virtually experience these spaces at their convenience, often from the comfort of their homes. As part of our preliminary investigation, we designed an interface tailored to facilitate movement for BLV users without needing physical mobility. Our study involved six blind participants. Early findings revealed that participants encountered difficulties adapting to the interface. Post-experiment interviews illuminated the reasons for these challenges, including issues with interface usage, the complexity of managing multiple interface elements, and the disparity between physical movement and interface use. Given the early stage of this research, these findings provide valuable insights for refining the approach and improving the interface in future iterations.
Virtual environments for the training of visually impaired
2002
In recent years researchers have started developing force feedback interfaces, which permit blind people not only to access bi-dimensional graphic interfaces (as was the case until now), but in addition to access information present on 3D Virtual Reality interfaces anticipating that the latter will be the natural form of information interchange in the very near future [1].
In this paper, the realization of a new kind of autonomous navigation aid is presented. The prototype, called AudiNect, is mainly developed as an aid for visually impaired people, though a larger range of applications is also possible. The AudiNect prototype is based on the Kinect device for Xbox 360. On the basis of the Kinect output data, proper acoustic feedback is generated, so that useful depth information from the 3D frontal scene can be easily developed and acquired. To this purpose, a number of basic problems have been analyzed in relation to visually impaired people's orientation and movement, through both actual experimentation and a careful review of the literature in the field. Quite satisfactory results have been reached and discussed, on the basis of proper tests on blindfolded sighted individuals.
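The depth-to-sound idea behind a system like AudiNect can be sketched as follows. This is a simplified illustration, not the actual AudiNect implementation; the band layout and the pan/pitch mapping are assumptions:

```python
# Simplified sketch in the spirit of depth-to-audio aids: split a depth
# frame into vertical bands, find the nearest obstacle per band, and turn
# it into a (pan, pitch) cue. The mapping constants are hypothetical.

def depth_to_cues(frame, n_bands=3, max_depth=4000):
    """frame: 2D list of depth readings in mm (0 = no reading).
    Returns one (pan, pitch_hz) cue per vertical band."""
    height, width = len(frame), len(frame[0])
    band_w = width // n_bands
    cues = []
    for b in range(n_bands):
        cols = range(b * band_w, (b + 1) * band_w)
        readings = [frame[r][c] for r in range(height) for c in cols
                    if frame[r][c] > 0]
        nearest = min(readings) if readings else max_depth
        pan = -1.0 + 2.0 * (b + 0.5) / n_bands   # -1 = left ... +1 = right
        # nearer obstacle -> higher pitch (assumed 200-1000 Hz range)
        pitch = 200 + 800 * (1.0 - min(nearest, max_depth) / max_depth)
        cues.append((round(pan, 2), round(pitch)))
    return cues

# Toy 2x6 frame: near obstacle on the left, far obstacle on the right.
frame = [[0, 1000, 0, 0, 3000, 0],
         [0, 1000, 0, 0, 3000, 0]]
print(depth_to_cues(frame))
```

A real system would feed cues like these into a spatialized-audio synthesizer at frame rate; the sketch only shows the geometric reduction step.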
Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 2018
Traditional virtual reality (VR) mainly focuses on visual feedback, which is not accessible for people with visual impairments. We created Canetroller, a haptic cane controller that simulates white cane interactions, enabling people with visual impairments to navigate a virtual environment by transferring their cane skills into the virtual world. Canetroller provides three types of feedback: (1) physical resistance generated by a wearable programmable brake mechanism that physically impedes the controller when the virtual cane comes in contact with a virtual object; (2) vibrotactile feedback that simulates the vibrations when a cane hits an object or touches and drags across various surfaces; and (3) spatial 3D auditory feedback simulating the sound of real-world cane interactions. We designed indoor and outdoor VR scenes to evaluate the effectiveness of our controller. Our study showed that Canetroller was a promising tool that enabled visually impaired participants to navigate different virtual spaces. We discuss potential applications supported by Canetroller ranging from entertainment to mobility training.
International Association for Development of the Information Society, 2018
Virtual reality applications for blind people on smartphones were used to make advance virtual visits to unknown spaces. These applications need to include a set of cognitive and sensitive interfaces that allow users to draw on their other sensory capabilities to understand information about their environment and to facilitate interaction with the application, so that the user can build a mental representation of the unknown space. Strategies were designed to provide continuous and clear information to the user, so that he or she can perform exploration activities within a virtual environment generated from a real environment. Nineteen blind people and five visually impaired people participated in the development and testing, carried out over six workshops during twenty-four months. During each workshop, logs were kept of the activities the user performed to recognize and locate objects and structures indoors. This information was stored in a database to be analyzed and interpreted in order to make subsequent modifications to the application, until achieving a tool that is sufficiently useful, safe and accepted by users. The final applications were built with voice patterns, beeps, vibrations and gestures, called sensitive interfaces, and also with a cognitive interface called "Focus of attention" based on proximity and remote exploration. Performance improved by thirty-eight percent when participants chose remote exploration of the virtual environment over proximity exploration; participants also preferred low-to-medium-frequency beeps, fast voice playback for receiving information on objects and structures, and simple gestures for interacting with the smartphone.
In the last experiment, we used a structure sensor coupled to the smartphone for user tracking, and bone-conduction headphones to reproduce spatial sounds. Participants reported that hearing 3D sounds with a personalized response through the bone-conduction earphones was pleasing for locating objects inside the test scenario; object localization improved by twenty-one percent when beeps were used instead of voices or musical instruments.