MAKING TOUCH-BASED MOBILE PHONES ACCESSIBLE FOR THE VISUALLY IMPAIRED

Today an increasing share of mobile phones are so-called "smartphones". They come equipped with touch-based screens, enabling easier and more efficient interaction than standard buttons allow. However, such screens require visual navigation, seemingly ruling out access for the visually impaired. In this paper we ask: how can touch-based phones be extended to become usable by the visually impaired? Two candidate technologies are screen readers and haptics (tactile feedback). We propose a solution that combines voice and haptics. Design research was chosen as the methodology for the project; it emphasizes developing a solution over the course of several iterations and evaluating the product with external participants. The research contribution is an Android-based prototype that demonstrates the new user interface, allowing the visually impaired to interact seamlessly with a smartphone. Operation relies on voice and haptic feedback: the user receives information when tapping or dragging a finger across the screen. The proposed solution is unique in several ways: it keeps gestures to a minimum, it does not rely on physical keys, and it takes screen-reader and haptic technologies one step further.
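The interaction described above can be sketched in plain Java as follows. This is an illustrative model, not the authors' implementation: the class `ExploreByTouch`, the grid layout, and the `"VIBRATE"`/`"SPEAK:"` feedback markers are all assumptions standing in for Android's `Vibrator` and `TextToSpeech` services. The key idea it demonstrates is that feedback fires only when the finger crosses into a new on-screen item, so a drag announces each item exactly once.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of explore-by-touch over a grid of virtual buttons.
// Dragging a finger across the screen triggers a haptic pulse and a voice
// announcement each time the finger enters a new item.
public class ExploreByTouch {
    private final String[][] grid;       // item labels laid out in rows/columns
    private final int screenW, screenH;  // screen size in pixels (assumed)
    private int lastRow = -1, lastCol = -1;
    private final List<String> feedbackLog = new ArrayList<>();

    public ExploreByTouch(String[][] grid, int screenW, int screenH) {
        this.grid = grid;
        this.screenW = screenW;
        this.screenH = screenH;
    }

    // Called for every touch/drag sample (x, y in pixels), e.g. from
    // an Android onTouchEvent handler in a real prototype.
    public void onTouch(int x, int y) {
        int row = y * grid.length / screenH;
        int col = x * grid[0].length / screenW;
        if (row != lastRow || col != lastCol) {
            // Crossing into a new item: haptic pulse, then speak its label.
            feedbackLog.add("VIBRATE");
            feedbackLog.add("SPEAK:" + grid[row][col]);
            lastRow = row;
            lastCol = col;
        }
        // Staying within the same item produces no repeated feedback,
        // so the voice output does not chatter during a drag.
    }

    public List<String> getFeedbackLog() { return feedbackLog; }
}
```

In a real Android prototype the two feedback markers would map to `Vibrator.vibrate(...)` and `TextToSpeech.speak(...)`; the grid stands in for whatever layout of on-screen controls the application uses.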
