Blind User Visualization and Interaction in Touch Screen: A Designer Perspective

Designing a learning environment for visually impaired and blind persons in order to develop touch access to digital content

Modelling, Measurement and Control C

Through screen readers and Braille displays, trained blind persons can nowadays access many computer-based activities. However, graphical interfaces and content where the spatial dimension is essential for understanding, such as charts, pictures, or the majority of videogames, remain largely inaccessible. The Tactos and Intertact.net technologies aim to overcome these limits by providing an efficient sensory supplementation technology that enables blind users to access the spatial dimension of content through touch. Following a participatory design approach, we have worked in cooperation with blind persons to develop a learning environment for touch access to digital content with Tactos. Adoption is important when developing technologies, and we report here on the research we conducted to enable blind persons to learn our system independently. From our perspective, this possibility is a cornerstone for the development of a users' community.

Designing Accessible Visualizations: The Case of Designing a Weather Map for Blind Users

Universal Access in Human-Computer Interaction. Design Methods, Tools, and Interaction Techniques for eInclusion, 2013

Major strides have been made to improve the accessibility of text-based documents for blind users; however, visualizations still remain largely inaccessible. The AISP framework represents an attempt to streamline the design process by aligning the information-seeking behaviors of a blind user, assisted by auditory feedback, with those of a sighted user. Given the recent popularity of touch-based devices and the success of the Talking Tactile Tablet, we suggest that the AISP framework be extended to include the sense of touch. This research-in-progress paper proposes such an extended design framework, MISD. In addition, the article presents preliminary work on designing an accessible weather map based on our theory-driven design. A discussion and an outline of future work conclude the manuscript.

A comparative study of multimodal digital map interface designs for blind users

International Journal of Artificial Intelligence and Soft Computing, 2015

This research project investigated the issues and opportunities associated with making digital mapping and spatial data more accessible and usable for blind people. Geographical information systems (GIS) enable the storage and manipulation of raster and vector data based upon the spatial relationships of individual features. While GIS predominantly rely on the visual medium for the presentation of data, the ability to store and manipulate spatial relationships offers opportunities to present data using alternative modalities such as speech, sound, and haptic feedback. A group of eight blind participants took part in a task-driven experiment using two interface designs. The effectiveness of each interface design was assessed through task performance and user experience questionnaires. The results suggest that interface design, data presentation, and the use of multiple modalities have the potential to enhance accessibility and usability and to support spatial cognition.

Analysing the Mental Model of Blind Users in Mobile Touch Screen Devices for Usability

2013

Understanding mental models can help to make sense of usability problems in software design. Yet software engineers and developers face many difficulties in properly understanding user behaviour during interaction with a system. This paper presents a mental model of born-blind users of touch screen devices to aid developers in designing and building better applications for blind users. The work of Kurniawan was chosen as the basis for examining the mental model of blind users. A user case study was carried out to assess the subjective mental model of users with regard to application usability. Our research revealed that two stages, namely listening and interaction, are used to facilitate the development of the Kurniawan mental model. This paper also suggests eight usability features that can be added to facilitate the design phase of the system. As far as is known, this is a novel suggestion.

Analyzing App Inventor for Building Usable Touch Screen Courseware for Blind Users

International Conference on eBusiness, eCommerce, eManagement, eLearning and eGovernance 2014, 2014

This paper focuses on how to design and develop a usable Touch Screen Mobile Courseware Application (TMCA) using usability features recommended in the literature. For our study, we consider usability features that are imperative for blind users, such as widget usability, validation, feedback, and navigation. We explain the usability features needed to address the relevant usability issues in the design and development of a TMCA for blind users. We also suggest appropriate locations for control items or widgets on a flat screen. Furthermore, we recommend ways to satisfy usability features through existing features. The study shows that prospective educators-cum-developers can implement usability features as a basis for addressing usability for blind users to a significant extent. Although the usability features are guided by the literature, they are merely defined there: no guide demonstrates to designers and developers how exactly to achieve them. The approach proposed in this paper makes the TMCA more usable for blind users and more likely to achieve its objectives.

Designing interactive content with blind users for a perceptual supplementation system

Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction - TEI '13, 2013

With the spread of ICT and the Internet over the last two decades, more and more tools rely on graphical interfaces that make them scarcely accessible to the visually impaired. The ITOIP project aims at developing the use of a tactile sensory supplementation system called Tactos. A partnership with a visually impaired persons (VIP) association allowed us to conduct a participatory design approach intended to gather a first community of users around our system. This article reports on the design approach we implemented in order to develop usable and useful applications for VIP users. Through a rapid prototyping process, we address the development of the use of our technology with representative blind users. We present the interaction and use principles that emerged from the design of three Tactos applications: a tutorial, a city plan exploration system, and a country-level map application.

Eliciting Usability from Blind User Mental Model for Touch Screen Devices

Transactions on Engineering Technologies, 2014

A novel mental model of born-blind users of touch screen devices is presented, to aid software developers in designing and building applications with better usability for blind users. The work of Kurniawan was the basis for examining the mental model of blind users. A user case study was carried out to assess the subjective mental model of users with regard to application usability. This revealed that two stages, listening and interaction, are used to facilitate the development of the Kurniawan mental model. This paper also suggests twelve usability features that can be added to facilitate the design phase of the system.

Blind-friendly user interfaces – a pilot study on improving the accessibility of touchscreen interfaces

Multimedia Tools and Applications, 2019

Touchscreen devices such as smartphones, smartwatches, and tablets are essential assistive devices for visually impaired and blind people in performing activities of daily living. Vision-alternative accessibility services such as screen readers, multimodal interactions, vibro-tactile and haptic feedback, and gestures help blind people operate touchscreen interfaces. Part of the usability problem with today's touchscreen user interfaces lies in trade-offs involving discoverability, navigational complexity, cognitive overload, layout persistency, cumbersome input mechanisms, accessibility, and cross-device interactions. One solution to these problems is to design an accessibility-inclusive, blind-friendly user interface framework for performing common activities on a smartphone. This framework re-organizes and re-generates the interface components into a simplified blind-friendly user interface based on the user profile and contextual recommendations. The paper reports an improvement in the user experience of blind people performing activities on a smartphone. Forty-one blind people participated in this empirical study, which showed improved user and interaction experience when operating a smartphone.

Drawing for Blind Learners: Assistive Technology for Graphical Design

2016 IEEE 16th International Conference on Advanced Learning Technologies (ICALT), 2016

One of the main problems faced by blind learners is a lack of drawing technologies that support image and diagram drawing without the help of a sighted support worker. Even though some technologies have been experimented with in the past, blind learners have not been keen on tactile drawing due to the difficulty of the drawing task, the length of time taken to complete a simple task, and the inefficiency of the drawing experience. This paper presents a set of grammar tools that blind people can use to establish spatial orientation, as well as knowledge and representation of the interactive drawing environment, to improve the understanding of a concept or subject matter. The grammar can be adopted and personalised in education and learning when presenting technical diagrams. It enables users to present graphics by mapping the cognitive visualisation of blind people into spatial information on a computer screen. This technique promotes an interactive and easy drawing environment for building objects, associations, and layout information through zooming, navigation, and grouping. It opens future possibilities such as 3D world modelling, printing, and multisensory integration of input and output methods.