Constructing relational diagrams in audio: the multiple perspective hierarchical approach
Related papers
Interactive hierarchy-based auditory displays for accessing and manipulating relational diagrams
Journal on Multimodal User Interfaces, 2011
An approach to designing hierarchy-based auditory displays that supports non-visual interaction with relational diagrams is presented. The approach is motivated by an analysis of the functional and structural properties of relational diagrams in terms of their role as external representations. This analysis informs the design of a multiple perspective hierarchy-based model that captures modality-independent features of a diagram when translating it into an audio-accessible form. The paper outlines design lessons learnt from two user studies that were conducted to evaluate the proposed approach.
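To make the idea of multiple perspectives concrete, the sketch below (plain Python with invented names; it is not the model described in the paper) exposes the same set of diagram edges through two hierarchies, one rooted in the entities and one rooted in the relation types:

```python
from collections import defaultdict

# Hypothetical sketch: one relational diagram, two hierarchical "perspectives".
# The names (Diagram, by_entity, by_relation) are illustrative, not the paper's.

class Diagram:
    def __init__(self, edges):
        # edges: list of (source, relation, target) triples
        self.edges = edges

    def by_entity(self):
        """Perspective 1: each entity is a branch listing its outgoing relations."""
        tree = defaultdict(list)
        for src, rel, dst in self.edges:
            tree[src].append((rel, dst))
        return dict(tree)

    def by_relation(self):
        """Perspective 2: each relation type is a branch listing the pairs it connects."""
        tree = defaultdict(list)
        for src, rel, dst in self.edges:
            tree[rel].append((src, dst))
        return dict(tree)

uml = Diagram([
    ("Customer", "places", "Order"),
    ("Order", "contains", "LineItem"),
    ("Customer", "owns", "Account"),
])
print(uml.by_entity())    # browse the diagram node by node
print(uml.by_relation())  # browse the same diagram relation by relation
```

An audio-only interface could then let a listener browse either hierarchy, depending on whether the question at hand is "what does this entity connect to?" or "which entities participate in this kind of relation?".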
Comparing interaction strategies for constructing diagrams in an audio-only interface
2008
Although research on non-visual access to visualisations is steadily growing, very little work has investigated strategies for constructing such forms of representation through non-visual means. This paper describes the design of two interaction strategies for constructing and manipulating relational diagrams in audio. We report on a study that compared the two strategies, and discuss their advantages and disadvantages in terms of how efficiently they support the activity of constructing diagrams in an audio-only interface.
Using hierarchies to support non-visual access to relational diagrams
2007
This paper describes an approach to support non-visual exploration of graphically represented information. We used a hierarchical structure to organize the information encoded in a relational diagram and designed two alternative audio-only interfaces for presenting the hierarchy, each employing a different level of verbosity. We report on an experimental study that assessed the viability of our proposed approach as well as the efficiency and learnability of each interface. Our results show that the relational information encoded in a diagram could be non-visually navigated and explored through a hierarchy, and that substituting verbal descriptions of parts of such information with nonverbal sounds significantly improves performance without compromising comprehension.
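As a hedged illustration of the verbosity trade-off (the earcon names and the announce() helper are hypothetical, not the interfaces evaluated in this study), the same hierarchy node can be rendered either entirely in speech or with a short nonverbal sound standing in for part of the description:

```python
# Illustrative only: the earcon file names and announce() API are invented for
# this sketch and are not taken from the paper's interfaces.

NODE_TYPE_EARCONS = {
    "class": "ping_high.wav",        # short pitched tone stands in for "this is a class"
    "association": "click_low.wav",  # percussive click stands in for "this is a relation"
}

def announce(node_label, node_type, verbose=True):
    """Return the (sound, speech) pair an audio-only interface would render."""
    if verbose:
        # Verbose mode: everything is spoken.
        return (None, f"{node_type} {node_label}")
    # Terse mode: the node type becomes a nonverbal sound; only the label is spoken.
    return (NODE_TYPE_EARCONS[node_type], node_label)

print(announce("Order", "class", verbose=True))   # (None, 'class Order')
print(announce("Order", "class", verbose=False))  # ('ping_high.wav', 'Order')
```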
Issues in the Non-Visual Presentation of Graph Based Diagrams
2004
One aspect of non-visual visualisation is providing accessibility to diagrams for users with visual impairments. In this paper we review the literature on diagrams and non-visual presentation in order to elucidate the issues involved in making graph-based diagrams accessible using speech and non-speech sound. First we examine the nature, scope and uses of these diagrams. We then describe the nature of diagrams: how do diagrams differ from other representations; how do sighted readers read, understand, and extract information from diagrams; what cognitive processes do diagrams facilitate; and what factors affect how diagrams may be understood? After a comparison of visual with aural presentation we discuss the work presented by others in this field, paying particular attention to the rationale behind their implementation choices in light of our examination of visual comprehension. The paper concludes with a discussion of how these issues combine and conflict to influence requirements for interface design.
Making Graph-Based Diagrams Work in Sound: The Role of Annotation
Human–Computer Interaction, 2013
Non-linear forms of diagrammatic presentation, such as node-arc graphs, are a powerful and elegant means of visual information presentation. While providing non-visual access is now routine for many forms of linear information, it becomes more difficult as the structure of the information becomes increasingly non-linear. An understanding of the ways in which graphs benefit sighted people, based on experiments and the literature, together with the difficulties encountered when exploring graphs non-visually, helps form a solution for non-visual access to graphs. This paper proposes that differing types of annotation offer a powerful and flexible technique for transferring the benefits of graph-based diagrams, as well as for reducing disorientation while moving around the graph and for tackling some of the inherent disadvantages of using sound. Different forms of annotation that may address these problems are explored, classified and evaluated, including notes designed to summarise and to aid node differentiation. Graph annotation may be performed automatically, creating a graph that evaluation shows requires less mental effort to explore, and on which tasks can be achieved more effectively and more efficiently.
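One simple way to picture automatic annotation of this kind (a generic sketch under assumed conventions, not the paper's annotation scheme) is to attach a machine-generated summary note to every node, so that a listener hears how a node is connected on arrival rather than having to re-walk its edges to stay oriented:

```python
from collections import defaultdict

def annotate(edges):
    """Attach a short, automatically generated note to each node of an undirected graph.

    Each note summarises degree and neighbours, so an audio interface could read
    it out when the listener arrives at a node.
    """
    adjacency = defaultdict(set)
    for a, b in edges:
        adjacency[a].add(b)
        adjacency[b].add(a)
    return {
        node: f"{node}: {len(neighbours)} connection(s), linked to {', '.join(sorted(neighbours))}"
        for node, neighbours in adjacency.items()
    }

# Example graph: A-B, A-C, B-C, C-D
for note in annotate([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]).values():
    print(note)
```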
Collaborating through sounds: audio-only interaction with diagrams
2010
The widening spectrum of interaction contexts and users' needs continues to expose the limitations of the Graphical User Interface. But despite the benefits of sound in everyday activities and considerable progress in Auditory Display research, audio remains under-explored in Human-Computer Interaction (HCI). This thesis seeks to contribute to unveiling the potential of using audio in HCI by building on and extending current research on how we interact with and through the auditory modality.
2020 IEEE Frontiers in Education Conference (FIE)
This Research Full Paper presents the design and evaluation of an auditory interface that allows blind people to perceive workspace awareness elements in a shared environment for modeling diagrams. Approximately 36 million Brazilians are visually impaired, of whom half a million are blind, and the enrollment of blind students at Brazilian public higher education institutions has been increasing. However, accessible tools for diagram authoring are scarce, despite their necessity in STEM courses. There are studies and new developments regarding the perception and authoring of diagrammatic information, but they do not consider collaborative features. This study therefore proposes the sonification of workspace awareness elements as a way to enable accessible groupware features for the blind. Auditory cues were designed for three awareness categories (who, what, and where) using abstract sounds and favoring metaphors that support an intuitive mapping. The designed auditory cues were implemented in a prototype application for collaborative diagram modeling and evaluated through an objective usability test. The results demonstrated that participants could reasonably distinguish who did what, and where. In addition, it was possible to identify which awareness categories need further work.
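A minimal sketch of the general idea follows; the cue names, file names, and mappings are invented for illustration and are not the cues designed in this study. Each workspace event is decomposed into who, what, and where, and each category is mapped to a distinct auditory dimension so the three answers can be layered in a single notification:

```python
# Hypothetical mapping of awareness categories to abstract sounds; the actual
# cues and metaphors used in the study are not reproduced here.

WHO = {"ana": "marimba_c4.wav", "bruno": "marimba_g4.wav"}   # one timbre per collaborator
WHAT = {"add_node": "rising_sweep.wav",
        "delete_node": "falling_sweep.wav",
        "add_edge": "double_tick.wav"}                        # one sound shape per action
WHERE = {"left": -1.0, "centre": 0.0, "right": 1.0}           # stereo pan for workspace region

def awareness_cue(actor, action, region):
    """Compose one notification: who (timbre), what (sound shape), where (panning)."""
    return {
        "who_sound": WHO[actor],
        "what_sound": WHAT[action],
        "pan": WHERE[region],
    }

print(awareness_cue("ana", "add_node", "left"))
```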
Using low cost devices to support non-visual interaction with diagrams & cross-modal collaboration
This report presents a series of explorations into the feasibility of using low-cost devices to design support for non-visual interaction with diagrams. These explorations are a follow-up to the Collaborative Cross-modal Interfaces (CCmI) project, which explored the potential of using multimodal input and output technologies (audio, haptics, graphics) to improve the accessibility of collaboration between visually-impaired and sighted individuals when using diagrams in the workplace.
We present an approach that examines the design of auditory displays for accessing graphically represented information in terms of their roles as external representations. This approach describes how a cross-modal translation process should emphasise the semantics of the represented information rather than the structural features of the medium that presents it. We exemplify this by exploring the design of a hierarchical representation to organise relational information encoded in a UML class diagram, and describe two alternative presentation modes for communicating this structure auditorily. We report on an experiment that we conducted to assess the viability of our approach, and describe a novel methodological analysis that extends existing evaluation techniques to formally examine how a group of users learns and develops interactive expertise when using this auditory display.