A principled design methodology for auditory interaction

Auditory Information Design

Auditory Information Design, 1998

The prospect of computer applications making “noises” is disconcerting to some. Yet the soundscape of the real world does not usually bother us. Perhaps we only notice a nuisance? This thesis is an approach for designing sounds that are useful information rather than distracting “noise”. The approach is called TaDa because the sounds are designed to be useful in a Task and true to the Data.

Previous researchers in auditory display have identified issues that need to be addressed for the field to progress. The TaDa approach is an integrated approach that addresses an array of these issues through a multifaceted system of methods drawn from HCI, visualisation, graphic design and sound design. A task analysis addresses the issue of usefulness. A data characterisation addresses perceptual faithfulness. A case-based method provides semantic linkage to the application domain. A rule-based method addresses psychoacoustic control. A perceptually linearised sound space allows transportable auditory specifications. Most of these methods have not been used to design auditory displays before, and each has been specially adapted for this design domain.

The TaDa methods have been built into computer-aided design tools that can assist the design of a more effective display, and may allow less experienced designers to make effective use of sounds. The case-based method is supported by a database of examples that can be searched by an information analysis of the design scenario. The rule-based method is supported by a direct-manipulation interface which shows the available sound gamut of an audio device as a 3D coloured object that can be sliced and picked with the mouse. These computer-aided tools are the first of their kind to be developed in auditory display.

The approach, methods and tools are demonstrated in scenarios from the domains of mining exploration, resource monitoring and climatology. These practical applications show that sounds can be useful in a wide variety of information processing activities which have not been explored before. The sounds provide information that is difficult to obtain visually, and improve the directness of interactions by providing additional affordances.
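As a loose illustration of what a perceptually linearised mapping might look like, the sketch below maps a normalised data value onto equal MIDI pitch steps, which listeners hear as roughly uniform musical intervals. The function names and note range are invented for this sketch and are not part of the TaDa tools.

```python
def data_to_midi(value, vmin, vmax, note_lo=48, note_hi=84):
    """Map a data value onto equal pitch steps (MIDI note numbers).

    Equal steps in MIDI note number are equal musical intervals, which
    listeners judge as roughly uniform -- a crude stand-in for a
    perceptually linearised sound dimension.
    """
    t = (value - vmin) / (vmax - vmin)          # normalise to 0..1
    return round(note_lo + t * (note_hi - note_lo))

def midi_to_hz(note):
    """Convert a MIDI note number to frequency in hertz (A4 = note 69 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)
```

A real perceptual linearisation would be calibrated against listening tests rather than assumed from the pitch scale, but the shape of the mapping is the same.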

A Generic, Semantically-Based Design Approach for Spatial Auditory Computer Displays

2004

This paper describes a design approach for creating generic computer user interfaces with spatial auditory displays. It proposes a structured depiction process, from formulating mode-independent descriptions of user interfaces (UIs) to audio rendering methods for virtual environments. As the key step in the process, a semantic taxonomy of user interface content is proposed. The ultimate goal is to find semantic classifications of UI entities that correspond to properties of auditory objects. We believe that this abstract approach detaches the process from visual paradigms and will reveal valuable insights into the representation of user interfaces in the auditory domain.
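The idea of classifying UI entities semantically and rendering each class with fixed auditory properties can be caricatured in a few lines. The class names and auditory parameters below are invented placeholders, not the paper's actual taxonomy.

```python
# Invented toy taxonomy: semantic class of a UI entity -> auditory object
# properties. A real taxonomy would be derived from the depiction process,
# not hard-coded like this.
TAXONOMY = {
    "container":  {"timbre": "broadband",  "spatial_extent": "wide"},
    "actionable": {"timbre": "percussive", "spatial_extent": "point"},
    "status":     {"timbre": "tonal",      "spatial_extent": "ambient"},
}

def render_entity(ui_entity):
    """Look up the auditory rendering for a UI entity by its semantic class."""
    props = TAXONOMY[ui_entity["class"]]
    return {"label": ui_entity["name"], **props}
```

The point of the sketch is only the indirection: the renderer never inspects visual attributes, only the semantic class.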

Visually Supported Design of Auditory User Interfaces

2003

This paper describes the functionality, the models used, and an evaluation of the Visual Auditory Interface Design (Visual-AID) prototyping tool, which assists the usability engineering process for Auditory User Interfaces (AUIs). The process generalizes the experience gained in the INVITE-ZIB project, which created an auditory Web browser for blind users (Donker et al. 2002). The complete user-centered design process defines three main components: guidelines and rules, auditory interaction objects (audIO), and a tool to generate evaluation mock-ups.

Towards a Modeling Language for Designing Auditory Interfaces

Universal Access in Human- …, 2009

Auditory applications are systems that communicate content, navigation capabilities and functionality mainly via the aural channel, or via a combination of the aural and visual channels, and can also support user interaction in a multimodal fashion (e.g. through touch or speech). In this paper, we present the preliminary results of an exploratory research effort aimed at establishing a design modeling language for auditory applications by extending an existing interactive application design model (IDM, the Interactive Dialogue Model) used in the area of hypermedia and information-intensive applications. Our exploratory research capitalizes on previous experience in hypermedia modeling, aural information architectures, and the design of auditory applications. We use an auditory application, the Acoustic Edutainment Interface (AEDIN), as a real case study to inform and exemplify the use of the modeling language.

Keywords Sound Design, Auditory Display, Multimodal Interaction, Product Design, Interactive Arts and Music, Sound Perception and Cognition, Sound Modelling

2008

Sonic Interaction Design (SID) is an emerging field that is positioned at the intersection of auditory display, ubiquitous computing, interaction design, and interactive arts. SID can be used to describe practice and inquiry into any of various roles that sound may play in the interaction loop between users and artifacts, services, or environments, in applications that range from the critical functionality of an alarm, to the artistic significance of a musical creation. This field is devoted to the privileged role the auditory channel can assume in exploiting the convergence of computing, communication, and interactive technologies. An overemphasis on visual displays has constrained the development of interactive systems that are capable of making more appropriate use of the auditory modality. Today the ubiquity of computing and communication resources allows us to think about sounds in a proactive way. This workshop puts a spotlight on such issues in the context of the emerging domain of SID.

Layered Software Architecture for Designing Environmental Sounds in Non-Visual Interfaces

2000

This paper presents a partially implemented, layered software architecture for describing and designing environmental sounds (everyday sounds) in non-visual interfaces, based on a new sound model (audio framework). This architecture facilitates non-speech audio presentation of objects and interaction mechanisms in non-visual interfaces. The physical layer, the system sound software layer, the sound analyser/synthesiser layer and the interface layer define the different layers of the architecture.
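A minimal sketch of how such a layered stack could delegate downward, with each layer talking only to the one beneath it. The class and method names are invented for this sketch, and the toy "synthesis" merely sizes a silent buffer.

```python
class PhysicalLayer:
    """Bottom layer: stand-in for the audio device; records what it plays."""
    def __init__(self):
        self.played = []  # lengths of buffers sent to the "hardware"
    def play(self, samples):
        self.played.append(len(samples))

class SystemSoundLayer:
    """Wraps the platform's sound services; hands buffers to the device."""
    def __init__(self, physical):
        self.physical = physical
    def play_buffer(self, samples):
        self.physical.play(samples)

class SynthesiserLayer:
    """Turns abstract sound events into sample buffers (toy synthesis)."""
    def __init__(self, system):
        self.system = system
    def synthesise(self, event):
        samples = [0.0] * (event["size"] * 100)  # longer sound for bigger objects
        self.system.play_buffer(samples)

class InterfaceLayer:
    """Top layer: translates interface happenings into sound events."""
    def __init__(self, synth):
        self.synth = synth
    def object_selected(self, obj):
        self.synth.synthesise({"size": obj["size"]})

# Wire the stack bottom-up and drive it from the top.
physical = PhysicalLayer()
stack = InterfaceLayer(SynthesiserLayer(SystemSoundLayer(physical)))
stack.object_selected({"size": 3})
```

Each layer can be swapped independently (for example, a different synthesiser), which is the usual payoff of this kind of layering.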

A Methodological Framework for Auditory Display Design

2013

Common practice in the design of auditory display is hardly ever based on any structured design methodology. This leaves audio widely underused, or used inappropriately and inefficiently. We analyse the current status of research in this context and develop requirements for a methodological framework for auditory display design. Based on these requirements, we have created a framework of methods to capture, transfer and apply design knowledge based on design patterns (paco). We present the context space as the organising principle to conceptualise the design space, facilitating the matching of design knowledge with solutions and the workflow. Finally, we elaborate on how we intend to evaluate the framework and how it can be supported by tools.
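As a toy illustration of matching a design scenario against patterns annotated with context attributes, the sketch below scores each pattern by how many attributes it shares with the scenario. The pattern names and attributes are invented examples, not taken from the paco framework.

```python
# Invented example patterns, each annotated with a position in a crude
# "context space" of discrete attributes.
PATTERNS = [
    {"name": "auditory alarm",
     "context": {"urgency": "high", "attention": "interrupt"}},
    {"name": "ambient soundscape",
     "context": {"urgency": "low", "attention": "peripheral"}},
]

def match_pattern(scenario):
    """Return the name of the pattern sharing the most context attributes."""
    def overlap(pattern):
        return sum(1 for k, v in scenario.items()
                   if pattern["context"].get(k) == v)
    return max(PATTERNS, key=overlap)["name"]
```

A real context space would use richer dimensions and similarity measures, but the workflow (describe the scenario, retrieve the closest design knowledge) has this shape.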

Interface Design and Dynamic Audio

Lecture Notes in Computer Science, 2014

In the age of digital devices, text, image, sound and interactivity blend into a symbiotic and unique medium, presenting a multifaceted species of language called hypermedia. For many years, however, there has been a notable emphasis on visual communication in interfaces, and owing to its limitations, products and services in design can often present inconsistencies when other sensory properties are relevant, as in the case of sound information. This over-emphasis on visual displays has constrained the development of interactive systems that are capable of making better use of the auditory modality. Recognizing HCI as an integrating element of visual, sound and tactile media and metaphors, this study demonstrates investigations that contextualize the role of sound in interactive environments by proposing an overview of the term interactive sound, suggesting its classification into direct-interactive and indirect-adaptive sounds, and pointing out its meanings and applications.