Audio stickies (visually-guided spatial audio annotations on a mobile augmented reality platform)

Annotation in outdoor augmented reality

Computers & Graphics, 2009

Annotation, the process of adding extra virtual information to an object, is one of the most common uses for augmented reality. Although annotation is widely used in augmented reality, there is no generally agreed-upon definition of what precisely constitutes an annotation in this context. In this paper, we propose a taxonomy of annotation, describing what constitutes an annotation and outlining the different dimensions along which annotations can vary. Using this taxonomy, we also highlight which styles of annotation are used in different types of applications and identify areas where further work is needed to improve annotation.

Guided by Voices: An Audio Augmented Reality System

International Conference on Auditory Display, 2000

This paper presents an application of a low-cost, lightweight, audio-only augmented reality infrastructure. The system uses a simple wearable computer and an RF-based location system to play digital sounds corresponding to the user's location and current state. Using this infrastructure, we implemented a game in the fantasy genre where players move around in the real world and …
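
The core mechanism the abstract describes, playing a sound keyed on the player's RF-derived location and game state, can be sketched as a simple lookup table. This is a minimal illustration, not the paper's implementation; the zone names, states, and sound files are invented for the example.

```python
# (zone, game_state) -> sound clip; all names below are invented for illustration
SOUND_TABLE = {
    ("churchyard", "searching"): "ambient_wind.wav",
    ("churchyard", "found_clue"): "ghost_whisper.wav",
    ("tavern", "searching"): "crowd_murmur.wav",
}

def sound_for(zone, state, default="silence.wav"):
    """Return the clip to play for the current RF zone and game state,
    falling back to a default when no entry exists."""
    return SOUND_TABLE.get((zone, state), default)
```

In practice the wearable would query its RF positioning system for the current zone, feed the result through a table like this, and hand the clip to an audio player.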

Online creation of panoramic augmented reality annotations on mobile phones

2012

Jim Spohrer first envisioned the idea of superimposing georeferenced information using augmented reality (AR) in his 1999 essay on the WorldBoard.1 This idea has recently gained popularity with applications such as Layar (http://layar.com), which use camera phones equipped with a compass and GPS as an inexpensive, albeit crude, platform for AR. However, GPS sensors and compasses have limited accuracy and can't provide precise pose information. Furthermore, these sensors have update rates of approximately 1 Hz, so overlays onto the live video image in a mobile phone's viewfinder are roughly placed, sometimes resembling a directional hint rather than an overlay matched to an exact location. Here, we present a novel system that improves compass accuracy using vision-based orientation tracking, enabling accurate object registration. However, vision tracking can only work in relation to an image database or 3D reconstruction, which must either be predetermined or constructed on the fly. We thus employ a natural-feature mapping and tracking approach for mobile phones that's efficient and robust enough to track with three degrees of freedom. By assuming pure rotational movements, the system creates a panoramic map from live video on the fly and simultaneously tracks from it.
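
The pure-rotation assumption is what makes the panoramic map feasible: if the camera only rotates, each pixel can be projected onto a cylinder by its viewing angle alone, with no depth information needed. A minimal sketch of that projection follows; the function name and the simplified pinhole model (known focal length, yaw-only rotation, no lens distortion) are assumptions for illustration, not the paper's actual mapping code.

```python
import math

def pixel_to_panorama(x, y, yaw, f, pano_width, pano_height):
    """Map a pixel (x, y), given in camera coordinates centred on the
    principal point, into a cylindrical panorama of size
    pano_width x pano_height, assuming the camera only rotates about
    the vertical axis by `yaw` radians (focal length f, in pixels)."""
    theta = yaw + math.atan2(x, f)       # azimuth of the viewing ray
    h = y / math.hypot(x, f)             # normalised height on the cylinder
    u = (theta / (2 * math.pi)) % 1.0 * pano_width   # wrap around 360 degrees
    v = pano_height / 2 + h * f          # scale height back to pixels
    return u, v
```

Tracking then runs in the opposite direction: features in the current frame are matched against the already-filled part of the panorama, and the yaw (plus pitch and roll in the full 3-DoF system) that best aligns them becomes the orientation estimate.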

“Anywhere Augmentation”: Towards Mobile Augmented Reality in Unprepared Environments

Location Based Services and TeleCartography, Lecture Notes in Geoinformation and Cartography, 2007

We introduce the term "Anywhere Augmentation" to refer to the idea of linking location-specific computing services with the physical world, making them readily and directly available in any situation and location. This chapter presents a novel approach to "Anywhere Augmentation" based on efficient human input for wearable computing and augmented reality (AR). Current mobile and wearable computing technologies, as found in many industrial and governmental service applications, do not routinely integrate the services they provide with the physical world. Major limitations in the computer's general scene understanding abilities and the infeasibility of instrumenting the whole globe with a unified sensing and computing environment prevent progress in this area. Alternative approaches must be considered.

Mobile augmented reality with audio

In this article, the use of augmented reality with a smartphone for fieldwork by Cultural Sciences students is discussed, based on two pilots in Florence. A tool named ARLearn was developed to support learning in different contexts using the multimedia capabilities and location-based services of smartphones. In the pilots, assignments were given as spoken messages, and students collected notes by recording their own voices and taking pictures of artifacts in Florence.

An automated AR-based annotation tool for indoor navigation for visually impaired people

The 23rd International ACM SIGACCESS Conference on Computers and Accessibility, 2021

People with low vision face many daily encumbrances. Traditional visual enhancements do not suffice to navigate indoor environments or recognize objects efficiently. In this paper, we explore how Augmented Reality (AR) can be leveraged to design mobile applications that improve the visual experience and unburden people with low vision. Specifically, we propose a novel automated AR-based annotation tool for detecting and labeling salient objects for assisted indoor navigation applications like NearbyExplorer. NearbyExplorer, which issues audio descriptions of nearby objects to its users, relies on a database populated by large teams of volunteers and map-a-thons that manually annotate salient objects in the environment, such as desks, chairs, and low overhead ceilings. This has limited widespread and rapid deployment. Our tool builds on advances in automated object detection, AR labeling, and accurate indoor positioning to provide an automated way to upload object labels and user position to a database, requiring just one volunteer. Moreover, it enables people with low vision to detect and notice surrounding objects quickly using smartphones in various indoor environments.
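
The pipeline the abstract describes, detector output plus indoor position uploaded as labeled records, can be sketched as a small data model. This is a hypothetical illustration only: the field names and the confidence threshold are assumptions, not NearbyExplorer's actual schema or API.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ObjectAnnotation:
    """One detected salient object, ready to upload to an annotation
    database (field names are illustrative, not NearbyExplorer's schema)."""
    label: str          # e.g. "chair", as produced by the object detector
    confidence: float   # detector score in [0, 1]
    x: float            # indoor position of the object, metres
    y: float
    floor: int

def to_upload_payload(annotations, volunteer_id, min_confidence=0.5):
    """Serialize confident detections into a JSON payload for upload,
    tagged with the (single) contributing volunteer."""
    return json.dumps({
        "volunteer": volunteer_id,
        "objects": [asdict(a) for a in annotations
                    if a.confidence >= min_confidence],
    })
```

The point of the automation is visible in the shape of the data: each record is produced by the detector and the positioning system, so one volunteer walking the space replaces a map-a-thon of manual annotators.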

Design, implementation and evaluation of audio for a location aware augmented reality game

Proceedings of the 3rd International Conference on Fun and Games, 2010

In this paper, the development and implementation of a rich sound design, reminiscent of console gaming, for a location-aware game, Viking Ghost Hunt (VGH), is presented. The role of audio was assessed with particular attention to its effect on immersion and emotional engagement. Because immersion also involves interaction and the creation of presence (the feeling of being in a particular place), these aspects of the sound design were also investigated. Evaluation of the game was undertaken over a three-day period with the participation of 19 subjects. The results suggest that audio plays an important role in immersing a player within the game space and in engaging them emotionally with the virtual world. However, challenges remain with regard to GPS inaccuracy and unpredictability, as well as device processor constraints, in creating an accurate audio sound field and rendering audio files in real time.
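
The "accurate audio sound field" the abstract refers to amounts to computing, for each geolocated source, how loud and how lateralized it should be given the player's GPS position and heading. A minimal sketch follows, using inverse-distance attenuation and constant-power panning; the function name and parameters are illustrative assumptions, not VGH's actual audio engine.

```python
import math

def source_gains(listener_pos, listener_heading, source_pos,
                 ref_dist=1.0, rolloff=1.0):
    """Left/right channel gains for a geolocated sound source, assuming
    inverse-distance attenuation and constant-power stereo panning.
    Positions are (x, y) in metres; heading is radians, 0 = +y axis."""
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    dist = math.hypot(dx, dy)
    gain = ref_dist / max(dist, ref_dist) ** rolloff   # no boost inside ref_dist
    # bearing of the source relative to the direction the listener faces
    bearing = math.atan2(dx, dy) - listener_heading
    pan = math.sin(bearing)                # -1 = hard left, +1 = hard right
    left = gain * math.cos((pan + 1) * math.pi / 4)
    right = gain * math.sin((pan + 1) * math.pi / 4)
    return left, right
```

The GPS problems the authors report show up directly here: a few metres of position jitter changes both `dist` and `bearing`, so sources audibly jump in level and direction unless the inputs are smoothed.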

Auditory display design for exploration in mobile audio-augmented reality

2011

In this paper, we compare four different auditory displays in a mobile audio-augmented reality environment (a sound garden). The auditory displays varied in the use of non-speech audio, Earcons, as auditory landmarks and 3D audio spatialization, and the goal was to test the user experience of discovery in a purely exploratory environment that included multiple simultaneous sound sources. We present quantitative and qualitative results from an initial user study conducted in the Municipal Gardens of Funchal, Madeira. Results show that spatial audio together with Earcons allowed users to explore multiple simultaneous sources and had the added benefit of increasing the level of immersion in the experience. In addition, spatial audio encouraged a more exploratory and playful response to the environment. An analysis of the participants' logged data suggested that the level of immersion can be related to increased instances of stopping and scanning the environment, which can be quantified in terms of walking speed and head movement.

Keywords: Sound garden · Spatial audio · Auditory displays · Eyes-free interaction · Mobile audio-augmented reality · Exploratory environments

This work was partly presented at the Workshop on Multimodal Location-Based Techniques for Extreme Navigation at Pervasive 2010, Helsinki, Finland.
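
The immersion proxies the abstract names, walking speed and head movement, are straightforward to derive from a participant log. The sketch below assumes a log of (time, x, y, head-angle) samples; the log format and function name are illustrative, not the study's actual analysis code.

```python
import math

def immersion_metrics(log):
    """Average walking speed (m/s) and average head-turn rate (deg/s)
    over a participant log. Each entry is (t_seconds, x_metres,
    y_metres, head_deg); the format is assumed for illustration."""
    dist = turn = 0.0
    for (t0, x0, y0, h0), (t1, x1, y1, h1) in zip(log, log[1:]):
        dist += math.hypot(x1 - x0, y1 - y0)
        d = (h1 - h0 + 180) % 360 - 180    # shortest signed angular difference
        turn += abs(d)
    duration = log[-1][0] - log[0][0]
    return dist / duration, turn / duration
```

Under the paper's interpretation, a lower walking speed combined with a higher head-turn rate would indicate the stop-and-scan behaviour associated with higher immersion.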

An effective approach to develop location-based augmented reality information support

International Journal of Electrical and Computer Engineering (IJECE), 2019

Using location-based augmented reality (AR) for pedestrian navigation can greatly improve user actions and reduce travel time. Pedestrian navigation differs in many ways from the conventional navigation systems used in cars and other vehicles. A major usability issue with using location-based AR for navigation to a specific landmark arises when the active screen is overcrowded with augmented POI markers that overlap each other at the same time. This paper describes the user journey map approach that led to new insights into how users use location-based AR for navigation. These insights led to a deep understanding of the challenges users face when using a location-based AR application for pedestrian navigation and, more generally, helped the development team appreciate the variety of user experiences during the software requirement specification phase. To prove our concept, a prototype of an intuitive location-based AR application was built and compared with an existing standard location-based AR application. The user evaluation results reveal that the functional requirements gathered from the user journey meet the same success rate criteria as the standard location-based AR application. Nevertheless, the field study participants highlighted that the extended features in our prototype could significantly enhance user performance in locating the right object in a particular place compared with the standard location-based AR application (as shown by the time required).
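
A common mitigation for the marker-overcrowding problem the abstract identifies is to merge POI markers whose projected screen positions fall within some minimum gap, then draw each group as a single expandable marker. The greedy sketch below illustrates the idea; the function name, the pixel threshold, and the greedy centroid strategy are assumptions for illustration, not the paper's prototype.

```python
def cluster_pois(screen_pois, min_gap=64):
    """Greedily group projected POI markers whose screen positions lie
    within min_gap pixels of an existing group's centroid, so overlapping
    markers can be drawn as one expandable group.
    screen_pois: list of (name, x_px, y_px)."""
    clusters = []
    for name, x, y in screen_pois:
        for c in clusters:
            cx = sum(p[1] for p in c) / len(c)   # current centroid of group
            cy = sum(p[2] for p in c) / len(c)
            if (x - cx) ** 2 + (y - cy) ** 2 < min_gap ** 2:
                c.append((name, x, y))
                break
        else:
            clusters.append([(name, x, y)])      # start a new group
    return clusters
```

Drawing one tappable marker per returned group keeps the viewfinder legible even when many POIs project to nearly the same screen location.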