Interactive Soundscapes: Developing A Physical Space Augmented Through Dynamic Sound Rendering And Granular Synthesis

EcoSonico: Augmenting Sound and Defining Soundscapes in a Local Interactive Space

Keywords: Augmented Reality, soundscape, reactivision, iPod Touch, Pure Data, Objective-C.

Abstract: In this paper we present the system design of an augmented reality system for the sound-art installation EcoSonico, the sound biodiversity of Mexico (EcoSónico: La biodiversidad sonora en México), a project sponsored by the National Phonoteca of Mexico and the Department of Environment and Natural Resources of Mexico (SEMARNAT). The purpose of the installation is to create an individual sound experience for each user through the selection of several surrounding soundscapes contained on a mobile device (iPod Touch), allowing the user to navigate an augmented sound reality. This is achieved by detecting the user's position and orientation with a fiducial tracking system (reactivision), which reports that position to the mobile device, where a binaural spatialization algorithm for virtual sound objects lets the user identify sounds and navigate within the soundscape. Finally, we present future ideas for interacting with the system online when defining the soundscapes.
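To make the position-to-spatialization step concrete, the sketch below shows how a tracked listener pose (of the kind a fiducial tracker such as reactivision reports) could be turned into per-ear gains for one virtual sound object. It is a simplified, hypothetical stand-in for the installation's Pure Data / Objective-C binaural engine: an equal-power pan law plus 1/r attenuation rather than true HRTF-based binaural rendering, with all names and values chosen only for illustration.

```python
# Minimal sketch, assuming a 2-D tracked listener pose (x, y, heading) and one
# virtual sound object. Equal-power panning stands in for the paper's actual
# binaural processing; this is not the installation's code.
import math

def spatialise(listener_xy, heading_rad, source_xy, ref_dist=1.0):
    """Return (left_gain, right_gain) for one virtual sound object."""
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    distance = max(math.hypot(dx, dy), ref_dist)
    # Angle of the source relative to where the listener faces; with x forward
    # and y to the listener's left, +pi/2 means "directly to the left".
    azimuth = math.atan2(dy, dx) - heading_rad
    pan = math.sin(azimuth)                 # +1 = fully left, -1 = fully right
    left = math.sqrt((1.0 + pan) / 2.0)     # equal-power pan law
    right = math.sqrt((1.0 - pan) / 2.0)
    rolloff = ref_dist / distance           # simple 1/r distance attenuation
    return left * rolloff, right * rolloff

# Listener at the origin facing +x, sound object 2 m away on the left:
print(spatialise((0.0, 0.0), 0.0, (0.0, 2.0)))   # ~ (0.5, 0.0)
```

In practice these gains would be recomputed each time the tracker reports a new pose and applied to the object's audio stream on the device.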

SoundScapeGenerator: soundscape modelling and simulation

Atti del XX CIM ‐ Colloquio di Informatica Musicale ‐ MUSICHE LIQUIDE, 2014

This paper describes SoundScapeGenerator, a generic system for modelling and simulating soundscapes both in real and non-real time. SoundScapeGenerator features algorithms for three-dimensional sound localisation and is able to model complex spaces. The system relies on abstract rule descriptions (generators) to simulate deterministic, “cartoonified” (loosely modelled) sequencing of sound events. It further supports a virtual P.O.H. (point of hearing) and is able to render the final output in various channel configurations. Based on generative algorithms, the SoundScapeGenerator implementation allows real-time user interaction with ever-changing soundscapes, and it can easily communicate with other applications and devices. Finally, we introduce SoundScapeComposer, a higher-level module developed for the SoDA project (dedicated to automatic soundscape generation) that has been built on top of SoundScapeGenerator, hiding most of the latter's details from the user.
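As a rough illustration of what such an abstract rule might look like, the hypothetical sketch below emits time-stamped sound events from two simple rules, one strictly periodic and one loosely modelled, and merges them into a flat event list. The sample names, timing parameters, and merging step are assumptions made for illustration, not SoundScapeGenerator's actual API.

```python
# Minimal sketch of the "generator" idea: each rule yields (onset_time, sample)
# sound events. The real SoundScapeGenerator is far richer (3-D localisation,
# P.O.H., channel layouts); this only contrasts a deterministic rule with a
# loosely modelled ("cartoonified") one.
import random

def periodic_generator(sample, period, count):
    """Deterministic rule: the same sample every `period` seconds."""
    for i in range(count):
        yield (i * period, sample)

def sparse_generator(samples, mean_gap, duration, seed=0):
    """Cartoonified rule: randomly chosen samples with exponential gaps."""
    rng = random.Random(seed)
    t = 0.0
    while t < duration:
        yield (round(t, 2), rng.choice(samples))
        t += rng.expovariate(1.0 / mean_gap)

# Merge two rules into one time-ordered event list (a tiny "soundscape score").
events = sorted(
    list(periodic_generator("church_bell.wav", 10.0, 4)) +
    list(sparse_generator(["bird_a.wav", "bird_b.wav"], 3.0, 30.0))
)
for onset, sample in events:
    print(f"{onset:6.2f}s  {sample}")
```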

Towards a Virtual Soundwalk

Advances in Civil and Industrial Engineering

This chapter presents the debate on the conceptual framework for the virtual soundwalk as a tool for soundscape assessment for use within urban design tasks and the management of urban open spaces. A hybrid model between a soundwalk in situ and a listening test in laboratory conditions is needed to gain benefits from both methods by simulating links between spatial relations and soundscape changes in actual urban open spaces. This link is vital due to the widely accepted architectural theory background on the urban open space experience. A prototype of a virtual soundwalk tool is described. It was used by the authors during laboratory research conducted in 2014 and 2015 and developed further in 2017. The prototype was based on partial virtual reality reconstruction of visual and aural field recordings. Its potential use is illustrated using a case study of the waterfront promenade in the historical centre of Zadar, Croatia. The future prospects for the method described are debated a...

Next generation soundscape design using virtual reality technologies

The Journal of the Acoustical Society of America, 2016

Good quality of life, including good sound quality, has been sought by community members as part of the smart city initiative. While many governments have paid special attention to waste management and air and water pollution, the acoustic environment in cities has been addressed mainly through the control of noise, in particular transportation noise. Governments that care about tranquility in cities rely primarily on setting so-called acceptable noise levels, i.e., mere quantities for compliance and improvement. Sound quality is always ignored. Recently, the ISO released a series of standards on soundscape. However, sound quality is a subjective matter and depends heavily on the perception of humans in different contexts. The study of soundscape and sound quality is not easy. According to the ISO soundscape guideline, people can use sound walks, questionnaire surveys, and even lab tests to determine sound quality during a soundscape design process. With the advance of virtual reality technologies, we believe that current technology enables us to create an application that immerses designers and stakeholders in the community so they can perceive and compare changes in sound quality and provide feedback on different soundscape designs. An app has been developed specifically for this purpose.

AcouRe: Streamlining the generation of immersive soundscapes from monophonic sources

The growing number of curated and crowdsourced online sound databases has been transforming several branches of music, from production to consumption. In this context, the operational basis of sound design practice has been gradually shifting to incorporate repurposing strategies. One of the most pressing matters in this new context is the retrieval and transformation of audio content to meet the creative endeavours of the sound designer. The latter is the focus of our work, which aims to streamline the sound design process in the context of post-production, particularly when adopting large amounts of poorly annotated data. Different methodologies at the crossroads of machine listening and learning (namely, pattern recognition) are adopted to automatically identify sound event components in soundscape recordings so they can be repurposed in the temporal and spatial domains. A prototype interface, named AcouRe, implements the proposed methodologies. A heuristic evaluation to assess the usability of AcouRe was conducted by two expert sound designers.
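For readers unfamiliar with the machine-listening side, the sketch below shows one very basic building block such a pipeline could start from: segmenting a monophonic recording into candidate sound events by frame energy. It is a deliberately simple, hypothetical stand-in; AcouRe's actual detection and pattern-recognition methods are not reproduced here, and the function names, frame sizes, and threshold are assumptions.

```python
# Minimal sketch, assuming a mono signal array and sample rate: frames whose RMS
# energy rises above a threshold (relative to the loudest frame) are grouped
# into candidate event spans. Not AcouRe's actual algorithm.
import numpy as np

def detect_events(signal, sr, frame=2048, hop=512, threshold_db=-30.0):
    """Return (start_s, end_s) spans whose RMS exceeds threshold_db below peak."""
    frames = [signal[i:i + frame] for i in range(0, len(signal) - frame, hop)]
    rms = np.array([np.sqrt(np.mean(f ** 2)) + 1e-12 for f in frames])
    db = 20.0 * np.log10(rms / rms.max())
    active = db > threshold_db
    events, start = [], None
    for i, on in enumerate(active):
        if on and start is None:
            start = i                                   # event begins
        elif not on and start is not None:
            events.append((start * hop / sr, i * hop / sr))
            start = None                                # event ends
    if start is not None:
        events.append((start * hop / sr, len(signal) / sr))
    return events

# Toy example: a short noise burst embedded in near-silence.
sr = 22050
sig = np.zeros(sr)
sig[5000:9000] = np.random.randn(4000) * 0.5
print(detect_events(sig, sr))
```

A real retrieval pipeline would then describe each detected span with spectral features and cluster or classify them before repurposing.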

Sound in Space: On Data Sonification and Urban Soundscape (slides)

2016

Mobile devices have been used in soundscape installations and performances over the past decade or longer, often to emphasize social interaction. Multichannel sonification has been found to successfully represent data describing kinematic phenomena. However, there are few if any examples where these two approaches are combined. The Locust Wrath project has evolved in stages: first, as surround sonifications of climate data for a multimedia dance performance; then, as a frontal-display sound installation and as material in a live performance of 'musical' interactive sonification; and recently, as an audience-participatory work. We developed a system for spatialized sonification of data using a server-client model with iOS devices. In two multimedia performances, the audience members' iPhones were employed ad hoc to constitute a large auditory display. This paper describes the artistic background to the project, outlines its stages, and focuses on the design and implementation of the Locust Wrath client app.
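To illustrate the server-client idea in the abstract above, the hypothetical sketch below shows a server that treats each registered phone as one element of a distributed auditory display and routes each data point to the nearest phone as a simple pitch/amplitude message. The Locust Wrath protocol and mappings are not documented here; the addresses, JSON-over-UDP format, and pitch mapping are assumptions made purely for illustration.

```python
# Minimal sketch of server-side routing for a distributed, spatialised
# sonification: every data point (x, y, value) is sent to the audience phone
# closest to its spatial location. The message format is an assumption.
import json
import socket

clients = {                      # phone address -> assumed position in the room (m)
    ("192.0.2.10", 9000): (0.0, 0.0),
    ("192.0.2.11", 9000): (4.0, 0.0),
    ("192.0.2.12", 9000): (2.0, 3.0),
}

def route(datum, sock):
    """Map one data point to a sonification message and send it to the nearest phone."""
    x, y, value = datum
    nearest = min(clients,
                  key=lambda a: (clients[a][0] - x) ** 2 + (clients[a][1] - y) ** 2)
    msg = {"pitch_hz": 220.0 + 660.0 * value,   # assumed mapping: value 0..1 -> 220..880 Hz
           "amp": 0.2 + 0.8 * value}
    sock.sendto(json.dumps(msg).encode("utf-8"), nearest)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for datum in [(0.5, 0.2, 0.1), (3.8, 0.4, 0.9), (2.1, 2.7, 0.5)]:
    route(datum, sock)
```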

SoundScapes

ACM SIGGRAPH 2007 educators program on - SIGGRAPH '07, 2007

Non-formal learning is evident in an inhabited information space created from non-invasive, multi-dimensional sensor technologies that source human gesture. Libraries of intuitive interfaces empower natural interaction in which gesture is mapped to multisensory content. Large-screen delivery and surround sound present the content for a direct and immediate association between gesture and content response. Participant creative expression and game playing are stimulated toward engaged motivation in therapeutic sessions to optimize participation for both client and facilitator. National and international bodies have consistently recognized SoundScapes, a research body of work that is directly responsible for numerous patents.

Simulating the Soundscape through an Analysis/Resynthesis Methodology

Auditory Display, 2010

This paper presents a graph-based system for the dynamic generation of soundscapes and its implementation in an application that allows for an interactive, real-time exploration of the resulting soundscapes. The application can be used alone, as a pure sonic exploration device, but it can also be integrated into a virtual reality engine. In this way, the soundscape can be acoustically integrated into the exploration of an architectonic/urbanistic landscape. The paper is organized as follows: after reviewing the literature on soundscape, we provide a formal definition of the concept; then a model is introduced; and finally we describe a software application together with a case study.
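To give an intuition for the graph-based approach, the hypothetical sketch below treats vertices as sound materials and weighted edges as possible transitions, so that a weighted random walk over the graph yields the sequence of events to be resynthesised. The example graph, weights, and material names are assumptions for illustration only and do not reproduce the paper's formal model.

```python
# Minimal sketch of a graph-based soundscape model: a weighted walk over a
# transition graph produces the ordered list of sound materials to render.
import random

graph = {                                   # assumed example graph
    "traffic":   [("traffic", 0.6), ("horn", 0.2), ("voices", 0.2)],
    "horn":      [("traffic", 1.0)],
    "voices":    [("voices", 0.5), ("traffic", 0.3), ("footsteps", 0.2)],
    "footsteps": [("voices", 0.7), ("traffic", 0.3)],
}

def walk(graph, start, steps, seed=1):
    """Weighted random walk returning the sequence of sound materials to play."""
    rng = random.Random(seed)
    node, sequence = start, [start]
    for _ in range(steps):
        targets, weights = zip(*graph[node])
        node = rng.choices(targets, weights=weights, k=1)[0]
        sequence.append(node)
    return sequence

print(walk(graph, "traffic", 12))
```

In an interactive setting, the listener's movement through the virtual space would steer which part of the graph is currently being walked.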