Design Implications For Technology-Mediated Audience Participation In Live Music
Related papers
Designing Interactive Audience Participation Using Smart Phones in a Musical Performance
In this paper we describe the design and evaluation of an interactive system for audience participation in live performances, using smartphones to control the stereo panorama of the lead guitar. The system was developed through feedback from both spectators and artists. The evaluation was conducted during a live concert and builds on interviews and video analysis. Findings include that musicians seem to be cautious about giving up control, while the audience wants a reasonable amount of control and clear feedback, which in turn can be obtrusive to other spectators. We outline that balancing constraints with affordances is the key to both the audience's and musicians' acceptance of such a system, and that a playful participatory design process can lead to better results in this regard. It is also shown that using smartphones opens up a large possibility space, but at the same time their use has to be subtle so as not to distract too much from the music.
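The paper does not publish its implementation, but a minimal sketch can illustrate the general idea: collect pan values from audience phones, average them, and forward the result to a mixer as an OSC message. Everything below is an assumption for illustration (the python-osc client, the mixer address 192.168.0.10:9000, and the /guitar/pan path), not the authors' system.

```python
# Illustrative sketch only: average audience "pan" votes and forward the
# result to a mixer over OSC. The OSC address and network details are
# assumed, not taken from the paper.
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

client = SimpleUDPClient("192.168.0.10", 9000)    # assumed mixer IP and port

def update_guitar_pan(votes):
    """votes: pan values in [-1.0, 1.0] collected from audience phones."""
    if not votes:
        return
    pan = sum(votes) / len(votes)                 # simple consensus: the mean
    pan = max(-1.0, min(1.0, pan))                # clamp to the stereo field
    client.send_message("/guitar/pan", pan)       # hypothetical OSC address

# Example: one batch of audience inputs received in an update window
update_guitar_pan([-0.4, 0.2, 0.9, -0.1])
```

Averaging is only one possible aggregation rule; a real system would also have to rate-limit updates and smooth the pan value so the result stays musically usable.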
Interaction Design for Live Performance
The role of interactive technology in live performance has increased substantially in recent years. Practices and experiences of existing forms of live performance have been transformed, and new genres of technology-mediated live performance have emerged in response to novel technological opportunities. Consequently, designing for live performance is set to become an increasingly important concern for interaction design researchers and practitioners. However, designing interactive technology for live performance is a challenging activity, as the experiences of both performers and their audiences are shaped and influenced by a number of delicate and interconnected issues, which relate to different forms and individual practices of live performance in varied and often conflicting ways. The research presented in this thesis explores how interaction designers might be better supported in engaging with this intricate and multifaceted design space. This is achieved using a practice-led methodology, which involves the researcher’s participation in both the investigation of, and design response to, issues of live performance as they are embodied in the lived and felt experiences of individual live performers’ practices during three interaction design case studies. This research contributes to the field of interaction design for live performance in three core areas: understandings of the relationships between key issues of live performance and individual performers’ lived and felt experiences are developed; approaches to support interaction designers in engaging individual live performers’ lived and felt experiences in design are proposed; and innovative interfaces and interaction techniques for live performance are designed. It is anticipated that these research outcomes will prove directly applicable or inspiring to the practices of interaction designers wishing to address live performance and will contribute to the ongoing academic discourse around the experience of, and design for, live performance.
TMAP Design Cards for Technology-Mediated Audience Participation in Live Music
2019
Historically, audiences have had various ways to participate in live music performances, including clapping, dancing, swaying, whistling, and singing. More recently, mobile and wireless devices, such as smartphones, have opened up powerful new opportunities for audience participation. However, design for technology-mediated audience participation (TMAP) can be challenging: musicians and audiences have different demands, the coherence of the music imposes demands of its own, and group needs can vary widely. Thus, effective TMAP design requires the balancing of knowledge from diverse perspectives and must take into account the needs of diverse roles in creating and supporting performances. This chapter focuses on the process of creating and evaluating a set of design cards to support the interaction design and evaluation of TMAP systems. The cards are based on a previously created descriptive framework for supporting interaction design and evaluation in this challenging area. We discuss the conception an...
Playful Technology-Mediated Audience Participation in a Live Music Event
Extended Abstracts Publication of the Annual Symposium on Computer-Human Interaction in Play
This paper presents the evaluation of playful technology-mediated audience participation (TMAP) during three music performances in a recent music event. It captures preliminary impressions from a wide range of perspectives and includes critical reflections of music artists, video analysis and qualitative interviews with audience members to cover hypotheses designed to capture both the artists' and the audience's point of view. Results indicate a willingness from both sides to engage in playful TMAP, and a high potential for exploration and playful collaboration within the audience, but the experience is restricted by the artists' need to retain control and by the audience's need for clear instructions, feedback and reliable technical systems.
Large-scale audience participation in live music using smartphones
Journal of New Music Research, 2020
We present a study and reflection on the role and use of smartphone technology for a large-scale musical performance involving audience participation. We evaluated a full design and development process from initial ideation to a final performance concept. We found that the smartphone became the design tool, the technical device and the musical instrument at the same time. As a technical device that uses ultrasound communication as an interaction technique, the smartphone became inspirational for the artist's creative work. In aiming to support the artist, we observed the pervasive importance of retaining artistic control to realise artistic intent. This concerns both the co-design process and the resulting concept of audience participation, and it supports recommendations for such participatory work.
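The abstract names ultrasound communication as the interaction technique but does not specify the protocol. The sketch below only illustrates the underlying idea: encoding a short bit pattern as on-off keying of a near-ultrasonic carrier and writing it to a WAV file. The sample rate, carrier frequency, bit duration and payload are arbitrary assumptions for illustration.

```python
# Minimal illustration of near-ultrasonic signalling (not the paper's system):
# on-off keying of an ~18.5 kHz carrier, written to a WAV file.
import wave
import numpy as np

SAMPLE_RATE = 44100   # standard rate; phone speakers still reproduce ~18-19 kHz
CARRIER_HZ = 18500    # near-ultrasonic: barely audible to most adult listeners
BIT_SECONDS = 0.1     # arbitrary bit duration

def encode_bits(bits):
    """Return a float signal: carrier on for 1-bits, silence for 0-bits."""
    t = np.arange(int(SAMPLE_RATE * BIT_SECONDS)) / SAMPLE_RATE
    tone = 0.5 * np.sin(2 * np.pi * CARRIER_HZ * t)
    silence = np.zeros_like(tone)
    return np.concatenate([tone if b else silence for b in bits])

signal = encode_bits([1, 0, 1, 1, 0, 1, 0, 0])    # arbitrary demo payload

with wave.open("ultrasound_demo.wav", "wb") as wf:
    wf.setnchannels(1)
    wf.setsampwidth(2)                            # 16-bit PCM
    wf.setframerate(SAMPLE_RATE)
    wf.writeframes((signal * 32767).astype(np.int16).tobytes())
```

A production system would use a more robust modulation and error correction; this sketch only shows why near-ultrasound is attractive, since ordinary loudspeakers and phone microphones can carry it without the audience hearing it.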
Interaction, instruments and performance: HCI and the design of future music technologies
2017
Rationale: There has been little chance for researchers, performers and designers in the UK to come together to explore the use and design of new and evolving technologies for performance. This workshop examines the interplay between people, musical instruments, performance and technology. Now, more than ever, technology is enabling us to augment the body, develop new ways to play and perform, and augment existing instruments that can span the physical and digital realms. By bringing together performers, artists, designers and researchers, we aim to develop new understandings of how we might design new performance technologies. Some topics: Methods and Approaches - what are the methods and approaches that we can employ to understand interaction and interplay in performance, and what impact does technology have on this? Sonic Augmentation - can performance and sound change the experiential attributes of places, e.g. make them more accessible, more playful? Physical/digital aug...
Digital Musical Interactions: Performer–system relationships and their perception by spectators
Organised Sound, 2011
This article adopts an ecological view of digital musical interactions, first considering the relationship between performers and digital systems, and then spectators' perception of these interactions. We provide evidence that the relationships between performers and digital music systems are not necessarily instrumental in the same way as they are with acoustic systems, nor should they always strive to be. Furthermore, we report results of a study suggesting that spectators may not perceive such interactions in the same way as performances with musical instruments. We present implications for the design of digital musical interactions, suggesting that designers should embrace the reality that digital systems are malleable and dynamic, and may engage performers and spectators in different modalities, sometimes simultaneously.
2015
With the rise of ubiquitous computing come new possibilities for experiencing audio, visual and tactile media in distributed and situated forms, disrupting modes of media experience that have been relatively stable for decades. We present the Distributed Interactive Audio Devices (DIADs) project, a set of experimental interventions to explore future ubiquitous computing design spaces in which electronic sound is presented as distributed, interactive and portable. The DIAD system is intended for creative sound and music performance and interaction, yet it does not conform to traditional concepts of musical performance, suggesting instead a fusion of music performance and other forms of collaborative digital interaction. We describe the thinking behind the project, the state of the DIAD system’s technical development, and our experiences working with user interaction in lab-based and public performance scenarios.
Interaction Models for Real-time Participatory Musical Performance using Mobile Devices
2017
With the availability of mobile devices, numerous works have emerged that invite the audience to participate in musical performances in real time. In this paper we review a number of different works with the intention of formulating and analysing interaction models used in real-time participatory musical performances. We identify audience participation forms and performance models as two main viewpoints for discussing interaction models. Furthermore, we indicate several potential directions for an engaging and ongoing interactive dialogue. We conclude with an interest in the audience’s direct involvement, combined with a responsive form of interaction, as a direction for further research and development in the field of participatory musical performance using mobile devices.
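As a rough, hypothetical illustration of the two viewpoints the abstract names, a participatory piece could be described by pairing an audience-participation form with a performance model; the category values below are examples for illustration, not the taxonomy from the paper.

```python
# Hypothetical data sketch of the two viewpoints: participation form paired
# with a performance model. Category values are illustrative examples only.
from dataclasses import dataclass

@dataclass
class InteractionModel:
    participation_form: str   # e.g. "voting", "direct sound control", "sensing"
    performance_model: str    # e.g. "conducted", "distributed", "responsive"

# Example: an audience steering a mix parameter in real time
panning_piece = InteractionModel(
    participation_form="direct sound control",
    performance_model="responsive",
)
print(panning_piece)
```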
Reflections on Aspects of Music Interactivity in Performance Situations
Music interactivity is a sub-field of human-computer interaction studies. Interactive situations have different degrees of structural openness and musical “ludicity” or playfulness. Discussing music seems inherently impossible since it is essentially a non-verbal activity. Music can produce an understanding (or at least prepare for an understanding) of creativity that is of an order neither verbal nor written. A human listener might perceive beauty to be of this kind in a particular piece of music. But can machine-generated music be considered creative, and if so, wherein lies the creativity? What are the conceptual limits of notions such as instrument, computer and machine? A work of interactive music might be more pertinently described by the processes involved than by one or several instantiations. While humans spontaneously deal with multiple process descriptions (verbal, visual, kinetic…) and are very good at synthesising, the computer is limited to handling processes describable in a formal language such as computer code. But if the code can be considered a score, does it not make a musician out of the computer? Since the dawn of computer music, composers have created musical systems employing artificial intelligence in different forms as tools for creative stimulus. A large part of music interactivity research concerns interface design, which involves ergonomics and traditional instrument-making concepts. I will show examples of how I work with interactivity in my compositions, from straightforward applications as composition tools to more complex artistic work.