On human-AI collaboration in artistic performance
Related papers
Stimulating Creative Partnerships in Human-Agent Musical Interaction
2016
Musical duets are a type of creative partnership with a long history of artistic practice. What can they tell us about creative partnerships between a human and a computer? To explore this question we implemented an activity-based model of duet interaction in software designed to support musical metacreation and investigated the experience of performing with it. The activity-based model allowed for the application of reflexive interactive processes, previously used in dialogic interaction, to a synchronous musical performance context. The experience of improvising with the computational agent was evaluated by expert musicians, who reported that interactions were fun, engaging, and challenging, despite some obvious limitations in the musical sophistication of the software. These findings reinforce the idea that even simple metacreative systems can stimulate creative partnerships and, further, that creative human-machine duet partnerships may well produce, like human-human duet partnerships, more than the sum of their parts.

1. INTRODUCTION

This article investigates human-computer creative partnerships in the context of musical duet performance. We describe a computational music agent, CIM, designed for duet performance with a human musician, and evaluate the effectiveness of the system at stimulating an engaging musical interaction and engendering a sense of human-computer collaboration. We view human-computer creative partnerships involving musical metacreation as a particular kind of human-computer interaction, in which the computer has a degree of agency, and the phenomenological experience of the interaction includes elements of partnership, cooperation, and negotiation [Jones et al. 2012]; this contrasts with instrumental approaches to creativity support systems, where the computer functions as a tool [Shneiderman et al. 2006].

CIM utilises an activity-based model of interaction, where the musical outputs of both performers (human and computer) are categorised into a few discrete activities, according to the relationship between the current output and previous output from either performer. This facilitates description of temporal structure in the performance, representing inter-part and intra-part relationships separately from the representation of surface musical content.

In order to investigate the experience of metacreative musical duets, we collated and compared subjective impressions of experienced human musicians interacting with CIM. We employed a mixed quantitative/qualitative survey instrument probing the performers' experience of interaction. Through analysis of this data we identified aspects of CIM's behaviour that influenced its effectiveness as a musical collaborator, from which we extrapolate to suggest interaction approaches that may foster human-computer creative partnerships more broadly.

The results of the evaluation suggested that the system was effective at stimulating an engaging musical collaboration. Striking a balance between unexpectedness and predictability emerged as a key factor, as might be expected. The combination of a reflexive approach with an activity-based interaction model appears to have been an effective platform for mediating these opposing tendencies, allowing musically meaningful engagement with the system, despite its musical 'knowledge' being quite limited.
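The activity-based model described in this abstract lends itself to a simple classifier. Below is a minimal sketch, assuming hypothetical activity labels and a toy pitch-set similarity measure; the paper does not specify CIM's actual categories, features, or thresholds.

```python
from enum import Enum, auto

class Activity(Enum):
    """Hypothetical activity labels; CIM's actual categories are not given here."""
    INITIATE = auto()   # material unrelated to recent output from either part
    REPEAT = auto()     # closely echoes the partner's recent material
    VARY = auto()       # transforms this performer's own recent material
    CONTRAST = auto()   # partially related to recent material but diverging

def similarity(a, b):
    """Toy pitch-set overlap in [0, 1]; a real system would use richer features."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def classify(current, own_prev, partner_prev, hi=0.7, lo=0.2):
    """Label a phrase by its relation to previous output from either performer."""
    to_partner = similarity(current, partner_prev)
    to_own = similarity(current, own_prev)
    if to_partner >= hi:
        return Activity.REPEAT
    if to_own >= hi:
        return Activity.VARY
    if max(to_own, to_partner) <= lo:
        return Activity.INITIATE
    return Activity.CONTRAST

# e.g. echoing the partner's phrase is classified as REPEAT
print(classify([60, 62, 64], own_prev=[48, 50], partner_prev=[60, 62, 64]))
```

Keeping the activity label separate from the notes themselves is what lets such a system reason about inter-part and intra-part structure without modelling musical surface content in depth.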
A Robot Musician Interacting with a Human Partner through Initiative Exchange
This paper proposes a novel method to realize initiative exchange for a robot. A humanoid robot plays the vibraphone, exchanging initiative with a human performer by perceiving multimodal cues in real time; it recognizes initiative-exchange cues from vision and audio information. To achieve natural initiative exchange between a human and a robot in musical performance, we built the system and software architecture and carried out experiments on the fundamental algorithms necessary for initiative exchange.
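As a rough illustration of the kind of logic involved, here is a minimal sketch of initiative exchange as a two-state machine driven by multimodal cues; the cue names and handover rules are illustrative assumptions, not taken from the paper.

```python
# Two roles the robot can hold during the duet.
LEADER, FOLLOWER = "leader", "follower"

def update_role(role, cues):
    """Return the robot's next role given perceived cues.

    cues is a dict with illustrative boolean keys:
      'partner_phrase_ended' - audio: the human finished a phrase
      'partner_gazing'       - vision: the human seeks eye contact
      'partner_playing'      - audio: the human is currently playing
    """
    if role == FOLLOWER and cues["partner_phrase_ended"] and cues["partner_gazing"]:
        return LEADER       # the human hands the initiative over
    if role == LEADER and cues["partner_playing"] and cues["partner_gazing"]:
        return FOLLOWER     # the human signals they want to lead
    return role

role = update_role(FOLLOWER, {"partner_phrase_ended": True,
                              "partner_gazing": True,
                              "partner_playing": False})   # -> "leader"
```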
Human-AI Musicking: A Framework for Designing AI for Music Co-creativity
Zenodo (CERN European Organization for Nuclear Research), 2023
In this paper, we present a framework for understanding human-AI musicking. This framework prompts a series of questions for reflecting on various aspects of the creative interrelationships between musicians and AI, and thus can be used as a tool for designing creative AI systems for music. AI is increasingly being utilised in sonic arts and music performance, as well as digital musical instrument design. Existing works generally focus on the theoretical and technical considerations needed to design such systems. Our framework adds to this corpus by employing a bottom-up approach; as such, it is built from an embodied and phenomenological perspective. With our framework, we put forward a tool that can be used to design, develop, and deploy creative AI in ways that are meaningful to musicians, from the perspective of musicking (doing music). Following a detailed introduction to the framework, we introduce the four case studies that were used to refine and validate it, namely a breathing guitar, a biosensing director AI, a folk-melody generator, and a real-time co-creative robotic score. Each of these is at a different stage of development, ranging from ideation, through prototyping, into refinement, and finally evaluation. Additionally, each design case presents a distinct mode of interaction along a continuum of human-AI interaction, ranging from creation tool to co-creative agent. We then present reflection points based on our evaluation of using, challenging, and testing the framework with active projects. Our findings warrant future widespread application of this framework in the wild.
Human-Computer Music Performance: From Synchronized Accompaniment to Musical Partner
2013
Live music performance with computers has motivated many research projects in science, engineering, and the arts. In spite of decades of work, it is surprising that there is not more technology for, and a better understanding of, the computer as a music performer. We review the development of techniques for live music performance and outline our efforts to establish a new direction, Human-Computer Music Performance (HCMP), as a framework for a variety of coordinated studies. Our work in this area spans performance analysis, synchronization techniques, and interactive performance systems. Our goal is to enable musicians to incorporate computers into performances easily and effectively through a better understanding of requirements, new techniques, and practical, performance-worthy implementations. We conclude with directions for future work.
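One core ingredient of synchronized accompaniment is a running tempo estimate that the computer part follows. Below is a minimal sketch, assuming beat times arrive from some onset detector; the smoothing scheme and constant are illustrative, not drawn from the HCMP work.

```python
def update_tempo(beat_times, prev_bpm, weight=0.8):
    """Blend the latest inter-beat interval into a running tempo estimate.

    beat_times : list of detected beat times (s), most recent last
    prev_bpm   : current tempo estimate (beats per minute)
    weight     : inertia; higher values resist transient timing noise
    """
    if len(beat_times) < 2:
        return prev_bpm
    ioi = beat_times[-1] - beat_times[-2]     # seconds per beat
    new_bpm = 60.0 / ioi
    return weight * prev_bpm + (1 - weight) * new_bpm

# A slightly long beat (0.52 s instead of 0.5 s) nudges the estimate down.
print(update_tempo([10.0, 10.52], prev_bpm=120.0))   # ~119.1
```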
Ensemble music is a familiar example of collective creativity. However, it remains unclear how the transition from individual creativity (i.e., solo performance) to collective creativity (i.e., ensemble performance) unfolds, and how this process can be simulated. Thus, this study investigated whether a multi-agent system can adequately simulate human ensemble performances using recorded solo performances by skilled professional performers. The original solo performance of a professional human performer was assigned to an agent as an initial value, and a simulation was conducted. Focusing on the timing profile, which is crucial for both expression and coordination during music performances, we compared human music ensemble performances with multi-agent ensemble performances. The main findings are as follows: (1) the multi-agent performance shares many common aspects with the human ensemble performance; (2) at the structural boundaries of the musical piece, where the tempo changes drastically, the difference between the human performance and the multi-agent performance became significant; (3) by making the agents' negotiation strategy similar to that of humans, the similarity between the multi-agent and human performances increased. These results can deepen our understanding of collective creativity, the creation of new musical performances, and human-agent collaboration.
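Timing negotiation of the kind described can be sketched with a standard linear phase-correction rule, in which each agent nudges its next onset toward its partner's; the coupling constant below is a made-up value, not one estimated in the study.

```python
def next_onset(own_prev, own_ioi, partner_prev, alpha=0.25):
    """Schedule an agent's next note onset with linear phase correction.

    own_prev     : time of this agent's previous onset (s)
    own_ioi      : inter-onset interval from the agent's solo timing profile (s)
    partner_prev : time of the partner's corresponding onset (s)
    alpha        : coupling strength; 0 ignores the partner, 1 locks onto them
    """
    asynchrony = own_prev - partner_prev       # positive if this agent was late
    return own_prev + own_ioi - alpha * asynchrony

# An agent that played 20 ms late pulls its next onset forward by alpha * 20 ms.
print(next_onset(own_prev=1.02, own_ioi=0.5, partner_prev=1.00))   # 1.515
```

Varying alpha, or making it asymmetric between agents, is one plain way to model the different negotiation strategies the study compares.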
Harmony in Synthesis: Exploring Human - AI Collaboration in Music
IRJCS:: AM Publications,India, 2024
The nexus between artificial intelligence (AI) and human creativity offers a fascinating paradigm shift in the dynamic field of music composition. To understand the effects on musical composition, production, and performance, this research study, "Harmony in Synthesis: Exploring Human-AI Collaboration in Music," explores the dynamic interplay between human artists and AI systems. The opening sets the scene by describing the development of AI in the music business and emphasizing the revolutionary possibilities of teamwork. Through a thorough literature review, this study surveys current knowledge, pointing out gaps that our research aims to remedy and adding to the conversation on AI's involvement in creative processes.
Robotic Musicianship–Musical Interactions Between Humans and Machines
2007
The Robotic Musicianship project aims to facilitate meaningful musical interactions between humans and machines, leading to novel musical experiences and outcomes. The project combines computational modelling of music perception, interaction, and improvisation with the capacity to produce acoustic responses in physical and visual forms.
Dance and Artificial Intelligence: Using or collaborating
Human+AI collaborative performance conference, 2022
Conference report of the talk 'Human-AI Dance: Hybrid co-creativity in the posthuman era'. The talk challenges previous approaches to the creative collaboration between humans and artificial intelligence, questioning whether we are only using AI to enhance our own creativity rather than performing collaboration between two creative agents. It also reflects on the role of bodily-kinesthetic intelligence and interactive AI design.
How Artificial Intelligence Can Shape Choreography: The Significance of Techno-Performance
Performance Paradigm, 2022
Recent choreographic works using artificial intelligence (AI) have focused on motions designed by AI based on vast amounts of data about human bodily movements. However, complications arise when the emphasis shifts to the technological determinism that manifests in creative processes mutually produced by AI and choreographers or dancers, because the value system of AI engineers diverges from that of the choreographers or dancers who use AI. Emphasising the creative process may clash with the aspirations AI engineers have for the new technology. With these concerns in mind, this essay aims to clarify the creative process shared by the AI engineers and the choreographers, dancers, and techno-performers who use the technology. This study illuminates the development of “Beethoven Complex” (2020), an AI Beethoven program featuring a choreographic performance that utilises an AI-driven automatic music composition system.
Collaborative dance between robot and human
2016
Dance is an inherently embodied activity. The dancer is attuned to the effects of the physical world on her own physicality and to the relationship of her presence to other dancers. This research is an investigation into artificially intelligent performing agents and robots, and how a human dancer can guide the learning and performance of a robot performer. Using Artificial Neural Networks as the basis for the agents' computational intelligence, performing agents were created that can perform by collaborating with human dancers through robots.
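A minimal sketch of the pose-to-motion mapping such an agent might learn, assuming a single-hidden-layer network; the feature dimensions, random weights, and training setup are illustrative assumptions, not taken from the research.

```python
import numpy as np

rng = np.random.default_rng(0)
# Random weights stand in for parameters a trained network would have learned.
W1, b1 = 0.1 * rng.normal(size=(16, 8)), np.zeros(16)   # pose features -> hidden
W2, b2 = 0.1 * rng.normal(size=(4, 16)), np.zeros(4)    # hidden -> joint targets

def respond(pose):
    """Map an 8-dim dancer pose feature vector to 4 robot joint targets."""
    h = np.tanh(W1 @ pose + b1)
    return np.tanh(W2 @ h + b2)     # joint targets scaled to [-1, 1]

joints = respond(rng.normal(size=8))   # one observed pose in, one motion out
print(joints.shape)                    # (4,)
```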