William Marley | University of Limerick

Papers by William Marley

Processing an Instrumental Gesture in the Reactable as a Method of Computer Improvisation

Musical improvisation is one of the most widely practised forms of composition and performance. It is evident in most musical cultures and can be traced back to some of the earliest artistic and religious musical practices. It is a style that incorporates such concepts as change, adjustment, experimentation and ornamentation. Musical improvisation can demand cohesion and participation: a communicative and cooperative relationship between performers. We propose a method of capturing and reshaping an instrumental gesture on the Reactable, with the aim of establishing a computer improvisation system that responds to the performer with variations on their contributed physical gesture. A method of sampling this gesture, specifically the physical rotation of a tangible object, is detailed. We look to techniques used in Computer Aided Algorithmic Composition (CAAC) for a practical approach to real-time data generation from a given sampled gesture. A stochastic procedure is designat...
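The abstract does not specify the stochastic procedure, so as a hedged illustration only, here is a minimal sketch of the general idea it describes: sample the rotation angle of a tangible object over time, then generate a variation by randomly recombining the angle increments observed in the original gesture. All function names and values are hypothetical, not taken from the paper.

```python
import random

def gesture_deltas(angles):
    """Collect the increments between successive sampled rotation angles."""
    return [b - a for a, b in zip(angles, angles[1:])]

def stochastic_variation(start_angle, deltas, length, rng=random):
    """Generate a gesture variation by drawing increments at random
    from those observed in the original rotation gesture."""
    out = [start_angle]
    for _ in range(length - 1):
        out.append(out[-1] + rng.choice(deltas))
    return out

# A performer rotates a tangible object; angles are sampled in degrees.
recorded = [0.0, 4.0, 9.0, 15.0, 18.0, 20.0]
deltas = gesture_deltas(recorded)            # [4.0, 5.0, 6.0, 3.0, 2.0]
variation = stochastic_variation(0.0, deltas, 6)
```

Because the variation reuses only increments that actually occurred, it stays statistically close to the performer's gesture while differing in detail on each replay.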

Tightly Coupled Agents in Live Performance Metacreations

Proceedings of the 2015 ACM SIGCHI Conference on Creativity and Cognition - C&C '15, 2015

We consider how the application of AI in digital musical instruments might maximally support exploration of sound in performance. Live performance applications of AI and machine learning have tended to focus on score following and the development of machine collaborators. In our work we are interested in exploring the development of systems whereby the human performer interacts with a reactive and creative agent in the creation of a single sonic output. The intention is to design systems that foster exploration and allow for greater opportunities for serendipitous musical encounters than acoustic instruments offer. An initial approach to the integration of autonomous agency, based on gesture-reshaping schemes within the Reactable performance system, is first outlined. We then describe a simple platform based on the non-player characters within Pacman, which serves as a test bed for guiding further discussion on what musical machine collaboration at this level may entail. Pilot studies for both systems are outlined.
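The abstract does not say how the Pacman non-player characters are coupled to sound, so the following is purely an illustrative sketch of the kind of tight coupling described: an NPC with simple chase behaviour whose distance from the player modulates a synthesis parameter. The mapping, parameter names and values are all assumptions, not the authors' implementation.

```python
def chase_step(agent, target):
    """Move a Pacman-style NPC one grid step toward the target,
    along whichever axis has the larger remaining distance."""
    ax, ay = agent
    tx, ty = target
    if abs(tx - ax) >= abs(ty - ay):
        ax += (tx > ax) - (tx < ax)
    else:
        ay += (ty > ay) - (ty < ay)
    return (ax, ay)

def distance_to_cutoff(agent, target, base=200.0, scale=150.0):
    """Map the agent-player Manhattan distance to a hypothetical
    filter cutoff in Hz: the closer the agent, the lower the cutoff."""
    d = abs(agent[0] - target[0]) + abs(agent[1] - target[1])
    return base + scale * d
```

The point of such a test bed is that the agent's behaviour is autonomous yet legible: the performer can anticipate and play against the NPC's movement while the shared sonic output remains a single stream.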

Gestroviser: Toward Collaborative Agency in Digital Musical Instruments

This paper describes Gestroviser, a software extension to the Reactable developed to explore musician-machine collaboration at the control-signal level. The system functions by sampling a performer's input, processing or reshaping this sampled input, and then repeatedly replaying it. The degree to which the sampled control signal is processed during replay is adjustable in real time by manipulating a continuous finger slider. The reshaping algorithm uses stochastic methods commonly applied to MIDI note generation from a provided dataset, so the reshaped signal varies in an unpredictable manner. In this way Gestroviser is a device to capture, reshape and replay an instrumental gesture. We describe the results of initial user testing of the system and discuss possible further development.
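The capture-reshape-replay loop with a slider controlling the degree of processing can be sketched as follows. This is a minimal illustration under assumed details, not the Gestroviser implementation: here the "reshaping" blends each replayed sample with a stochastically chosen sample from the same recording, with the slider amount setting the blend.

```python
import random

def reshape(signal, amount, rng=random):
    """Replay a sampled control signal, blending each sample with a
    randomly chosen sample from the same recording.
    amount = 0.0 replays the signal verbatim; 1.0 is fully reshaped."""
    return [(1 - amount) * s + amount * rng.choice(signal) for s in signal]

captured = [0.1, 0.3, 0.6, 0.8, 0.5]   # e.g. normalised rotation values
verbatim = reshape(captured, 0.0)       # identical to the capture
wild = reshape(captured, 1.0)           # fully stochastic resampling
```

Because every output sample is a convex combination of captured values, the reshaped signal always stays within the range the performer originally played, however the slider is set.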
