Jon Bellona | University of Oregon
Conference Presentations by Jon Bellona
In this paper, I describe a Max/MSP interface series (Kinect-Via-) for composers wanting to route and map user-tracking data from the Xbox Kinect. The interface series complements four different OpenNI applications, namely OSCeleton, Synapse, Processing's simple-openni library, and Delicode's NIMate. All Max/MSP interfaces communicate using OSC (Open Sound Control) messages and are performance-ready, meaning that all routing and system options may be changed in real time. The Kinect-Via- interfaces offer a tangible solution for anyone wishing to explore user tracking with the Kinect for creative application. The aim of the paper is to discuss features of the four OpenNI applications, to address potential issues and challenges when working with the OpenNI framework, and to outline formative interface issues revolving around video tracking technology.
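The routing layer described above can be sketched in a few lines. This is a minimal, hypothetical illustration of dispatching OSC-style skeleton messages to remappable handlers; the address pattern, joint name, and value ranges are assumptions for demonstration, not the actual namespaces used by OSCeleton, Synapse, simple-openni, or NIMate.

```python
def normalize(value, lo, hi):
    """Clamp and scale a raw coordinate into the 0.0-1.0 range."""
    if hi == lo:
        return 0.0
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

class KinectRouter:
    """Route (address, args) messages to handlers, remappable at run time."""

    def __init__(self):
        self.handlers = {}

    def map(self, address, handler):
        # Mappings can be swapped while running, mirroring the
        # "performance-ready" behaviour described in the paper.
        self.handlers[address] = handler

    def dispatch(self, address, args):
        handler = self.handlers.get(address)
        if handler is not None:
            return handler(*args)
        return None

router = KinectRouter()
# Hypothetical mapping: right-hand height (in millimetres) -> filter cutoff (Hz).
router.map("/joint/r_hand", lambda x, y, z: 200 + normalize(y, 0, 2000) * 4800)
cutoff = router.dispatch("/joint/r_hand", (350, 1000, 1200))  # -> 2600.0
```

Keeping the address-to-handler table mutable is what allows routing to change mid-performance without restarting the patch.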
Papers by Jon Bellona
The 25th International ACM SIGACCESS Conference on Computers and Accessibility, 2023
Exploring the potential of data sonification, our study focuses on enhancing accessibility to oceanographic data for blind and low vision (BLV) students in museums. Five auditory display prototypes with accompanying data sonifications were designed, and two prototypes underwent evaluation by testing with blind and low vision students from two schools for the blind. Through interviews and qualitative analysis, the study assessed the students' learning outcomes and their experience with the auditory display prototypes. The results demonstrate the effective conveyance of oceanography concepts and data through the developed audio display prototypes. Additionally, the study derived several design implications for future oceanographic data sonification design in museums. This research contributes to advancing the accessibility of oceanography education for visually impaired students while providing valuable insights for the design of future auditory displays in informal learning environments (ILEs).
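A common starting point for work like this is parameter-mapping sonification: scaling a data series onto an audible dimension such as pitch. The sketch below maps hypothetical ocean temperature readings onto MIDI notes; the value range and pitch bounds are illustrative assumptions, not the study's actual design.

```python
def map_to_midi(value, data_min, data_max, midi_lo=48, midi_hi=84):
    """Linearly map a data value onto a MIDI note number."""
    span = data_max - data_min
    t = 0.0 if span == 0 else (value - data_min) / span
    return round(midi_lo + t * (midi_hi - midi_lo))

def midi_to_hz(note):
    """Convert a MIDI note number to frequency (A4 = note 69 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)

temps = [4.0, 10.0, 16.0, 22.0]  # hypothetical readings in degrees C
notes = [map_to_midi(t, 4.0, 22.0) for t in temps]  # rising pitch = rising temperature
```

Quantizing to MIDI notes (rather than continuous frequency) is one design choice; continuous pitch preserves more detail but can be harder for listeners to compare across values.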
Proceedings of the 23rd International Conference on Auditory Display - ICAD 2017, 2017
Since people often communicate internal states and intentions through movement, robots can better interact with humans if they too can modify their movements to communicate changing state. These movements, which may be seen as supplementary to those required for workspace tasks, may be termed “expressive.” However, robot hardware, which cannot recreate the same range of dynamics as human limbs, often limits expressive capacity. One solution is to augment expressive robotic movement with expressive sound. To that end, this paper presents an application for synthesizing sounds that match various movement qualities. Its design is based on an empirical study analyzing sound and movement qualities, where movement qualities are parametrized according to Laban’s Effort System. Our results suggest a number of correspondences between movement qualities and sound qualities. These correspondences are presented here and discussed within the context of designing movement-quality-to-sound-quality mappings in our sound synthesis application. This application will be used in future work testing user perceptions of expressive movements with synchronous sounds.
Organised Sound, 2017
Through a musical examination of The Hands (Movement 1) (1986), this article reveals musical developments of Michel Waisvisz’s early performance practice with The Hands. Waisvisz’s digital musical instrument, The Hands, helped shape the body as a recognisable agent of musical structure; this article details how the instrument controls relate to musical gesture and reveals the physical efforts of the composer through spectral analyses of the sound recording and an investigation of a 1987 performance video. The rich context of The Hands – the instrument controls, Waisvisz’s decades-long dedication to the instrument’s potential and, more centrally, the exploration of human effort in performance – has made this pioneering live-performance system for electronic music stand for more than just a musical instrument. Many authors cite Michel Waisvisz’s Hands technology, but few provide musical analyses of Waisvisz’s music. The musical analysis of The Hands (Movement 1) (1986) highlights the art practice of a seminal composer–performer through one of his most important digital musical instruments and signals how the body became an instrument for playing digital electronic music.
Sound Pong is an electronic ensemble composition for four performers using four Wii-motes and four pairs of RecSpecs. The eight-channel work takes an historical look at the gaming experience through the use of modern controllers set inside a classic 8-bit aesthetic. Similar to the early video game Pong, or Robert Rauschenberg’s Open Score, an object is hit between players from in and around a dictated space. The sound field outlines the audience space and, by placing the performers within this space, helps to fuel audience interaction. A game-like interface projects onto the front wall, fusing both audience and performer spaces together, while simultaneously augmenting the audience’s interactive sensory experience.
Proceedings of the 4th International Conference on Movement Computing, 2017
Since people communicate intentions and inner states through movement, robots can better interact with humans if they too can modify their movements to communicate changing state. These movements, which may be seen as supplementary to those required for workspace tasks, may be termed "expressive." However, robot hardware, which cannot recreate the same range of dynamics as human limbs, often limits expressive capacity. One solution is to augment expressive robotic movement with expressive sound. To that end, this paper presents a study to find a qualitative mapping between movement and sound. Musicians were asked to vocalize sounds in response to animations of a simple simulated upper body movement performed with different movement qualities, parametrized according to Laban's Effort System. Qualitative labelling and quantitative signal analysis of these sounds suggest a number of correspondences between movement qualities and sound qualities. These correspondences are ...
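A movement-quality-to-sound-quality mapping of the kind these papers investigate can be sketched as a small function from Effort factors to synthesis parameters. Laban's Effort System places movement along factors such as Time (sustained to sudden) and Weight (light to strong); the specific parameters and coefficients below are invented for illustration and are not the correspondences reported in the studies.

```python
def effort_to_synth(time_factor, weight_factor):
    """Map Effort factors in [-1, 1] to an envelope attack (s) and a gain.

    time_factor:   -1 = sustained ... +1 = sudden
    weight_factor: -1 = light     ... +1 = strong
    """
    # Hypothetical correspondence: sudden movement -> short attack,
    # sustained movement -> long attack.
    attack = 0.5 - 0.45 * (time_factor + 1) / 2
    # Hypothetical correspondence: strong movement -> louder, light -> quieter.
    gain = 0.2 + 0.6 * (weight_factor + 1) / 2
    return {"attack_s": round(attack, 3), "gain": round(gain, 3)}

sudden_strong = effort_to_synth(time_factor=1.0, weight_factor=1.0)
sustained_light = effort_to_synth(time_factor=-1.0, weight_factor=-1.0)
```

Treating each Effort factor as a continuous axis rather than a binary label lets a single mapping cover intermediate movement qualities as well as the extremes.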
Robots that need to operate in human-facing environments will need complex options for modifying their movement to communicate changing state information. To that end, this paper builds on a method for creating variation in robotic movement and presents methods for improving this variation. In particular, a method using optimal control to modify the quality of robotic movements is updated to include generated sounds and a more expressive reference trajectory. The experimental design for finding the qualitative mapping between movement and sound, which leveraged expert musicians, and the experiment results are presented. Based on feedback from experiment participants, an improved method for generating movement variation is proposed. This method generates enhanced variations in robotic movement trajectories using the affinities between Effort and Space as described in Laban/Bartenieff Movement Studies (LBMS) to further improve people’s perception of the qualities of robotic movement.
In this paper, I present a computer program and compositional tool, Sonic Dog Tags, that retrieves biographical information of fallen service members from the Department of Defense RSS feed and maps this information to create memorial music. The aim is to discuss the methods and compositional mapping strategies for creating individual, expressive, musical works. Both information retrieval and mapping strategies are discussed, and an integrative system for future compositions is proposed.
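The core idea of deriving per-individual musical parameters from biographical data can be sketched as follows. The fields and mapping scheme here are assumptions for demonstration only; the paper's actual retrieval and mapping strategies differ.

```python
import hashlib

PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def memorial_parameters(name, age):
    """Derive a tonic pitch class and tempo from biographical data."""
    # A cryptographic hash keeps the name-to-tonic mapping reproducible
    # across runs (unlike Python's built-in hash(), which is salted
    # per process), so each individual always yields the same music.
    digest = hashlib.sha256(name.encode("utf-8")).digest()
    tonic = PITCH_CLASSES[digest[0] % 12]
    # Hypothetical choice: older individuals map to slower tempi,
    # clamped to a playable minimum.
    tempo_bpm = max(40, 120 - age)
    return {"tonic": tonic, "tempo_bpm": tempo_bpm}

params = memorial_parameters("A. Example", 28)
```

Determinism is the key property for a memorial mapping: the same person's data should always produce the same piece, so the music functions as a stable identifier rather than a random arrangement.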
Thesis Chapters by Jon Bellona
Advancements in digital computing have allowed the amplification of the organic body as musical instrument. This dissertation explores human-computer musical interfaces (i.e., alternate controllers) through the view of the body and describes music composition using digital technology through physical movement. The dissertation aims to 1) develop a framework for body–digital technology–sound discourse and 2) describe a collection of software tools, digital musical instruments, and music that demonstrates this framework.
Chapter 1 frames alternate controller digital musical instruments around the body, and Chapter 2 looks at physical movement through its numerical representation inside the computer, discussing how the digital byproducts of movement help shape our musical choices. Chapter 3 describes software tools for aiding digital sound composition on digital musical instruments. Chapter 4 outlines a digital musical instrument composition and performance practice, and Chapter 5 builds from Baruch Spinoza, Shaun Gallagher, and Mark Johnson to dissolve mind and body divides in the act of composition on digital musical instruments. Chapter 6 further explores how the rich palette of body movement can become a musical voice with a report on a new musical instrument, Distance-X. The final chapter revisits a concept central to digital musical instruments, the movement of data, in order to describe how pre-existing digital information can be set into musical motion.
Working from the embrace of physical bodies inside our digital lens can help acknowledge the complex and different stories that make up our musical communities. Physical Composition strives to address the concerns and the capacities of these bodies and endeavors to stand as a model for achieving symbiosis between performer and digital sound. This dissertation seeks to affirm physical resistance in a digital music practice and depict how Physical Composition is both an art of the moving body and an art of composing sound.