Pierre-Henri Vulliard - Academia.edu
Papers by Pierre-Henri Vulliard
Le Centre pour la Communication Scientifique Directe - HAL - memSIC, 2012
It is an established fact that listening to music induces particular physiological reactions in the listener, and the study of these inductions remains a largely unexplored field. When analyzing the physiological signals measured on a person listening to music, one must define models to know what information to look for in these signals. Conversely, when trying to generate music from sampled physiological signals, one is in fact attempting to create the inverse of the induction that occurs naturally; this requires defining models that make it possible to control, in a coherent way, all the parameters of a generative music system from the few physiological signals available. The notion of emotion, besides seeming well suited to this context, proves to be a very practical pivot for matching musical models with physiological models. In this article we propose an experimental real-time system, based on the paradigm of emotions, for studying the interactions and feedback between music and physiology.
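The analysis direction described in this abstract, extracting emotion-related information from physiological signals, can be illustrated with a minimal sketch. The feature and mapping below are invented for illustration and are not taken from the paper: mean heart rate, derived from inter-beat intervals, is rescaled linearly to an arousal estimate in [0, 1], with 60 and 120 bpm as assumed low- and high-arousal bounds.

```python
def arousal_from_ibi(ibi_seconds):
    """Estimate arousal in [0, 1] from inter-beat intervals (seconds).

    Illustrative assumption: ~60 bpm maps to low arousal, ~120 bpm
    to high arousal; these bounds are not from the paper.
    """
    if not ibi_seconds:
        raise ValueError("need at least one inter-beat interval")
    mean_ibi = sum(ibi_seconds) / len(ibi_seconds)
    bpm = 60.0 / mean_ibi
    # Linear rescale of bpm from [60, 120] to [0, 1], clamped.
    arousal = (bpm - 60.0) / 60.0
    return max(0.0, min(1.0, arousal))

print(arousal_from_ibi([1.0, 1.0, 1.0]))  # 60 bpm -> 0.0
print(arousal_from_ibi([0.5, 0.5]))       # 120 bpm -> 1.0
```

In a full system such a scalar would be only one coordinate of the emotional model; the abstract's point is precisely that several such mappings must be defined before physiological signals can drive a generative music system coherently.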
The augmented ballet project aims at gathering research from several fields and directing it towards a common application case: adding virtual elements (visual and acoustic) to a live dance performance, and allowing the dancer to interact with them. In this paper, we describe a novel interaction used in the frame of this project: the dancer's movements are used to recognize the emotions they express, and these emotions drive the generation of musical audio flows evolving in real time. The originality of this interaction is threefold. First, it covers the whole interaction cycle from the input (the dancer's movements) to the output (the generated music). Second, the interaction is not direct but goes through a high level of abstraction: the dancer's emotional expression is recognized and becomes the source of the music generation. Third, the interaction was designed and validated through constant collaboration with a choreographer, culminating in an augmented ballet performance...
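As a minimal sketch of the movement-analysis side of such a pipeline, the function below computes quantity of motion (QoM), a common low-level feature for expressive movement analysis: the summed displacement of tracked joints between successive frames. The 2D joint-position input format is a hypothetical assumption for illustration, not the project's actual motion-capture representation.

```python
def quantity_of_motion(frames):
    """Sum of joint displacements between successive frames.

    frames: list of frames, each a list of (x, y) joint positions.
    Assumed illustrative format; real systems use richer skeleton data.
    """
    qom = 0.0
    for prev, cur in zip(frames, frames[1:]):
        for (x0, y0), (x1, y1) in zip(prev, cur):
            # Euclidean distance travelled by each joint in one step.
            qom += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return qom

# One joint moving from (0, 0) to (3, 4): displacement 5.
print(quantity_of_motion([[(0, 0)], [(3, 4)]]))  # -> 5.0
```

Features of this kind are typically aggregated over a time window and fed to a classifier that outputs an emotion label, which in turn parameterizes the music generator.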
Our aim is to fill the gap between computer music and musical emotions. We use a 2D emotional space (2D ES) both to analyze the listener's emotions, captured from physiological measures, and to drive a music generator with a Reinforcement Learning algorithm. We present hereafter the Reinforcement Learning algorithm and the different parts of our framework used as input to the algorithm (emotion classification with an Emotiv EEG headset and a Kirlian photography device, improving the machine learning by combining data sources with no correlation between them) and as its output (rhythm and scale/chord generation). Some current and future applications are also briefly described.
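A hedged sketch of the reinforcement-learning loop this abstract describes, reduced to an epsilon-greedy bandit: the agent chooses a musical action (here, a scale), observes the listener's position in the 2D emotional space (valence, arousal), and is rewarded for closeness to a target emotion. The scale set, the scale-to-emotion effects, and the simulated listener are all invented for illustration; they stand in for the paper's actual algorithm and musical parameters, which are not specified here.

```python
import random

# Invented action set and nominal (valence, arousal) effect of each
# scale on a simulated listener; purely illustrative values.
SCALES = ["major", "minor", "phrygian"]
EFFECT = {"major": (0.6, 0.4), "minor": (-0.5, -0.2), "phrygian": (-0.3, 0.6)}

def reward(observed, target):
    # Negative Euclidean distance in the 2D emotional space.
    return -(((observed[0] - target[0]) ** 2 +
              (observed[1] - target[1]) ** 2) ** 0.5)

def run_bandit(target, steps=500, epsilon=0.1, seed=0):
    """Epsilon-greedy bandit; returns the best-valued scale after training."""
    rng = random.Random(seed)
    q = {s: 0.0 for s in SCALES}   # estimated value of each action
    n = {s: 0 for s in SCALES}     # pull counts
    for _ in range(steps):
        if rng.random() < epsilon:
            a = rng.choice(SCALES)           # explore
        else:
            a = max(q, key=q.get)            # exploit
        # Simulated listener response: nominal effect plus Gaussian noise.
        obs = (EFFECT[a][0] + rng.gauss(0, 0.1),
               EFFECT[a][1] + rng.gauss(0, 0.1))
        n[a] += 1
        q[a] += (reward(obs, target) - q[a]) / n[a]  # incremental mean
    return max(q, key=q.get)

# A positive-valence, positive-arousal target should favour the action
# whose simulated effect lies closest to it in the 2D ES.
print(run_bandit(target=(0.7, 0.5)))
```

In the real framework the observation would come from the physiological classifiers (EEG, etc.) rather than a simulator, and the action space would cover the rhythm and scale/chord parameters of the generator.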
Guide to Brain-Computer Music Interfacing, 2014
It is a well-known fact that listening to music produces particular physiological reactions in the listener, and the study of these relationships remains a largely unexplored field. When analyzing physiological signals measured on a person listening to music, one must first define models to know what information can be observed in these signals. Conversely, generating music from physiological data is in fact an attempt to create the inverse of the relationship that occurs naturally; to do so, one must also define models enabling coherent control of all the parameters of a generative music system from the few physiological signals available. The notion of emotion, besides looking particularly appropriate in this context, proves to be a central concept articulating musical and physiological models. In this article we propose an experimental real-time system, based on the paradigm of emotions, for studying the interactions and feedback between music and physiology.
Le Centre pour la Communication Scientifique Directe - HAL - Diderot, Jul 8, 2021