Enriched Multimodal Representations of Music Performances: Online Access and Visualization
Abstract
In this paper, we provide a first-person outlook on the technical challenges and developments involved in the recording, analysis, archiving, and cloud-based interchange of multimodal string quartet performance data as part of a collaborative research project on ensemble music making. To facilitate the sharing of our own collection of multimodal recordings, extracted descriptors, and annotations, we developed a hosting platform and data archival protocol through which multimodal data (audio, video, motion capture, descriptor signals) can be stored, visualized, annotated, and selectively retrieved via a web interface and a dedicated API. Through this paper we make a twofold contribution: (a) we open our collection of enriched multimodal datasets, the Quartet Dataset, to the community; and (b) we introduce and enable access to our multimodal data exchange platform, the repoVizz system, through which users can upload recorded data and navigate, play back, or edit existing datasets via a standard Internet browser.
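To give a concrete impression of the kind of selective, programmatic retrieval such a web API is meant to support, the sketch below shows how a client might first request the structure of a multimodal datapack and then download a single descriptor stream restricted to a time segment. This is a minimal illustration only: the base URL, endpoint paths, parameter names, and node identifiers are hypothetical placeholders and do not reflect the documented repoVizz API.

```python
# Illustrative sketch of selective retrieval from a multimodal data platform.
# NOTE: the base URL, endpoints, and field names below are hypothetical
# assumptions for illustration; they are not the actual repoVizz API.
import requests

BASE_URL = "https://repovizz.example.org/api"   # hypothetical endpoint
DATASET_ID = "quartet-exercise-01"              # hypothetical dataset ID


def get_datapack_structure(dataset_id):
    """Fetch the hierarchical structure (datapack) listing all streams in a
    recording: audio, video, motion capture, and descriptor signals."""
    resp = requests.get(f"{BASE_URL}/datapacks/{dataset_id}/structure")
    resp.raise_for_status()
    return resp.json()


def get_descriptor_signal(dataset_id, node_id, start=0.0, end=None):
    """Retrieve one descriptor signal (e.g. bow velocity) for a time segment,
    instead of downloading the full datapack."""
    params = {"node": node_id, "start": start}
    if end is not None:
        params["end"] = end
    resp = requests.get(f"{BASE_URL}/datapacks/{dataset_id}/signal", params=params)
    resp.raise_for_status()
    return resp.json()  # e.g. {"samplerate": 240, "values": [...]}


if __name__ == "__main__":
    structure = get_datapack_structure(DATASET_ID)
    print("Available streams:", [n["name"] for n in structure.get("nodes", [])])
    bow_velocity = get_descriptor_signal(
        DATASET_ID, node_id="violin1/bow_velocity", start=0.0, end=10.0
    )
    print("Samples retrieved:", len(bow_velocity["values"]))
```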
References
- Schedl, M., Gómez, E., and Urbano, J. (2014). Music Information Retrieval: Recent Developments and Applications. Foundations and Trends in Information Retrieval, 8(2-3):127-261.
- Gabrielsson, A. (2003). Music performance research at the millennium. Psychology of Music, 31(3):221-272.
- Godøy, R. I., and Jensenius, A. R. (2009). Body movement in Music Information Retrieval. In 10th International Society for Music Information Retrieval Conference (ISMIR 2009) (pp. 45-50).
- Jaimovich, J., Coghlan, N., and Knapp, R. B. (2012). Emotion in motion: A study of music and affective response. In From Sounds to Music and Emotions (pp. 19-43).
- Jensenius, A., Castagné, N., Camurri, A., Maestre, E., Malloch, J., and McGilvray, D. (2007). A summary of formats for streaming and storing music-related movement and gesture data. In 4th International Conference on Enactive Interfaces 2007 (pp. 125-128).
- De la Torre, F., Hodgins, J., Bargteil, A., Martin, X., Macey, J., Collado, A., and Beltran, P. (2008). Guide to the Carnegie Mellon University Multimodal Activity (CMU-MMAC) database. Robotics Institute, 135.
- Perez-Carrillo, A. (2009). Enhancing spectral synthesis techniques with performance gestures using the violin as a case study. PhD thesis, Department of Information and Communication Technologies, Universitat Pompeu Fabra.
- Maestre, E. (2009). Modeling instrumental gestures: an analysis/synthesis framework for violin bowing. PhD thesis, Department of Information and Communication Technologies, Universitat Pompeu Fabra.
- Marchini, M. (2014). Analysis of Ensemble Expressive Performance in String Quartets: a Statistical and Machine Learning Approach. PhD thesis, Department of Information and Communication Technologies, Universitat Pompeu Fabra.
- Papiotis, P. (2016). A computational approach to studying interdependence in string quartet performance. PhD thesis, Department of Information and Communication Technologies, Universitat Pompeu Fabra.
- Mayor, O., Llimona, Q., Marchini, M., Papiotis, P., and Maestre, E. (2013). repoVizz: a framework for remote storage, browsing, annotation, and exchange of multi-modal data. In 21st ACM International Conference on Multimedia (pp. 415-416).
- Papiotis, P., Marchini, M., Perez-Carrillo, A., and Maestre, E. (2014). Measuring ensemble interdependence in a string quartet through analysis of multidimensional performance data. Frontiers in Psychology, 5:963.
- Marchini, M., Ramirez, R., Papiotis, P., and Maestre, E. (2014). The sense of ensemble: a machine learning approach to expressive performance modelling in string quartets. Journal of New Music Research, 43(3):303-317.
- Schoonderwaldt, E. (2009). Mechanics and acoustics of violin bowing: Freedom, constraints and control in performance. PhD thesis, KTH Royal Institute of Technology.
- Llimona, Q. (2014). Bowing the violin: a case study for auditory-motor pattern modelling in the context of music performance. Bachelor's thesis, Department of Information and Communication Technologies, Universitat Pompeu Fabra.
- Input Devices and Music Interaction Laboratory. Plug-in-Gait Marker Placement. Available at: http://www.idmil.org/mocap/Plug-in-Gait+Marker+Placement.pdf
- Bogdanov, D., Wack, N., Gómez, E., Gulati, S., Herrera, P., Mayor, O., Roma, G., Salamon, J., Zapata, J., and Serra, X. (2013). ESSENTIA: an Audio Analysis Library for Music Information Retrieval. In International Society for Music Information Retrieval Conference (ISMIR 2013) (pp. 493-498).
- Marchini, M., Papiotis, P., Perez-Carrillo, A., and Maestre, E. (2011). A Hair Ribbon Deflection Model for Low-intrusiveness Measurement of Bow Force in Violin Performance. In International Conference on New Interfaces for Musical Expression (NIME 2011) (pp. 481-486).
- Dixon, S. (2005). Live tracking of musical performances using on-line time warping. In 8th International Conference on Digital Audio Effects (pp. 92-97).
- Heimann, M. (1958). Exercises for String Quartet. (H. E. Deckert and F. Marcus, Eds.) (2007 ed.). European String Teachers Association (Denmark branch), ACMP Chamber Music Network. Available at: http://www.acmp.net/media/heimann_exercises.pdf
- Music Technology Group, Universitat Pompeu Fabra (2016). QUARTET dataset. Available at: http://mtg.upf.edu/download/datasets/quartet-dataset