Emotional Data in Music Performance: Two Audio Environments for the Emotional Imaging Composer
Winters, R. M., Hattwick, I. & Wanderley, M. M. (2013). Emotional Data in Music Performance: Two Audio Environments for the Emotional Imaging Composer. In: Proceedings of the 3rd International Conference on Music & Emotion (ICME3), Jyväskylä, Finland, 11th–15th June 2013. Geoff Luck & Olivier Brabant (Eds.). University of Jyväskylä, Department of Music.
Technologies capable of automatically sensing and recognizing emotion are becoming increasingly prevalent in performance and compositional practice. Though these technologies are complex and diverse, we present a typology that draws on similarities with computational systems for expressive music performance. This typology provides a framework for presenting results from the development of two audio environments for the Emotional Imaging Composer, a commercial product for real-time arousal/valence recognition that uses signals from the autonomic nervous system. In the first environment, a spectral delay processor for live vocal performance uses the performer's emotional state to interpolate between subspaces of the arousal/valence plane. In the second, a sonification mapping communicates continuous arousal and valence measurements using tempo, loudness, decay, mode, and roughness. Both were informed by empirical research on musical emotion, though differences in desired output schemas manifested in different mapping strategies. ...
Publisher: University of Jyväskylä, Department of Music
Conference: The 3rd International Conference on Music & Emotion, Jyväskylä, Finland, June 11–15, 2013
Is part of publication: Proceedings of the 3rd International Conference on Music & Emotion (ICME3), Jyväskylä, Finland, 11th–15th June 2013. Geoff Luck & Olivier Brabant (Eds.). ISBN 978-951-39-5250-1