Exploring relationships between effort, motion, and sound in new musical instruments
Erdem, Çağrı; Lan, Qichao; Jensenius, Alexander Refsum (2020). Exploring relationships between effort, motion, and sound in new musical instruments. Human Technology, 16 (3), 310-347. DOI: 10.17011/ht/urn.202011256767
Published in series: Human Technology: An Interdisciplinary Journal on Humans in ICT Environments
Date: 2020
Copyright: ©2020 Çağrı Erdem, Qichao Lan, & Alexander Refsum Jensenius, and the Open Science Centre, University of Jyväskylä
We investigated how the action–sound relationships found in electric guitar
performance can be used in the design of new instruments. Thirty-one trained guitarists
performed a set of basic sound-producing actions (impulsive, sustained, and iterative) and
free improvisations on an electric guitar. We performed a statistical analysis of the muscle
activation data (EMG) and audio recordings from the experiment. Then we trained a long
short-term memory network with nine different configurations to map EMG signals to sound.
We found that the preliminary models were able to predict audio energy features of free
improvisations on the guitar, based on the dataset of raw EMG from the basic sound-producing actions. The results provide evidence of similarities between body motion and
sound in music performance, compatible with embodied music cognition theories. They also
show the potential of using machine learning on recorded performance data in the design of
new musical instruments.
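
The abstract describes a sequence model that maps raw EMG to audio energy features. No code accompanies this record, so the following is only a minimal PyTorch sketch of that kind of EMG-to-sound mapping; the channel count, hidden size, window length, single energy output, and MSE objective are illustrative assumptions, not any of the authors' nine configurations.

    # Hypothetical sketch: LSTM mapping raw EMG windows to an audio
    # energy envelope. All hyperparameters below are assumptions made
    # for illustration, not the configuration used in the paper.
    import torch
    import torch.nn as nn

    class EMGToAudioLSTM(nn.Module):
        def __init__(self, n_emg_channels=4, hidden_size=64, n_audio_features=1):
            super().__init__()
            # The LSTM reads a window of raw EMG samples step by step.
            self.lstm = nn.LSTM(n_emg_channels, hidden_size, batch_first=True)
            # A linear head maps each hidden state to the predicted
            # audio feature, e.g. an RMS energy value per time step.
            self.head = nn.Linear(hidden_size, n_audio_features)

        def forward(self, emg):  # emg: (batch, time, channels)
            out, _ = self.lstm(emg)
            return self.head(out)  # (batch, time, n_audio_features)

    model = EMGToAudioLSTM()
    emg_window = torch.randn(8, 100, 4)   # batch of 100-sample EMG windows
    pred_energy = model(emg_window)       # predicted energy envelope
    target = torch.randn(8, 100, 1)       # placeholder audio energy targets
    loss = nn.functional.mse_loss(pred_energy, target)

Training such a model on windows from the basic sound-producing actions and then running it on EMG from free improvisation would correspond to the generalization test the abstract reports.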
Publisher: Jyväskylän Yliopisto
ISSN: 1795-6889
Collections: Human technology
Similar items
Showing items with a similar title or subject keywords.
- Relationships Between Audio and Movement Features, and Perceived Emotions in Musical Performance
  Thompson, Marc R.; Mendoza, Juan Ignacio; Luck, Geoff; Vuoskoski, Jonna K. (SAGE Publications, 2023). A core aspect of musical performance is communicating emotional and expressive intentions to the audience. Recognition of the musician's intentions is constructed from a combination of visual and auditory performance cues, ...
- Exploring relationships between audio features and emotion in music
  Laurier, Cyril; Lartillot, Olivier; Eerola, Tuomas; Toiviainen, Petri (2009). In this paper, we present an analysis of the associations between emotion categories and audio features automatically extracted from raw audio data. This work is based on 110 excerpts from film soundtracks evaluated by 116 ...
- Dance to your own drum: identification of musical genre and individual dancer from motion capture using machine learning
  Carlson, Emily; Saari, Pasi; Burger, Birgitta; Toiviainen, Petri (Routledge, 2020). Machine learning has been used to accurately classify musical genre using features derived from audio signals. Musical genre, as well as lower-level audio features of music, have also been shown to influence music-induced ...
- The Relation Between Emotional Valence and Performance Motion of the Keyboard Instrument
  Mito, Yuki; Kawakami, Hiroshi; Miura, Masanobu; Shinoda, Yukitaka (University of Jyväskylä, Department of Music, 2013). This study examined relations of the emotion and part of the upper body during the keyboard instrument performance. The study showed the trace from the lateral direction to watch movement. As a result, there was a difference ...
Unless otherwise stated, publicly available JYX metadata (excluding abstracts) may be freely reused under the CC0 license.