Music Interpretation and Emotion Perception: A Computational and Neurophysiological Investigation
By: Vassilis Lyberatos, Spyridon Kantarelis, Ioanna Zioga, et al.
Potential Business Impact:
Findings on expressive performance could help musicians and music technologies make performances more emotionally engaging for listeners.
This study investigates emotional expression and perception in music performance using computational and neurophysiological methods. It examines how different performance settings (repertoire, diatonic modal etudes, and improvisation) and levels of expressiveness influence performers' emotional communication and listeners' reactions. Professional musicians performed a range of tasks, and emotional annotations were collected from both the performers and the audience. Audio analysis revealed that expressive and improvisational performances exhibited distinctive acoustic features, and emotion analysis showed that they elicited stronger emotional responses than their less expressive counterparts. Neurophysiological measurements indicated greater relaxation during improvisational performances. This multimodal study highlights the significance of expressivity in enhancing emotional communication and audience engagement.
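To give a concrete sense of the kind of audio analysis described above, here is a minimal sketch of extracting a few acoustic descriptors (dynamics, brightness, tempo) that are commonly compared across performance conditions. It assumes the librosa library and a placeholder recording named performance.wav; the specific features and function name are illustrative, not the authors' actual pipeline.

```python
import numpy as np
import librosa  # pip install librosa


def extract_performance_features(path: str) -> dict:
    """Summarize a recording with a few acoustic descriptors
    of the kind compared across performance conditions in
    expressive-performance studies (illustrative, not the study's pipeline)."""
    y, sr = librosa.load(path, sr=22050, mono=True)

    rms = librosa.feature.rms(y=y)[0]                            # frame-wise energy -> dynamics
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)[0]  # spectral centroid -> brightness
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)               # global tempo estimate (BPM)

    return {
        "rms_mean": float(np.mean(rms)),
        "rms_std": float(np.std(rms)),              # variability in loudness
        "centroid_mean": float(np.mean(centroid)),
        "tempo_bpm": float(np.atleast_1d(tempo)[0]),
    }


if __name__ == "__main__":
    # "performance.wav" is a hypothetical file name, not from the study.
    print(extract_performance_features("performance.wav"))
```

Aggregating frame-wise features into means and standard deviations is one simple way to contrast, say, the dynamic variability of expressive versus non-expressive renditions of the same material.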
Similar Papers
Expressive Music Data Processing and Generation
Sound
AI learns to make music sound more human.
Exploring listeners' perceptions of AI-generated and human-composed music for functional emotional applications
Human-Computer Interaction
Listeners prefer AI-generated music, even when they believe it is human-composed.
Exploring the correlation between the type of music and the emotions evoked: A study using subjective questionnaires and EEG
Computer Vision and Pattern Recognition
Music changes your feelings and brain waves.