From Joy to Fear: A Benchmark of Emotion Estimation in Pop Song Lyrics
By: Shay Dahary, Avi Edana, Alexander Apartsin, et al.
Potential Business Impact:
Helps computers understand the emotions expressed in song lyrics.
The emotional content of song lyrics plays a pivotal role in shaping listener experiences and influencing musical preferences. This paper investigates the task of multi-label emotional attribution of song lyrics by predicting six emotional intensity scores corresponding to six fundamental emotions. A manually labeled dataset is constructed using a mean opinion score (MOS) approach, which aggregates annotations from multiple human raters to ensure reliable ground-truth labels. Leveraging this dataset, we conduct a comprehensive evaluation of several publicly available large language models (LLMs) under zero-shot scenarios. Additionally, we fine-tune a BERT-based model specifically for predicting multi-label emotion scores. Experimental results reveal the relative strengths and limitations of zero-shot and fine-tuned models in capturing the nuanced emotional content of lyrics. Our findings highlight the potential of LLMs for emotion recognition in creative texts, providing insights into model selection strategies for emotion-based music information retrieval applications. The labeled dataset is available at https://github.com/LLM-HITCS25S/LyricsEmotionAttribution.
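To make the setup concrete, below is a minimal sketch (not the authors' released code) of the two ingredients the abstract describes: aggregating multiple raters' annotations into mean opinion score (MOS) labels, and fine-tuning a BERT-based model to regress six emotion intensity scores per lyric. The emotion label set, rating scale, and bert-base-uncased backbone are illustrative assumptions; the zero-shot LLM comparison is not shown here.

```python
# Minimal sketch of MOS label construction and a BERT-based multi-label
# emotion-intensity regressor. Label names, score range, and backbone are
# assumptions for illustration, not the authors' exact configuration.

import numpy as np
import torch
from torch import nn
from transformers import AutoTokenizer, AutoModel

EMOTIONS = ["joy", "sadness", "anger", "fear", "surprise", "disgust"]  # assumed label set

def mean_opinion_scores(ratings: np.ndarray) -> np.ndarray:
    """Average annotations of shape (num_raters, 6) into one MOS vector."""
    return ratings.mean(axis=0)

class LyricsEmotionRegressor(nn.Module):
    """BERT encoder with a 6-way regression head over emotion intensities."""
    def __init__(self, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        self.head = nn.Linear(self.encoder.config.hidden_size, len(EMOTIONS))

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]   # [CLS] token representation
        return self.head(cls)               # one intensity score per emotion

if __name__ == "__main__":
    # Toy example: three raters score one lyric on a 1-5 scale for six emotions.
    ratings = np.array([[4, 1, 1, 2, 3, 1],
                        [5, 1, 2, 2, 3, 1],
                        [4, 2, 1, 1, 2, 1]], dtype=float)
    target = mean_opinion_scores(ratings)    # ground-truth MOS label vector

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = LyricsEmotionRegressor()
    batch = tokenizer(["We were dancing all night under neon skies"],
                      return_tensors="pt", truncation=True, padding=True)
    preds = model(batch["input_ids"], batch["attention_mask"])

    # Standard regression training step: mean-squared error against the MOS targets.
    loss = nn.functional.mse_loss(
        preds, torch.tensor(target, dtype=torch.float32).unsqueeze(0))
    loss.backward()
    print({e: round(float(p), 2) for e, p in zip(EMOTIONS, preds[0])})
```

In this kind of setup, the MOS averaging gives continuous targets rather than one-hot labels, which is why a regression head and MSE-style loss are a natural fit for multi-label emotion intensity prediction.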
Similar Papers
A Study on the Data Distribution Gap in Music Emotion Recognition
Sound
Helps computers recognize the emotions in music more reliably.
GlobalMood: A cross-cultural benchmark for music emotion recognition
Information Retrieval
Helps computers understand musical emotions across cultures.
SemEval-2025 Task 11: Bridging the Gap in Text-Based Emotion Detection
Computation and Language
Helps computers understand feelings in many languages.