Echoes of Humanity: Exploring the Perceived Humanness of AI Music
By: Flavio Figueiredo, Giovanni Martinelli, Henrique Sousa, and others
Potential Business Impact:
Helps determine whether a song was made by AI or by humans.
Recent advances in AI music (AIM) generation services are transforming the music industry. Given these advances, understanding how humans perceive AIM is crucial, both to educate listeners on identifying AIM songs and, conversely, to improve current models. We present results from a listener-focused experiment on how humans perceive AIM. In a blind, Turing-like test, participants were asked to distinguish the AIM song from the human-made song in a pair. Unlike prior studies, we use a randomized controlled crossover trial that controls for pairwise similarity and allows for a causal interpretation. We are also the first study to employ a novel, author-uncontrolled dataset of AIM songs drawn from real-world usage of commercial models (i.e., Suno). We establish that listeners' reliability in distinguishing AIM causally increases when the paired songs are similar. Lastly, we conduct a mixed-methods content analysis of listeners' free-form feedback, revealing that their judgments focus on vocal and technical cues.
Similar Papers
Perception of AI-Generated Music -- The Role of Composer Identity, Personality Traits, Music Preferences, and Perceived Humanness
Human-Computer Interaction
Helps AI understand what people like in music.
Exploring listeners' perceptions of AI-generated and human-composed music for functional emotional applications
Human-Computer Interaction
People can prefer AI music, even when they believe it was composed by humans.
MusicAIR: A Multimodal AI Music Generation Framework Powered by an Algorithm-Driven Core
Sound
Makes songs from just words and pictures.