Do AI Voices Learn Social Nuances? A Case of Politeness and Speech Rate
By: Eyal Rabin, Zohar Elyoseph, Rotem Israel-Fishelson, and more
Potential Business Impact:
AI voices slow down to sound more polite.
Voice-based artificial intelligence is increasingly expected to adhere to human social conventions, but can it learn implicit cues that are not explicitly programmed? This study investigates whether state-of-the-art text-to-speech systems have internalized the human tendency to reduce speech rate to convey politeness, a non-obvious prosodic marker. We prompted 22 synthetic voices from two leading AI platforms (AI Studio and OpenAI) to read a fixed script under both "polite and formal" and "casual and informal" conditions and measured the resulting speech duration. Across both platforms, the polite prompt produced slower speech than the casual prompt with very large effect sizes; the effect was statistically significant for all of AI Studio's voices and for a large majority of OpenAI's voices. These results demonstrate that AI can implicitly learn and replicate psychological nuances of human communication, highlighting its emerging role as a social actor capable of reinforcing human social norms.
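The core analysis described above — comparing per-voice speech durations between the two prompt conditions and reporting an effect size — can be sketched in a few lines. The paper does not publish its exact measurement code, so the durations below are hypothetical placeholder values, and Cohen's d (a standard "very large effect" measure) stands in for whatever effect-size statistic the authors used:

```python
import statistics

def cohens_d(a: list[float], b: list[float]) -> float:
    """Cohen's d with a pooled standard deviation: (mean_a - mean_b) / sd_pooled."""
    na, nb = len(a), len(b)
    pooled_sd = (((na - 1) * statistics.variance(a) +
                  (nb - 1) * statistics.variance(b)) / (na + nb - 2)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / pooled_sd

# Hypothetical per-voice durations (seconds) for the same fixed script,
# illustrating the study's direction of effect: polite prompts -> slower speech.
polite_durations = [14.2, 13.8, 14.5, 14.0, 13.9, 14.3]
casual_durations = [11.1, 11.4, 10.9, 11.2, 11.0, 11.3]

d = cohens_d(polite_durations, casual_durations)
print(f"Cohen's d (polite vs. casual): {d:.2f}")
```

A positive d here means the polite condition took longer (slower speech rate); by convention, d above roughly 0.8 is considered a large effect.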
Similar Papers
Will AI shape the way we speak? The emerging sociolinguistic influence of synthetic voices
Computers and Society
AI voices change how people talk and think.
Mind Your Tone: Investigating How Prompt Politeness Affects LLM Accuracy (short paper)
Computation and Language
Being rude to AI makes it answer better.
AI Models Exceed Individual Human Accuracy in Predicting Everyday Social Norms
Artificial Intelligence
AI learns right from wrong just by reading.