Biased Tales: Cultural and Topic Bias in Generating Children's Stories
By: Donya Rooein, Vilém Zouhar, Debora Nozza, and more
Potential Business Impact:
AI-generated stories show unfair gender and cultural bias.
Stories play a pivotal role in human communication, shaping beliefs and morals, particularly in children. As parents increasingly rely on large language models (LLMs) to craft bedtime stories, the presence of cultural and gender stereotypes in these narratives raises significant concerns. To address this issue, we present Biased Tales, a comprehensive dataset designed to analyze how biases influence protagonists' attributes and story elements in LLM-generated stories. Our analysis uncovers striking disparities. When the protagonist is described as a girl rather than a boy, appearance-related attributes increase by 55.26%. Stories featuring non-Western children emphasize cultural heritage, tradition, and family themes far more than those featuring Western children. Our findings highlight the importance of addressing sociocultural bias to make creative AI use more equitable and diverse.
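The headline 55.26% figure is a relative increase in appearance-related attribute mentions between girl- and boy-protagonist stories. Below is a minimal sketch of that kind of measurement, assuming a toy keyword lexicon (APPEARANCE_TERMS) and a simple per-story count; the paper's actual attribute annotation scheme is not reproduced here.

```python
# Sketch: count appearance-related attribute mentions in stories grouped by
# the protagonist's described gender, then compute the relative increase.
# APPEARANCE_TERMS is a toy lexicon assumed for illustration only.
APPEARANCE_TERMS = {"beautiful", "pretty", "hair", "dress", "eyes", "sparkling"}

def appearance_rate(stories: list[str]) -> float:
    """Mean number of appearance-related terms per story."""
    counts = [
        sum(tok.strip(".,!?\"'") in APPEARANCE_TERMS for tok in story.lower().split())
        for story in stories
    ]
    return sum(counts) / len(counts)

def relative_increase(girl_stories: list[str], boy_stories: list[str]) -> float:
    """Percent increase in appearance mentions for girl- vs. boy-protagonist stories."""
    girl, boy = appearance_rate(girl_stories), appearance_rate(boy_stories)
    return 100 * (girl - boy) / boy

# Toy usage (illustrative data, not the paper's):
girls = ["The girl with beautiful sparkling eyes wore a red dress."]
boys = ["The boy with bright eyes built a raft and sailed away."]
print(f"{relative_increase(girls, boys):.2f}% more appearance mentions")
```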
Similar Papers
Investigating Gender Bias in LLM-Generated Stories via Psychological Stereotypes
Computation and Language
Shows how AI stories reflect gender stereotypes.
TALES: A Taxonomy and Analysis of Cultural Representations in LLM-generated Stories
Human-Computer Interaction
Finds AI stories often get Indian cultures wrong.
A Close Reading Approach to Gender Narrative Biases in AI-Generated Stories
Human-Computer Interaction
Finds gender bias in AI stories.