A Close Reading Approach to Gender Narrative Biases in AI-Generated Stories
By: Daniel Raffini, Agnese Macori, Marco Angelini, et al.
Potential Business Impact:
Finds gender bias in AI stories.
The paper presents a study of gender-based narrative biases in stories generated by ChatGPT, Gemini, and Claude. The prompt design draws on Propp's character classifications and Freytag's narrative structure. The stories are analyzed through a close reading approach, with particular attention to adherence to the prompt, gender distribution of characters, physical and psychological descriptions, actions, and finally, plot development and character relationships. The results reveal the persistence of biases, especially implicit ones, in the generated stories and highlight the importance of assessing biases at multiple levels using an interpretative approach.
Similar Papers
Biased Tales: Cultural and Topic Bias in Generating Children's Stories
Computation and Language
AI stories show unfair gender and culture bias.
Examining Multimodal Gender and Content Bias in ChatGPT-4o
Computers and Society
AI blocks sexual images, allows violence, favors boys.
Investigating Gender Bias in LLM-Generated Stories via Psychological Stereotypes
Computation and Language
Finds how stories show gender bias.