A Tale of Two Identities: An Ethical Audit of Human and AI-Crafted Personas
By: Pranav Narayanan Venkit, Jiayi Li, Yingfan Zhou, and more
Potential Business Impact:
AI creates fake people that sound like stereotypes.
As LLMs (large language models) are increasingly used to generate synthetic personas, particularly in data-limited domains such as health, privacy, and HCI, it becomes necessary to understand how these narratives represent identity, especially that of minority communities. In this paper, we audit synthetic personas generated by three LLMs (GPT-4o, Gemini 1.5 Pro, DeepSeek 2.5) through the lens of representational harm, focusing specifically on racial identity. Using a mixed-methods approach combining close reading, lexical analysis, and a parameterized creativity framework, we compare 1,512 LLM-generated personas to human-authored responses. Our findings reveal that LLMs disproportionately foreground racial markers, overproduce culturally coded language, and construct personas that are syntactically elaborate yet narratively reductive. These patterns result in a range of sociotechnical harms, including stereotyping, exoticism, erasure, and benevolent bias, which are often obfuscated by superficially positive narration. We formalize this phenomenon as algorithmic othering, where minoritized identities are rendered hypervisible but less authentic. Based on these findings, we offer design recommendations for narrative-aware evaluation metrics and community-centered validation protocols for synthetic identity generation.
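The lexical-analysis step mentioned in the abstract can be illustrated as a simple marker-frequency comparison between generated and human-authored texts. The marker lexicon, toy persona texts, and `marker_rate` helper below are illustrative assumptions for the sake of a minimal sketch, not the paper's actual instrumentation or word lists.

```python
import re

# Hypothetical lexicon of culturally coded marker terms (illustrative only;
# the paper's actual lexical analysis is not specified in this abstract).
CULTURAL_MARKERS = {"heritage", "ancestral", "traditional", "vibrant", "community"}

def marker_rate(text: str) -> float:
    """Fraction of tokens in `text` that fall in the marker lexicon."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    hits = sum(1 for token in tokens if token in CULTURAL_MARKERS)
    return hits / len(tokens)

# Toy samples standing in for an LLM-generated vs a human-authored persona.
llm_persona = ("Maya cherishes her vibrant heritage and ancestral traditions, "
               "rooted in a close-knit community.")
human_persona = "Maya is a nurse who likes hiking and bad puns."

print(round(marker_rate(llm_persona), 3))    # higher marker density
print(round(marker_rate(human_persona), 3))  # near zero
```

A per-persona rate like this, aggregated over a corpus, gives one crude signal of the "overproduced culturally coded language" pattern the audit reports; the paper's framework additionally uses close reading and a creativity metric that a frequency count alone cannot capture.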
Similar Papers
Algorithmic Fairness in NLP: Persona-Infused LLMs for Human-Centric Hate Speech Detection
Computation and Language
Makes AI better at spotting hate speech fairly.
Misalignment of LLM-Generated Personas with Human Perceptions in Low-Resource Settings
Computers and Society
AI personalities don't understand people like real humans.
Whose Personae? Synthetic Persona Experiments in LLM Research and Pathways to Transparency
Computers and Society
Makes AI understand people better and more fairly.