AI Gossip

Published: August 11, 2025 | arXiv ID: 2508.08143v1

By: Joel Krueger, Lucy Osler

Potential Business Impact:

AI can spread rumors, causing new kinds of harm.

Generative AI chatbots like OpenAI's ChatGPT and Google's Gemini routinely make things up. They "hallucinate" historical events and figures, legal cases, academic papers, non-existent tech products and features, biographies, and news articles. Recently, some have argued that these hallucinations are better understood as bullshit. Chatbots produce rich streams of text that look truth-apt without any concern for the truthfulness of what this text says. But can they also gossip? We argue that they can. After some definitions and scene-setting, we focus on a recent example to clarify what AI gossip looks like before considering some distinct harms -- what we call "technosocial harms" -- that follow from it.

Page Count
28 pages

Category
Computer Science:
Computers and Society