Slopaganda: The interaction between propaganda and generative AI
By: Michał Klincewicz, Mark Alfano, Amir Ebrahimi Fard
Potential Business Impact:
Shows how mass-produced AI-generated 'news' can mislead readers and shape group decision-making.
At least since Francis Bacon, the slogan 'knowledge is power' has been used to capture the relationship between information and group-level decision-making. We know that shaping a group's informational environment is a way to shape its decisions; it is, in effect, a way of making decisions for the group. This paper focuses on strategies that are designed, intentionally, to influence the decision-making capacities of groups by shaping their ability to take advantage of the information in their environment. The best known of these are political rhetoric, propaganda, and misinformation. From among them we single out a relatively new strategy, which we call slopaganda. According to The Guardian, News Corp Australia is currently churning out 3,000 'local' generative AI (GAI) stories each week. In the coming years, such 'generative AI slop' will present multiple knowledge-related (epistemic) challenges. We draw on contemporary research in cognitive science and artificial intelligence to diagnose the problem of slopaganda, describe some recent troubling cases, and then suggest several interventions that may help to counter it.
Similar Papers
Generative Propaganda
Computers and Society
AI creates fake news to change what people think.
AI-Generated Algorithmic Virality
Computers and Society
Finds fake AI videos spreading online.
AI Propaganda factories with language models
Cryptography and Security
Computers can now create fake political messages.