The Malicious Technical Ecosystem: Exposing Limitations in Technical Governance of AI-Generated Non-Consensual Intimate Images of Adults
By: Michelle L. Ding, Harini Suresh
Potential Business Impact:
Shows why current technical safeguards fail to stop AI-generated fake nude images, informing platform policy and content governance.
In this paper, we adopt a survivor-centered approach to locate and dissect the role of sociotechnical AI governance in preventing AI-Generated Non-Consensual Intimate Images (AIG-NCII) of adults, colloquially known as "deepfake pornography." We identify a "malicious technical ecosystem" (MTE) comprising open-source face-swapping models and nearly 200 "nudifying" software programs that allow non-technical users to create AIG-NCII within minutes. Then, using the National Institute of Standards and Technology (NIST) AI 100-4 report as a reflection of current synthetic content governance methods, we show how the current landscape of practices fails to effectively regulate the MTE for adult AIG-NCII and identify the flawed assumptions that explain these gaps.
Similar Papers
Perpetuating Misogyny with Generative AI: How Model Personalization Normalizes Gendered Harm
Computers and Society
Shows how personalized generative AI models normalize gendered harm.
Deepfakes on Demand: the rise of accessible non-consensual deepfake image generators
Computers and Society
Documents the rise of easy-to-use non-consensual deepfake image generators.
Decoding the Black Box: Integrating Moral Imagination with Technical AI Governance
Systems and Control
Integrates moral imagination with technical governance to make AI safer and fairer.