Content Moderation Futures
By: Lindsay Blackwell
Potential Business Impact:
Improves social media governance by supporting the workers who moderate it.
This study examines the failures and possibilities of contemporary social media governance through the lived experiences of content moderation professionals. Drawing on participatory design workshops with 33 practitioners in both the technology industry and broader civil society, this research identifies significant structural misalignments between corporate incentives and public interests. While experts agree that successful content moderation is principled, consistent, contextual, proactive, transparent, and accountable, technology companies currently fail to achieve these goals, due in part to exploitative labor practices, chronic underinvestment in user safety, and the pressures of global scale. I argue that successful governance is undermined by the pursuit of technological novelty and rapid growth, resulting in platforms that necessarily prioritize innovation and expansion over public trust and safety. To counter this dynamic, I revisit the computational history of care work to motivate present-day solidarity amongst platform governance workers and inspire systemic change.
Similar Papers
Wellbeing-Centered UX: Supporting Content Moderators
Human-Computer Interaction
Helps people who review online posts feel better.
Mapping Community Appeals Systems: Lessons for Community-led Moderation in Multi-Level Governance
Human-Computer Interaction
Helps online groups make fair rules for everyone.
"I thought it was my mistake, but it's really the design'': A Critical Examination of the Accessibility of User-Enacted Moderation Tools on Facebook and X
Human-Computer Interaction
Makes social media moderation tools usable for people who can't see.