Improving Regulatory Oversight in Online Content Moderation
By: Benedetta Tessa, Denise Amram, Anna Monreale, and more
Potential Business Impact:
Makes online platforms more transparent and accountable in how they moderate content.
The European Union introduced the Digital Services Act (DSA) to address the risks associated with digital platforms and promote a safer online environment. However, despite the potential of components such as the Transparency Database, Transparency Reports, and Article 40 of the DSA to improve platform transparency, significant challenges remain. These include data inconsistencies and a lack of detailed information, which hinder transparency in content moderation practices. Additionally, the absence of standardized reporting structures makes cross-platform comparisons and broader analyses difficult. To address these issues, we propose two complementary processes: a Transparency Report Cross-Checking Process and a Verification Process. Their goal is to provide both internal and external validation by detecting possible inconsistencies between self-reported and actual platform data, assessing compliance levels, and ultimately enhancing transparency while improving the overall effectiveness of the DSA in ensuring accountability in content moderation. These processes can also benefit policymakers by providing more accurate data for decision-making, independent researchers by supplying more trustworthy data for analysis, and platforms by offering a method for self-assessment that improves compliance and reporting practices.
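To make the cross-checking idea concrete, here is a minimal sketch (not the authors' implementation; the data shapes, category names, and tolerance threshold are assumptions for illustration) of comparing counts a platform self-reports in its Transparency Report against counts derived from DSA Transparency Database records, flagging categories whose relative deviation exceeds a tolerance:

```python
# Hypothetical cross-checking step: compare self-reported moderation counts
# against counts observed in Transparency Database records. Category names
# and the 5% tolerance are illustrative assumptions, not DSA requirements.

from collections import Counter

def cross_check(reported: dict, db_records: list, tolerance: float = 0.05) -> dict:
    """Return {category: (reported, observed, relative_gap)} for mismatches."""
    observed = Counter(rec["category"] for rec in db_records)
    flags = {}
    for category, claimed in reported.items():
        seen = observed.get(category, 0)
        denom = max(claimed, seen, 1)          # avoid division by zero
        gap = abs(claimed - seen) / denom      # relative deviation
        if gap > tolerance:
            flags[category] = (claimed, seen, round(gap, 3))
    return flags

# Toy example: the platform reports 1000 hate-speech removals, but only 700
# matching statements of reasons appear in the database.
report = {"hate_speech": 1000, "spam": 500}
records = [{"category": "hate_speech"}] * 700 + [{"category": "spam"}] * 498
print(cross_check(report, records))  # → {'hate_speech': (1000, 700, 0.3)}
```

A real pipeline would additionally have to reconcile reporting periods, country breakdowns, and category taxonomies across platforms, which is precisely the standardization gap the abstract highlights.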
Similar Papers
A Year of the DSA Transparency Database: What it (Does Not) Reveal About Platform Moderation During the 2024 European Parliament Election
Computers and Society
Social media platforms didn't change how they stop harmful posts.
Research Opportunities and Challenges of the EU's Digital Services Act
Computers and Society
Lets researchers see how big websites work.
The EU Digital Services Act: what does it mean for online advertising and adtech?
Computers and Society
Limits online ads targeting kids and sensitive data.