
Community Notes are Vulnerable to Rater Bias and Manipulation

Published: November 4, 2025 | arXiv ID: 2511.02615v1

By: Bao Tran Truong, Siqi Wu, Alessandro Flammini, and more

Potential Business Impact:

Exposes how rater bias and coordinated manipulation can undermine Community Notes-style moderation, motivating more robust note-ranking safeguards.

Business Areas:
Social News Media and Entertainment

Social media platforms increasingly rely on crowdsourced moderation systems like Community Notes to combat misinformation at scale. However, these systems face challenges from rater bias and potential manipulation, which may undermine their effectiveness. Here we systematically evaluate the Community Notes algorithm using simulated data that models realistic rater and note behaviors, quantifying error rates in publishing helpful versus unhelpful notes. We find that the algorithm suppresses a substantial fraction of genuinely helpful notes and is highly sensitive to rater biases, including polarization and in-group preferences. Moreover, a small minority (5–20%) of bad raters can strategically suppress targeted helpful notes, effectively censoring reliable information. These findings suggest that while community-driven moderation may offer scalability, its vulnerability to bias and manipulation raises concerns about reliability and trustworthiness, highlighting the need for improved mechanisms to safeguard the integrity of crowdsourced fact-checking.
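
The abstract does not spell out the simulation details, but the scoring model it stresses is publicly documented: Community Notes ranks notes with a matrix factorization in which each rating is explained by a global intercept, a rater intercept, a note intercept, and a product of latent "viewpoint" factors, and a note is shown as Helpful when its intercept clears a threshold (roughly 0.40 in the deployed system). The sketch below is not the paper's experiment; it uses assumed parameters (rater counts, bias strength, regularization, ten hypothetical targeted notes) to illustrate the manipulation mechanism, with a simplified one-dimensional factor model.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative parameters (assumptions, not the paper's exact setup) ---
N_NOTES, N_RATERS, RATINGS_PER_RATER = 200, 400, 40
BAD_FRACTION = 0.10   # adversarial raters; the paper reports effects at 5-20%
N_TARGETS = 10        # helpful notes the adversaries try to suppress
THRESHOLD = 0.40      # note-intercept threshold for publishing a note as Helpful
LAM, LR, STEPS = 0.02, 0.1, 1500  # illustrative regularization / optimizer settings

# Synthetic ground truth: each note is genuinely helpful or not, and leans to
# one of two "sides" so raters can show an in-group preference (polarization).
helpful = rng.random(N_NOTES) < 0.5
note_side = rng.choice([-1.0, 1.0], N_NOTES)
rater_side = rng.choice([-1.0, 1.0], N_RATERS)
is_bad = rng.random(N_RATERS) < BAD_FRACTION
targets = set(rng.choice(np.flatnonzero(helpful), size=N_TARGETS, replace=False).tolist())

# Sparse ratings in {0, 1}: honest raters mostly follow true helpfulness with a
# small same-side bump; bad raters seek out the targeted notes and rate them 0.
rows, cols, vals = [], [], []
for u in range(N_RATERS):
    rated = set(rng.choice(N_NOTES, size=RATINGS_PER_RATER, replace=False).tolist())
    if is_bad[u]:
        rated |= targets
    for n in rated:
        if is_bad[u] and n in targets:
            r = 0.0                                          # strategic "Not Helpful"
        else:
            p = (0.97 if helpful[n] else 0.03) + 0.05 * rater_side[u] * note_side[n]
            r = float(rng.random() < np.clip(p, 0.0, 1.0))   # binary Helpful / Not Helpful
        rows.append(u); cols.append(n); vals.append(r)
rows, cols, vals = np.array(rows), np.array(cols), np.array(vals)
cnt_u = np.maximum(np.bincount(rows, minlength=N_RATERS), 1)
cnt_n = np.maximum(np.bincount(cols, minlength=N_NOTES), 1)

# Simplified 1-D matrix-factorization scorer in the spirit of Community Notes:
#   rating_un ≈ mu + i_u + i_n + f_u * f_n
# The note intercept i_n plays the role of the consensus "helpfulness" score,
# while the factors f absorb agreement explained by shared viewpoint.
mu = float(vals.mean())
i_u, i_n = np.zeros(N_RATERS), np.zeros(N_NOTES)
f_u, f_n = rng.normal(0, 0.1, N_RATERS), rng.normal(0, 0.1, N_NOTES)
for _ in range(STEPS):
    err = mu + i_u[rows] + i_n[cols] + f_u[rows] * f_n[cols] - vals
    g_iu = np.zeros(N_RATERS); np.add.at(g_iu, rows, err)
    g_in = np.zeros(N_NOTES);  np.add.at(g_in, cols, err)
    g_fu = np.zeros(N_RATERS); np.add.at(g_fu, rows, err * f_n[cols])
    g_fn = np.zeros(N_NOTES);  np.add.at(g_fn, cols, err * f_u[rows])
    i_u -= LR * (g_iu / cnt_u + LAM * i_u)
    i_n -= LR * (g_in / cnt_n + LAM * i_n)
    f_u -= LR * (g_fu / cnt_u + LAM * f_u)
    f_n -= LR * (g_fn / cnt_n + LAM * f_n)
    mu -= LR * err.mean()

tgt = np.zeros(N_NOTES, dtype=bool); tgt[list(targets)] = True
published = i_n >= THRESHOLD
print("untargeted helpful notes published:", round(float(published[helpful & ~tgt].mean()), 2))
print("targeted helpful notes published:  ", round(float(published[tgt].mean()), 2))
```

With this synthetic calibration, most untargeted helpful notes clear the intercept threshold while the targeted notes' intercepts collapse well below it; the exact fractions depend entirely on the assumed parameters and are only meant to illustrate the suppression mechanism the paper quantifies.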

Country of Origin
🇺🇸 United States

Page Count
30 pages

Category
Computer Science:
Social and Information Networks