Computational Fact-Checking of Online Discourse: Scoring scientific accuracy in climate change related news articles
By: Tim Wittenborg, Constantin Sebastian Tremel, Markus Stocker, and more
Potential Business Impact:
Scores the scientific accuracy of online news articles.
Democratic societies need reliable information. Misinformation in popular media such as news articles or videos threatens to impair civic discourse, and citizens are not equipped to verify the flood of content they consume daily at increasing rates. This work aims to semi-automatically quantify the scientific accuracy of online media. By semantifying media of unknown veracity, their statements can be compared against equally processed trusted sources. We implemented a workflow using LLM-based statement extraction and knowledge graph analysis. Our neurosymbolic system demonstrably streamlined state-of-the-art veracity quantification. Evaluated via expert interviews and a user survey, the tool provides a beneficial veracity indication; this indicator, however, cannot yet annotate public media at the required granularity and scale. Further work towards a FAIR (Findable, Accessible, Interoperable, Reusable) ground truth and complementary metrics is required to scientifically support civic discourse.
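The abstract outlines a two-step workflow: extract statements from media of unknown veracity, then compare them against a knowledge graph built from trusted sources. The sketch below illustrates that idea only; the function names, the stand-in extractor (a hard-coded example in place of an actual LLM call), and the toy trusted graph are hypothetical assumptions, not the authors' implementation.

```python
from dataclasses import dataclass


@dataclass
class Statement:
    """A semantified claim in subject-predicate-object form."""
    subject: str
    predicate: str
    obj: str


def extract_statements(text: str) -> list[Statement]:
    """Stand-in for the LLM-based extraction step.

    In the described workflow, an LLM would semantify the article text into
    structured statements; here we return a fixed example claim so the
    sketch stays self-contained and runnable.
    """
    return [Statement("global_mean_temperature", "trend_since_1900", "rising")]


# Hypothetical trusted knowledge graph derived from equally processed
# trusted sources, keyed by (subject, predicate).
TRUSTED_KG = {
    ("global_mean_temperature", "trend_since_1900"): "rising",
    ("arctic_sea_ice_extent", "trend_since_1979"): "declining",
}


def veracity_score(article_text: str) -> float:
    """Fraction of extracted statements that agree with the trusted graph.

    Statements missing from the graph simply do not count as support,
    yielding a rough veracity indication rather than a truth verdict.
    """
    statements = extract_statements(article_text)
    if not statements:
        return 0.0
    supported = sum(
        1
        for s in statements
        if TRUSTED_KG.get((s.subject, s.predicate)) == s.obj
    )
    return supported / len(statements)


if __name__ == "__main__":
    score = veracity_score("Example article text about climate trends.")
    print(f"Veracity indication: {score:.2f}")  # 1.00 for the toy example
```

In the paper's terms, the extractor corresponds to the LLM component and the lookup against TRUSTED_KG to the knowledge graph analysis; a real system would also need provenance and granularity handling, which the abstract identifies as open work.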
Similar Papers
More Than Just Warnings: Exploring the Ways of Communicating Credibility Assessment on Social Media
Human-Computer Interaction
Helps people better spot fake news online.
Profiling News Media for Factuality and Bias Using LLMs and the Fact-Checking Methodology of Human Experts
Computation and Language
Helps tell whether news sources are trustworthy or biased.
SciCom Wiki: Fact-Checking and FAIR Knowledge Distribution for Scientific Videos and Podcasts
Digital Libraries
Checks videos and podcasts for fake news.