Score: 1

Show me the evidence: Evaluating the role of evidence and natural language explanations in AI-supported fact-checking

Published: January 16, 2026 | arXiv ID: 2601.11387v1

By: Greta Warren, Jingyi Sun, Irina Shklovski, and more

Potential Business Impact:

People trust an AI system's fact-checking more when they can inspect the evidence behind its predictions.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Although much research has focused on AI explanations to support decisions in complex information-seeking tasks such as fact-checking, the role of evidence is surprisingly under-researched. In our study, we systematically varied explanation type, AI prediction certainty, and correctness of AI system advice for non-expert participants, who evaluated the veracity of claims and AI system predictions. Participants were given the option to easily inspect the underlying evidence. We found that participants consistently relied on evidence to validate AI claims across all experimental conditions. When participants were presented with natural language explanations, they used evidence less frequently, although they did rely on it when the explanations seemed insufficient or flawed. Qualitative data suggest that participants attempted to infer the reliability of evidence sources, despite source identities being deliberately omitted. Our results demonstrate that evidence is a key ingredient in how people evaluate the reliability of information presented by an AI system and, in combination with natural language explanations, offers valuable support for decision-making. Further research is urgently needed to understand how evidence ought to be presented and how people engage with it in practice.

Country of Origin
πŸ‡©πŸ‡° Denmark

Repos / Data Links

Page Count
22 pages

Category
Computer Science:
Human-Computer Interaction