RIGOURATE: Quantifying Scientific Exaggeration with Evidence-Aligned Claim Evaluation
By: Joseph James, Chenghao Xiao, Yucheng Li, et al.
Potential Business Impact:
Detects overstated claims in research papers.
Scientific rigour tends to be sidelined in favour of bold statements, leading authors to overstate claims beyond what their results support. We present RIGOURATE, a two-stage multimodal framework that retrieves supporting evidence from a paper's body and assigns each claim an overstatement score. The framework is built on a dataset of over 10K claim-evidence sets from ICLR and NeurIPS papers, annotated using eight LLMs, with overstatement scores calibrated against peer-review comments and validated through human evaluation. It employs a fine-tuned reranker for evidence retrieval and a fine-tuned model to predict overstatement scores with justification. Compared to strong baselines, RIGOURATE improves both evidence retrieval and overstatement detection. Overall, our work operationalises evidential proportionality and supports clearer, more transparent scientific communication.
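The two-stage pipeline described above can be sketched as follows. This is a minimal illustrative toy, not the authors' implementation: it substitutes bag-of-words cosine similarity for the fine-tuned reranker, and a simple lexical heuristic (boosters raise the score, hedges and strong evidence support lower it) for the fine-tuned scoring model. All function names, word lists, and weights here are invented for illustration.

```python
from collections import Counter
import math

def bow(text: str) -> Counter:
    # Bag-of-words vector for a text (illustrative stand-in for a learned encoder).
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two bag-of-words vectors.
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def retrieve_evidence(claim: str, passages: list[str], k: int = 2) -> list[str]:
    # Stage 1: rank body passages by similarity to the claim, keep the top-k.
    # (The paper uses a fine-tuned reranker; cosine over word counts is a toy proxy.)
    ranked = sorted(passages, key=lambda p: cosine(bow(claim), bow(p)), reverse=True)
    return ranked[:k]

# Hypothetical cue lexicons for the toy scorer.
HEDGES = {"may", "might", "suggests", "could"}
BOOSTERS = {"proves", "guarantees", "always", "revolutionary"}

def overstatement_score(claim: str, evidence: list[str]) -> float:
    # Stage 2 (toy heuristic): boosters in the claim push the score up,
    # hedging language and strong lexical support from evidence pull it down.
    tokens = set(claim.lower().split())
    boost = len(tokens & BOOSTERS)
    hedge = len(tokens & HEDGES)
    support = max((cosine(bow(claim), bow(e)) for e in evidence), default=0.0)
    raw = 0.5 + 0.2 * boost - 0.2 * hedge - 0.3 * support
    return min(1.0, max(0.0, raw))  # clamp to [0, 1]
```

As a usage sketch, a claim laden with boosters ("always proves") receives a higher score than a hedged claim ("may improve") given the same retrieved evidence, mirroring the intuition that the score measures how far a claim outruns its support.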
Similar Papers
RAVE: Retrieval and Scoring Aware Verifiable Claim Detection
Computation and Language
Finds fake news faster online.
Factuality and Transparency Are All RAG Needs! Self-Explaining Contrastive Evidence Re-ranking
Computation and Language
Helps computers find true facts and explain why.
VeriCite: Towards Reliable Citations in Retrieval-Augmented Generation via Rigorous Verification
Information Retrieval
Makes AI answers more truthful with proof.