Guarding against artificial intelligence-hallucinated citations: the case for full-text reference deposit
By: Alex Glynn
Potential Business Impact:
Keeps AI-invented citations out of journals.
The tendency of generative artificial intelligence (AI) systems to "hallucinate" false information is well-known; AI-generated citations to non-existent sources have made their way into the reference lists of peer-reviewed publications. Here, I propose a solution to this problem, taking inspiration from the Transparency and Openness Promotion (TOP) data sharing guidelines, the clash of generative AI with the American judiciary, and the precedent set by submissions of prior art to the United States Patent and Trademark Office. Journals should require authors to submit the full text of each cited source along with their manuscripts, thereby preventing authors from citing any material whose full text they cannot produce. This solution requires limited additional work on the part of authors or editors while effectively immunizing journals against hallucinated references.
Similar Papers
SemanticCite: Citation Verification with AI-Powered Full-Text Analysis and Evidence-Based Reasoning
Computation and Language
Checks whether research papers correctly cite their sources.
Rethinking Citation of AI Sources in Student-AI Collaboration within HCI Design Education
Human-Computer Interaction
Teaches students how to properly credit AI help.
Hallucinations in Bibliographic Recommendation: Citation Frequency as a Proxy for Training Data Redundancy
Computation and Language
Makes AI remember real papers, not fake ones.