FedRAG: A Framework for Fine-Tuning Retrieval-Augmented Generation Systems
By: Val Andrei Fajardo, David B. Emerson, Amandeep Singh, and more
Potential Business Impact:
Helps AI systems draw on external information more effectively.
Retrieval-augmented generation (RAG) systems have been shown to be effective in addressing many of the drawbacks of relying solely on the parametric memory of large language models. Recent work has demonstrated that RAG systems can be improved via fine-tuning of their retriever and generator models. In this work, we introduce FedRAG, a framework for fine-tuning RAG systems across centralized and federated architectures. FedRAG supports state-of-the-art fine-tuning methods, offering a simple and intuitive interface and a seamless conversion from centralized to federated training tasks. FedRAG is also deeply integrated with the modern RAG ecosystem, filling a critical gap in available tools.
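To make the "seamless conversion from centralized to federated training" concrete, the sketch below illustrates the general idea behind that conversion using federated averaging (FedAvg). This is not the FedRAG API; the function names and the use of plain Python lists as stand-in model weights are illustrative assumptions only.

```python
# Illustrative sketch only -- NOT the FedRAG API. It shows the general pattern
# behind turning one centralized fine-tuning step into a federated round:
# each client runs the same local step, then a server averages the results.

def local_train_step(weights, grads, lr=0.1):
    """One centralized fine-tuning step: plain gradient descent on local data."""
    return [w - lr * g for w, g in zip(weights, grads)]


def federated_round(global_weights, client_grads, lr=0.1):
    """One federated round (FedAvg): every client applies the centralized
    step to its own gradients, then the server averages client weights."""
    client_weights = [
        local_train_step(global_weights, g, lr) for g in client_grads
    ]
    n = len(client_weights)
    return [sum(ws) / n for ws in zip(*client_weights)]


if __name__ == "__main__":
    w = [1.0, 2.0]
    # Two hypothetical clients with different local gradients
    # (e.g., retriever/generator updates from different local corpora).
    grads = [[0.5, -0.5], [1.5, 0.5]]
    print(federated_round(w, grads))
```

The point of the pattern is that `local_train_step` is unchanged between the centralized and federated settings; only the outer loop differs, which is the kind of reuse the abstract attributes to FedRAG's task conversion.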
Similar Papers
FlexRAG: A Flexible and Comprehensive Framework for Retrieval-Augmented Generation
Computation and Language
Builds smarter AI that learns from more information.
Privacy-Preserving Federated Embedding Learning for Localized Retrieval-Augmented Generation
Computation and Language
Keeps private info safe while AI learns.
Retrieval-Augmented Generation: A Comprehensive Survey of Architectures, Enhancements, and Robustness Frontiers
Information Retrieval
Helps computers answer questions with real-world facts.