Enhancing Speech-to-Speech Dialogue Modeling with End-to-End Retrieval-Augmented Generation
By: Pengchao Feng, Ziyang Ma, Wenxi Chen, and more
Potential Business Impact:
Lets talking computers use outside information.
In recent years, end-to-end speech-to-speech (S2S) dialogue systems have garnered increasing research attention due to their advantages over traditional cascaded systems, including lower latency and more natural integration of nonverbal cues such as emotion and speaker identity. However, these end-to-end systems face key challenges, particularly in incorporating external knowledge, a capability commonly addressed by Retrieval-Augmented Generation (RAG) in text-based large language models (LLMs). The core difficulty lies in the modality gap between the input speech and the retrieved textual knowledge, which hinders effective integration. To address this issue, we propose a novel end-to-end RAG framework that retrieves relevant textual knowledge directly from speech queries, eliminating the need for an intermediate speech-to-text conversion step such as ASR. Experimental results demonstrate that our method significantly improves the performance of end-to-end S2S dialogue systems while achieving higher retrieval efficiency. Although the overall performance still lags behind that of cascaded models, our framework offers a promising direction for enhancing knowledge integration in end-to-end S2S systems. We will release the code and dataset to support reproducibility and promote further research in this area.
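The abstract does not spell out the retrieval mechanism, so the sketch below only illustrates the general idea of cross-modal retrieval under stated assumptions: a speech query encoder whose embeddings share a space with a text passage encoder, and top-k retrieval by cosine similarity. The function names, embedding dimension, and toy knowledge base are hypothetical placeholders for illustration, not the authors' implementation.

```python
# Minimal sketch of direct speech-to-text retrieval, assuming a speech query
# encoder and a text passage encoder trained into one shared embedding space.
# The encoders below are random placeholders, not the paper's released models.
import numpy as np

EMBED_DIM = 256  # assumed embedding size


def encode_speech_query(waveform: np.ndarray) -> np.ndarray:
    """Placeholder speech encoder: maps a raw waveform to a unit-norm query vector."""
    rng = np.random.default_rng(abs(hash(waveform.tobytes())) % (2**32))
    vec = rng.standard_normal(EMBED_DIM)
    return vec / np.linalg.norm(vec)


def encode_passages(passages: list[str]) -> np.ndarray:
    """Placeholder text encoder: one unit-norm vector per knowledge passage."""
    rng = np.random.default_rng(0)
    mat = rng.standard_normal((len(passages), EMBED_DIM))
    return mat / np.linalg.norm(mat, axis=1, keepdims=True)


def retrieve(query_vec: np.ndarray, passage_vecs: np.ndarray, k: int = 3) -> np.ndarray:
    """Return indices of the top-k passages by cosine similarity
    (a plain dot product, since all vectors are unit-normalized)."""
    scores = passage_vecs @ query_vec
    return np.argsort(-scores)[:k]


if __name__ == "__main__":
    knowledge_base = [
        "The Eiffel Tower is 330 metres tall.",
        "Mount Everest is the highest mountain above sea level.",
        "The speed of light is about 299,792 km per second.",
    ]
    passage_vecs = encode_passages(knowledge_base)

    speech_query = np.zeros(16000, dtype=np.float32)  # stand-in for 1 s of audio
    query_vec = encode_speech_query(speech_query)

    for idx in retrieve(query_vec, passage_vecs, k=2):
        print(knowledge_base[idx])
    # In the proposed framework, the retrieved text would then be provided,
    # together with the speech query, as context to the end-to-end S2S model.
```

The key design point the sketch tries to capture is that retrieval operates on the speech embedding itself, so no ASR transcript is produced before the knowledge lookup.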
Similar Papers
Stream RAG: Instant and Accurate Spoken Dialogue Systems with Streaming Tool Usage
Computation and Language
AI answers questions faster by guessing what you'll ask.
A Survey on Knowledge-Oriented Retrieval-Augmented Generation
Computation and Language
Lets computers use outside facts to answer questions.
A Knowledge Graph and a Tripartite Evaluation Framework Make Retrieval-Augmented Generation Scalable and Transparent
Information Retrieval
Chatbots answer questions more accurately and reliably.