Considering Length Diversity in Retrieval-Augmented Summarization
By: Juseon-Do, Jaesung Hwang, Jingun Kwon, and more
Potential Business Impact:
Controls AI summary lengths with far less compute and memory.
This study investigates retrieval-augmented summarization by specifically examining the impact of exemplar summary lengths under length constraints, an aspect not covered by previous work. We propose a Diverse Length-aware Maximal Marginal Relevance (DL-MMR) algorithm to better control summary lengths. This algorithm combines query relevance with diverse target lengths in retrieval-augmented summarization. Unlike previous methods that require exhaustive exemplar-to-exemplar relevance comparisons using MMR, DL-MMR also considers the exemplar target length and avoids comparing exemplars to each other, thereby reducing computational cost and conserving memory during the construction of an exemplar pool. Experimental results showed the effectiveness of DL-MMR, which considers length diversity, compared to the original MMR algorithm. DL-MMR additionally reduced memory use by a factor of 781,513 and computational cost by a factor of 500,092, while maintaining the same level of informativeness.
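For a concrete picture, the minimal sketch below shows one way an MMR-style selection loop could replace the pairwise exemplar-similarity penalty with a term computed from scalar target lengths. The function name, parameters, and scoring rule (`dl_mmr_select`, `lam`, the length-closeness penalty) are illustrative assumptions based only on the abstract, not the paper's actual formulation.

```python
# Hypothetical sketch of MMR-style exemplar selection with a length-diversity
# term. The scoring rule below is an assumption for illustration; the paper's
# actual DL-MMR algorithm may differ.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Exemplar:
    text: str
    summary_len: int  # target summary length (e.g., in tokens)


def dl_mmr_select(
    query: str,
    pool: List[Exemplar],
    sim: Callable[[str, str], float],  # query-exemplar relevance (e.g., cosine similarity)
    k: int = 4,
    lam: float = 0.7,
) -> List[Exemplar]:
    """Greedily pick k exemplars, trading off query relevance against
    diversity of their target summary lengths.

    Unlike classic MMR, the diversity term reads only the scalar lengths of
    already-selected exemplars, so no exemplar-to-exemplar text similarity
    (and no O(n^2) similarity matrix) is needed.
    """
    selected: List[Exemplar] = []
    remaining = list(pool)
    while remaining and len(selected) < k:
        def score(e: Exemplar) -> float:
            relevance = sim(query, e.text)
            if not selected:
                return relevance
            # Redundancy = closeness to the nearest already-covered length.
            closeness = max(
                1.0 / (1.0 + abs(e.summary_len - s.summary_len)) for s in selected
            )
            return lam * relevance - (1.0 - lam) * closeness

        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

Because the diversity term here only looks at the lengths of already-selected exemplars, the loop never builds an exemplar-to-exemplar similarity matrix, which is consistent with the memory and compute savings the abstract reports, though the paper's exact mechanism may differ.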
Similar Papers
Estimating Optimal Context Length for Hybrid Retrieval-augmented Multi-document Summarization
Computation and Language
Helps computers summarize many documents better.
An Empirical Comparison of Text Summarization: A Multi-Dimensional Evaluation of Large Language Models
Computation and Language
Finds the best AI for summarizing text.
Evaluating the Effectiveness and Scalability of LLM-Based Data Augmentation for Retrieval
Information Retrieval
Makes search engines smarter with less effort.