InforME: Improving Informativeness of Abstractive Text Summarization With Informative Attention Guided by Named Entity Salience

Published: October 7, 2025 | arXiv ID: 2510.05769v1

By: Jianbin Shen, Christy Jie Liang, Junyu Xuan

Potential Business Impact:

Condenses long documents into concise summaries while preserving the most important information.

Business Areas:
Text Analytics, Data and Analytics, Software

Abstractive text summarization is integral to the Big Data era, which demands advanced methods to turn voluminous and often lengthy text data into concise yet coherent and informative summaries for efficient human consumption. Despite significant progress, there is still room for improvement in various aspects, one of which is informativeness. Hence, this paper proposes a novel learning approach consisting of two methods: an optimal transport-based informative attention method to improve learning of focal information in reference summaries, and an accumulative joint entropy reduction method on named entities to enhance informative salience. Experimental results show that our approach achieves better ROUGE scores than prior work on CNN/Daily Mail while remaining competitive on XSum. A human evaluation of informativeness also demonstrates the superior performance of our approach over a strong baseline. Further analysis gives insight into the plausible reasons underlying the evaluation results.
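The abstract does not spell out the paper's exact formulation, but the general idea of optimal transport-based attention guidance can be illustrated with a minimal sketch: compute an entropic-regularized transport plan (Sinkhorn iterations) between a model's attention distribution over source tokens and a target salience distribution, and use the transport cost as an auxiliary loss. All values and the cost matrix below are hypothetical toy inputs, not the paper's method.

```python
import numpy as np

def sinkhorn(cost, a, b, reg=0.1, n_iters=200):
    """Entropic-regularized optimal transport between distributions a and b.

    Returns the transport plan whose row sums approximate a and whose
    column sums approximate b (standard Sinkhorn-Knopp iterations).
    """
    K = np.exp(-cost / reg)
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

# Toy example: a model's attention over 4 source tokens vs. a
# (hypothetical) entity-salience distribution over the same tokens.
attention = np.array([0.4, 0.3, 0.2, 0.1])
salience = np.array([0.1, 0.2, 0.3, 0.4])

# Simple position-distance cost matrix; a real system would use a
# semantic cost (e.g. embedding distance) instead.
cost = np.abs(np.subtract.outer(np.arange(4), np.arange(4))).astype(float)

plan = sinkhorn(cost, attention, salience)
ot_loss = float((plan * cost).sum())  # auxiliary loss: how far attention
                                      # must "move" to match salience
```

Minimizing `ot_loss` during training would push the attention distribution toward the salience distribution, which is one plausible reading of "informative attention guided by named entity salience."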

Country of Origin
🇦🇺 Australia


Page Count
18 pages

Category
Computer Science:
Computation and Language