InforME: Improving Informativeness of Abstractive Text Summarization With Informative Attention Guided by Named Entity Salience
By: Jianbin Shen, Christy Jie Liang, Junyu Xuan
Potential Business Impact:
Makes long texts shorter while keeping the important information.
Abstractive text summarization is integral to the Big Data era, which demands advanced methods to turn voluminous and often long text data into concise yet coherent and informative summaries for efficient human consumption. Despite significant progress, there is still room for improvement in various aspects, one of which is informativeness. Hence, this paper proposes a novel learning approach consisting of two methods: an optimal transport-based informative attention method to improve the learning of focal information in reference summaries, and an accumulative joint entropy reduction method on named entities to enhance informative salience. Experimental results show that our approach achieves better ROUGE scores than prior work on CNN/Daily Mail while remaining competitive on XSum. Human evaluation of informativeness also demonstrates the better performance of our approach over a strong baseline. Further analysis gives insight into the plausible reasons underlying the evaluation results.
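The abstract names an optimal transport-based informative attention method but gives no implementation details. As a purely illustrative sketch (not the paper's actual method), the snippet below computes an entropically regularized optimal transport (Sinkhorn) distance between a model's attention distribution and a named-entity salience distribution over the same tokens; minimizing such a distance as an auxiliary loss is one plausible way to steer attention toward salient entities. All names, distributions, and the cost function here are assumptions for illustration.

```python
import numpy as np

def sinkhorn_distance(p, q, cost, eps=0.1, n_iters=200):
    """Entropic OT distance between distributions p and q given a cost matrix.

    Illustrative only: a standard Sinkhorn fixed-point iteration, not the
    paper's actual informative-attention formulation.
    """
    K = np.exp(-cost / eps)                 # Gibbs kernel from the cost matrix
    u = np.ones_like(p)
    for _ in range(n_iters):                # alternate scaling updates
        v = q / (K.T @ u)
        u = p / (K @ v)
    plan = np.diag(u) @ K @ np.diag(v)      # (approximate) optimal transport plan
    return float(np.sum(plan * cost))       # transport cost under that plan

# Toy example: 4 tokens; the salience distribution (assumed) puts most
# mass on tokens 1 and 3, while the model's attention favors tokens 0 and 2.
attention = np.array([0.4, 0.1, 0.4, 0.1])  # hypothetical attention weights
salience  = np.array([0.1, 0.4, 0.1, 0.4])  # hypothetical entity salience
pos = np.arange(4, dtype=float)
cost = (pos[:, None] - pos[None, :]) ** 2   # squared positional distance

d = sinkhorn_distance(attention, salience, cost)
print(round(d, 3))
```

A training objective could add this distance, weighted by a hyperparameter, to the usual negative log-likelihood loss; when attention already matches the salience distribution, the distance is near zero.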
Similar Papers
Efficient Extractive Text Summarization for Online News Articles Using Machine Learning
Machine Learning (CS)
Makes news articles shorter and easier to read.
AugAbEx : Way Forward for Extractive Case Summarization
Computation and Language
Helps lawyers quickly understand court cases.
An Empirical Comparison of Text Summarization: A Multi-Dimensional Evaluation of Large Language Models
Computation and Language
Finds best AI for summarizing text.