SegNSP: Revisiting Next Sentence Prediction for Linear Text Segmentation
By: José Isidro, Filipe Cunha, Purificação Silvano, and more
Potential Business Impact:
Helps computers understand where one topic ends and another begins.
Linear text segmentation is a long-standing problem in natural language processing (NLP), focused on dividing continuous text into coherent and semantically meaningful units. Despite its importance, the task remains challenging due to the complexity of defining topic boundaries, the variability in discourse structure, and the need to balance local coherence with global context. These difficulties hinder downstream applications such as summarization, information retrieval, and question answering. In this work, we introduce SegNSP, framing linear text segmentation as a next sentence prediction (NSP) task. Although NSP has largely been abandoned in modern pre-training, its explicit modeling of sentence-to-sentence continuity makes it a natural fit for detecting topic boundaries. We propose a label-agnostic NSP approach, which predicts whether the next sentence continues the current topic without requiring explicit topic labels, and enhance it with a segmentation-aware loss combined with harder negative sampling to better capture discourse continuity. Unlike recent proposals that leverage NSP alongside auxiliary topic classification, our approach avoids task-specific supervision. We evaluate our model against established baselines on two datasets, CitiLink-Minutes, for which we establish the first segmentation benchmark, and WikiSection. On CitiLink-Minutes, SegNSP achieves a B-$F_1$ of 0.79, closely aligning with human-annotated topic transitions, while on WikiSection it attains a B-$F_1$ of 0.65, outperforming the strongest reproducible baseline, TopSeg, by 0.17 absolute points. These results demonstrate competitive and robust performance, highlighting the effectiveness of modeling sentence-to-sentence continuity for improving segmentation quality and supporting downstream NLP applications.
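To make the NSP framing concrete, here is a minimal sketch of boundary detection by scoring sentence-to-sentence continuity with an off-the-shelf BERT NSP head. This is not the paper's SegNSP model: the model name, the `segment` helper, and the 0.5 threshold are illustrative assumptions, and the segmentation-aware loss and harder negative sampling described in the abstract are omitted.

```python
# Sketch: place a topic boundary wherever the NSP head judges that the
# next sentence is unlikely to be a continuation of the current one.
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")  # illustrative choice
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")
model.eval()

def segment(sentences, threshold=0.5):
    """Return indices i where a boundary falls between sentences[i] and sentences[i+1]."""
    boundaries = []
    for i in range(len(sentences) - 1):
        enc = tokenizer(sentences[i], sentences[i + 1],
                        return_tensors="pt", truncation=True)
        with torch.no_grad():
            logits = model(**enc).logits  # index 0 = "is next", index 1 = "not next"
        p_continue = torch.softmax(logits, dim=-1)[0, 0].item()
        if p_continue < threshold:  # low continuity -> start a new segment
            boundaries.append(i)
    return boundaries

print(segment([
    "The council approved the new budget.",
    "Funds were allocated to road repairs.",
    "In other business, the library schedule was discussed.",
]))
```

The sketch is label-agnostic in the same spirit as the abstract describes: it never consults topic labels, only a pairwise continuity score between adjacent sentences.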
Similar Papers
BP-Seg: A graphical model approach to unsupervised and non-contiguous text segmentation using belief propagation
Computation and Language
Splits long texts into meaningful parts.
Paragraph Segmentation Revisited: Towards a Standard Task for Structuring Speech
Computation and Language
Makes spoken words into easy-to-read paragraphs.
Testing Cross-Lingual Text Comprehension In LLMs Using Next Sentence Prediction
Computation and Language
Tests show AI struggles with less common languages.