Codified Foreshadowing-Payoff Text Generation
By: Longfei Yun, Kun Zhou, Yupeng Hou, and more
Foreshadowing and payoff are ubiquitous narrative devices through which authors introduce commitments early in a story and resolve them through concrete, observable outcomes. However, despite advances in story generation, large language models (LLMs) frequently fail to bridge these long-range narrative dependencies, often leaving "Chekhov's guns" unfired even when the necessary context is present. Existing evaluations largely overlook this structural failure, focusing on surface-level coherence rather than the logical fulfillment of narrative setups. In this paper, we introduce Codified Foreshadowing-Payoff Generation (CFPG), a novel framework that reframes narrative quality through the lens of payoff realization. Recognizing that LLMs struggle to intuitively grasp the "triggering mechanism" of a foreshadowed event, CFPG transforms narrative continuity into a set of executable causal predicates. By mining and encoding Foreshadow-Trigger-Payoff triples from the BookSum corpus, we provide structured supervision that ensures foreshadowed commitments are not only mentioned but also temporally and logically fulfilled. Experiments demonstrate that CFPG significantly outperforms standard prompting baselines in payoff accuracy and narrative alignment. Our findings suggest that explicitly codifying narrative mechanics is essential for moving LLMs from surface-level fluency to genuine narrative competence.
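The core idea of encoding a Foreshadow-Trigger-Payoff triple as an executable causal predicate can be sketched as follows. This is a minimal illustration, not CFPG's actual schema: the class fields, the string-matching check, and the event-list representation are all assumptions made for clarity.

```python
from dataclasses import dataclass

@dataclass
class FTPTriple:
    # Hypothetical representation of a Foreshadow-Trigger-Payoff triple;
    # field names are illustrative, not CFPG's actual data format.
    foreshadow: str  # commitment introduced early in the story
    trigger: str     # condition that should activate the payoff
    payoff: str      # concrete, observable outcome resolving the commitment

def payoff_fulfilled(story_events: list[str], triple: FTPTriple) -> bool:
    """Executable predicate: the trigger must occur, and the payoff must
    follow it, so the commitment is temporally and logically fulfilled."""
    try:
        t_idx = story_events.index(triple.trigger)
    except ValueError:
        return False  # the trigger never fires
    return triple.payoff in story_events[t_idx + 1:]

# Toy usage on the classic "Chekhov's gun" setup.
triple = FTPTriple(
    foreshadow="a gun hangs on the wall",
    trigger="the duel begins",
    payoff="the gun is fired",
)
events = ["a gun hangs on the wall", "the duel begins", "the gun is fired"]
```

Here `payoff_fulfilled(events, triple)` returns `True`, while a story that ends at `"the duel begins"` would fail the predicate; a real implementation would replace exact string matching with semantic event matching.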