Redefining Machine Simultaneous Interpretation: From Incremental Translation to Human-Like Strategies

Published: January 16, 2026 | arXiv ID: 2601.11002v1

By: Qianen Zhang, Zeyu Yang, Satoshi Nakamura

Potential Business Impact:

Speeds up real-time speech translation by letting the system restructure, trim, and simplify sentences as it interprets, rather than translating them word for word.

Business Areas:
Semantic Search, Internet Services

Simultaneous Machine Translation (SiMT) requires high-quality translations under strict real-time constraints, which traditional policies limited to READ/WRITE actions cannot fully address. We extend the action space of SiMT with four adaptive actions: Sentence_Cut, Drop, Partial_Summarization, and Pronominalization, which enable real-time restructuring, omission, and simplification while preserving semantic fidelity. We implement these actions in a large language model (LLM) framework and construct training references through action-aware prompting. To evaluate both quality and word-level monotonicity, we further develop a latency-aware TTS pipeline that maps textual outputs to speech with realistic timing. Experiments on the ACL60/60 English-Chinese, English-German, and English-Japanese benchmarks show that our framework consistently improves semantic metrics and achieves lower delay than reference translations and salami-based baselines. Notably, combining Drop and Sentence_Cut leads to consistent improvements in the balance between fluency and latency. These results demonstrate that enriching the action space of LLM-based SiMT is a promising direction for bridging the gap between human and machine interpretation.
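
The abstract describes replacing the usual READ/WRITE policy of simultaneous translation with an extended action space. The minimal Python sketch below illustrates that general idea only: the action names follow the paper, but the `interpret`, `policy`, and `translate` functions and all control flow are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): a simultaneous-interpretation loop whose
# policy can choose restructuring actions beyond READ/WRITE. All function names
# and the control flow here are illustrative assumptions.
from enum import Enum, auto
from typing import Callable, List


class Action(Enum):
    READ = auto()                   # consume one more source token
    WRITE = auto()                  # render the buffered source as target text
    SENTENCE_CUT = auto()           # close the current target sentence early
    DROP = auto()                   # omit a redundant or low-information span
    PARTIAL_SUMMARIZATION = auto()  # compress the buffered source into a gist
    PRONOMINALIZATION = auto()      # replace a repeated noun phrase with a pronoun


def interpret(source_tokens: List[str],
              policy: Callable[[List[str], List[str]], Action],
              translate: Callable[[List[str], Action], str]) -> List[str]:
    """Toy decision loop: alternate between reading source and emitting target."""
    src_buffer: List[str] = []   # source tokens read but not yet covered in output
    output: List[str] = []       # target segments emitted so far
    i = 0

    while i < len(source_tokens) or src_buffer:
        action = policy(src_buffer, output)
        if not src_buffer and i < len(source_tokens):
            action = Action.READ          # nothing buffered yet, so read first

        if action == Action.READ and i < len(source_tokens):
            src_buffer.append(source_tokens[i])
            i += 1
        elif action == Action.DROP:
            src_buffer.clear()            # skip the buffered material entirely
        else:
            # WRITE and the remaining restructuring actions all emit target text;
            # the (hypothetical) translate() decides how the buffer is rendered.
            output.append(translate(src_buffer, action))
            src_buffer.clear()

    return output


if __name__ == "__main__":
    # Greedy demo policy: read two tokens, then write them as one chunk.
    def demo_policy(buf: List[str], out: List[str]) -> Action:
        return Action.READ if len(buf) < 2 else Action.WRITE

    def demo_translate(buf: List[str], act: Action) -> str:
        return " ".join(buf).upper()      # placeholder for a real LLM call

    print(interpret("this is a very short example".split(),
                    demo_policy, demo_translate))
    # ['THIS IS', 'A VERY', 'SHORT EXAMPLE']
```

Per the abstract, both the action decisions and the rendering are realized with an LLM via action-aware prompting; the sketch only shows how an enlarged action space changes the control flow relative to a pure READ/WRITE policy.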

Country of Origin
🇭🇰 Hong Kong

Page Count
19 pages

Category
Computer Science:
Computation and Language