Continuously Updating Digital Twins using Large Language Models

Published: June 11, 2025 | arXiv ID: 2506.12091v2

By: Harry Amad, Nicolás Astorga, Mihaela van der Schaar

Potential Business Impact:

Lets digital twins incorporate new variables, data, and knowledge at inference time, without retraining or redesign.

Business Areas:
Simulation Software

Digital twins are models of real-world systems that can simulate their dynamics in response to potential actions. In complex settings, the state and action variables, and the available data and knowledge relevant to a system, can change constantly, requiring digital twins to update continuously to remain relevant. Current approaches struggle here: they require fixed, well-defined modelling environments, cannot adapt to novel variables without re-design, and cannot incorporate new information without re-training. To address this, we frame digital twinning as an in-context learning problem using large language models, enabling seamless updates to the twin at inference time. We develop CALM-DT, a Context-Adaptive Language Model-based Digital Twin that can accurately simulate across diverse state-action spaces using in-context learning alone, by utilising fine-tuned encoders for sample retrieval. We empirically demonstrate CALM-DT's competitive performance with existing digital twin approaches, and its unique ability to adapt to changes in its modelling environment without parameter updates.
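The abstract describes a retrieval-augmented in-context learning setup: a fine-tuned encoder retrieves trajectories similar to the current one from a datastore, and those examples are placed in the language model's context so it can simulate the next state without any parameter updates. The sketch below illustrates that general pattern only; the toy encoder, datastore layout, prompt format, and the deferred LLM call are placeholder assumptions for illustration, not the paper's actual CALM-DT implementation.

```python
# Hypothetical sketch of retrieval-augmented in-context simulation,
# loosely following the pattern described in the abstract.
# Encoder, datastore, and prompt are illustrative stand-ins only.
import numpy as np

def encode(trajectory: list[dict]) -> np.ndarray:
    """Toy stand-in for a fine-tuned trajectory encoder: hashes variable
    names into a fixed-size vector so arbitrary state-action spaces map
    to one embedding space."""
    vec = np.zeros(64)
    for step in trajectory:
        for key, value in step.items():
            vec[hash(key) % 64] += float(value)
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def retrieve(query: list[dict], datastore: list[list[dict]], k: int = 3) -> list[list[dict]]:
    """Return the k stored trajectories most similar to the query
    (cosine similarity in the toy embedding space)."""
    q = encode(query)
    scores = [float(q @ encode(t)) for t in datastore]
    top = np.argsort(scores)[::-1][:k]
    return [datastore[i] for i in top]

def build_prompt(query: list[dict], examples: list[list[dict]]) -> str:
    """Serialise retrieved trajectories plus the current state-action
    history into a prompt for in-context next-state prediction."""
    lines = ["Predict the next state of the system."]
    for i, ex in enumerate(examples):
        lines.append(f"Example trajectory {i + 1}: {ex}")
    lines.append(f"Current trajectory: {query}")
    lines.append("Next state:")
    return "\n".join(lines)

if __name__ == "__main__":
    # Tiny illustrative datastore of past state-action trajectories.
    datastore = [
        [{"temp": 20.0, "heater": 1.0}, {"temp": 22.5, "heater": 1.0}],
        [{"temp": 20.0, "heater": 0.0}, {"temp": 19.4, "heater": 0.0}],
    ]
    query = [{"temp": 21.0, "heater": 1.0}]
    prompt = build_prompt(query, retrieve(query, datastore, k=1))
    print(prompt)  # in the full system, this prompt would be sent to an LLM
```

Because adaptation happens purely through what is retrieved into the context, new variables or freshly collected data change the twin's behaviour at inference time, without retraining the model itself.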

Country of Origin
🇬🇧 United Kingdom

Page Count
24 pages

Category
Computer Science: Computation and Language