Continuously Updating Digital Twins using Large Language Models
By: Harry Amad, Nicolás Astorga, Mihaela van der Schaar
Potential Business Impact:
Lets digital twins learn and adapt on the fly, without retraining.
Digital twins are models of real-world systems that can simulate their dynamics in response to potential actions. In complex settings, the state and action variables, as well as the data and knowledge relevant to a system, can change constantly, requiring digital twins to update continuously to remain relevant. Current approaches struggle in this regard: they require fixed, well-defined modelling environments, and they can neither adapt to novel variables without re-design nor incorporate new information without re-training. To address this, we frame digital twinning as an in-context learning problem using large language models, enabling seamless updates to the twin at inference time. We develop CALM-DT, a Context-Adaptive Language Model-based Digital Twin that can accurately simulate across diverse state-action spaces using in-context learning alone, utilising fine-tuned encoders for sample retrieval. We empirically demonstrate CALM-DT's competitive performance with existing digital twin approaches, and its unique ability to adapt to changes in its modelling environment without parameter updates.
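The abstract describes the mechanism only at a high level: fine-tuned encoders retrieve relevant transition samples, which are assembled into a prompt so an LLM can simulate the next state in context. The sketch below illustrates that retrieve-then-prompt loop under stated assumptions; `Transition`, `encode`, `retrieve`, `llm_complete`, and `simulate_step` are illustrative stand-ins, not the paper's actual components, and the crude character-frequency embedding merely stands in for CALM-DT's fine-tuned retrieval encoders.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class Transition:
    """One observed (state, action, next state) sample from the system."""
    state: dict
    action: dict
    next_state: dict

    def as_text(self) -> str:
        return f"state={self.state} action={self.action} -> next_state={self.next_state}"


def encode(texts: list[str]) -> np.ndarray:
    """Stand-in for the fine-tuned retrieval encoders: a crude, deterministic
    character-frequency embedding, normalised to unit length."""
    vecs = np.zeros((len(texts), 128))
    for i, text in enumerate(texts):
        for ch in text:
            vecs[i, ord(ch) % 128] += 1.0
    return vecs / np.maximum(np.linalg.norm(vecs, axis=1, keepdims=True), 1e-9)


def retrieve(query: str, bank: list[Transition], k: int = 4) -> list[Transition]:
    """Return the k stored transitions most similar to the query under `encode`."""
    sims = encode([t.as_text() for t in bank]) @ encode([query])[0]
    return [bank[i] for i in np.argsort(-sims)[:k]]


def llm_complete(prompt: str) -> str:
    """Placeholder for any LLM completion call; swap in a real client."""
    return "<next_state predicted by the LLM>"


def simulate_step(state: dict, action: dict, bank: list[Transition]) -> str:
    """Retrieve similar transitions, build an in-context prompt, and ask the
    LLM to predict the next state. Adapting to new variables or new data only
    requires changing `bank`, not any model weights."""
    query = f"state={state} action={action}"
    examples = "\n".join(t.as_text() for t in retrieve(query, bank))
    prompt = (
        "You simulate a dynamical system. Past transitions:\n"
        f"{examples}\n"
        f"Predict the next state.\n{query} -> next_state="
    )
    return llm_complete(prompt)


# Example: a tiny sample bank and one simulated step.
bank = [
    Transition({"temp": 20}, {"heat": 1}, {"temp": 22}),
    Transition({"temp": 22}, {"heat": 0}, {"temp": 21}),
    Transition({"temp": 21}, {"heat": 1}, {"temp": 23}),
]
print(simulate_step({"temp": 23}, {"heat": 0}, bank))
```

Because the simulator's behaviour is governed by the retrieved context rather than by its weights, accommodating new variables or fresh observations amounts to updating the sample bank, which is the inference-time adaptability the abstract highlights.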
Similar Papers
Leveraging Large Language Models for Enhanced Digital Twin Modeling: Trends, Methods, and Challenges
Emerging Technologies
Makes smart factories learn and fix themselves.
Code Digital Twin: Empowering LLMs with Tacit Knowledge for Complex Software Maintenance
Software Engineering
Helps computers understand old code to fix it.
LSDTs: LLM-Augmented Semantic Digital Twins for Adaptive Knowledge-Intensive Infrastructure Planning
Emerging Technologies
Helps plan wind farms by understanding rules.