Multitask Battery Management with Flexible Pretraining

Published: September 1, 2025 | arXiv ID: 2509.01323v1

By: Hong Lu, Jiali Chen, Jingzhao Zhang, and more

Potential Business Impact:

Enables battery management systems to handle estimation, prediction, and diagnostic tasks with far less task-specific data and engineering effort.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Industrial-scale battery management involves various types of tasks, such as estimation, prediction, and system-level diagnostics. Each task employs distinct data across temporal scales, sensor resolutions, and data channels. Building task-specific methods requires a great deal of data and engineering effort, which limits the scalability of intelligent battery management. Here we present the Flexible Masked Autoencoder (FMAE), a flexible pretraining framework that can learn with missing battery data channels and capture inter-correlations across data snippets. FMAE learns unified battery representations from heterogeneous data and can be adapted to different tasks with minimal data and engineering effort. Experimentally, FMAE consistently outperforms all task-specific methods across five battery management tasks with eleven battery datasets. On remaining life prediction tasks, FMAE uses 50 times less inference data while maintaining state-of-the-art results. Moreover, when real-world data lack certain information, such as system voltage, FMAE can still be applied with marginal performance impact, achieving results comparable to the best hand-crafted features. FMAE demonstrates a practical route to a flexible, data-efficient model that simplifies real-world multi-task management of dynamical systems.
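The core mechanism the abstract describes, pretraining that tolerates missing data channels by masking them and reconstructing only the masked parts, can be sketched in a few lines. This is a minimal illustration of channel masking and a masked-only reconstruction loss, not the paper's implementation; the channel names, shapes, and the identity "model" standing in for the encoder/decoder are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy battery snippet: 4 data channels (e.g. voltage, current, temperature,
# capacity -- illustrative names only) over 8 time steps.
snippet = rng.normal(size=(4, 8))

def mask_channels(x, drop_prob=0.5, rng=rng):
    """Randomly zero out whole channels, mimicking missing sensor streams."""
    keep = rng.random(x.shape[0]) > drop_prob
    if not keep.any():                      # always keep at least one channel
        keep[rng.integers(x.shape[0])] = True
    masked = np.where(keep[:, None], x, 0.0)
    return masked, keep

def reconstruction_loss(pred, target, keep):
    """MSE computed only over the masked channels, MAE-style."""
    missing = ~keep
    if not missing.any():
        return 0.0
    return float(np.mean((pred[missing] - target[missing]) ** 2))

masked, keep = mask_channels(snippet)
# A real FMAE would pass `masked` through an encoder/decoder here;
# the identity map stands in for that model in this sketch.
pred = masked
loss = reconstruction_loss(pred, snippet, keep)
print(f"masked channels: {(~keep).sum()}, loss: {loss:.4f}")
```

Because the loss is taken only over the dropped channels, the same objective applies unchanged whether a deployment is missing one sensor stream or several, which is what makes this style of pretraining robust to absent channels.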

Country of Origin
🇨🇳 China

Page Count
26 pages

Category
Computer Science:
Machine Learning (CS)