Score: 1

Taming Cold Starts: Proactive Serverless Scheduling with Model Predictive Control

Published: August 11, 2025 | arXiv ID: 2508.07640v1

By: Chanh Nguyen, Monowar Bhuyan, Erik Elmroth

Potential Business Impact:

Cuts application delays by predicting upcoming demand and warming up compute resources before requests arrive.

Serverless computing has transformed cloud application deployment by introducing a fine-grained, event-driven execution model that abstracts away infrastructure management. Its on-demand nature makes it especially appealing for latency-sensitive and bursty workloads. However, the cold start problem, i.e., the significant delay incurred when provisioning new containers, remains the Achilles' heel of such platforms. This paper presents a predictive serverless scheduling framework based on Model Predictive Control to proactively mitigate cold starts, thereby improving end-to-end response time. By forecasting future invocations, the controller jointly optimizes container prewarming and request dispatching, improving latency while minimizing resource overhead. We implement our approach on Apache OpenWhisk, deployed on a Kubernetes-based testbed. Experimental results using real-world function traces and synthetic workloads demonstrate that our method significantly outperforms state-of-the-art baselines, achieving up to 85% lower tail latency and a 34% reduction in resource usage.
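The core idea in the abstract, forecasting upcoming invocations and using a receding-horizon (MPC-style) optimization to decide how many containers to keep warm, can be illustrated with a small sketch. The Python below is a hypothetical, simplified illustration only, not the authors' implementation (which runs inside Apache OpenWhisk): it assumes a naive moving-average forecaster, a fixed per-container capacity, and made-up cost weights COLD_PENALTY and IDLE_COST.

```python
# Minimal sketch of an MPC-style prewarming loop (illustrative only;
# the paper's model, forecaster, and cost terms are not reproduced here).
from collections import deque

HORIZON = 5          # future intervals considered per decision
CAPACITY = 10        # requests one warm container can absorb per interval
COLD_PENALTY = 50.0  # assumed cost of a request hitting a cold container
IDLE_COST = 1.0      # assumed cost of one idle warm container per interval

def forecast(history, horizon):
    """Naive forecaster: repeat the recent average invocation rate."""
    avg = sum(history) / len(history) if history else 0.0
    return [avg] * horizon

def plan_prewarm(history, max_containers=50):
    """Pick the warm-container count minimizing predicted cost over the horizon."""
    demand = forecast(history, HORIZON)
    best_n, best_cost = 0, float("inf")
    for n in range(max_containers + 1):
        cost = 0.0
        for d in demand:
            overflow = max(0.0, d - n * CAPACITY)          # requests likely to cold-start
            idle = max(0.0, n * CAPACITY - d) / CAPACITY   # spare warm capacity
            cost += COLD_PENALTY * overflow + IDLE_COST * idle
        if cost < best_cost:
            best_n, best_cost = n, cost
    return best_n

# Receding-horizon loop: re-plan every interval, apply only the first decision.
history = deque(maxlen=20)
for observed_invocations in [3, 8, 25, 40, 12, 5]:   # synthetic trace
    history.append(observed_invocations)
    warm = plan_prewarm(history)
    print(f"observed={observed_invocations:>3}  prewarm target={warm}")
```

Only the first step of each plan is applied before the horizon rolls forward and the optimization is repeated with fresh observations, which is the defining trait of model predictive control; the paper additionally couples this prewarming decision with request dispatching.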

Country of Origin
πŸ‡ΈπŸ‡ͺ Sweden

Page Count
8 pages

Category
Computer Science:
Distributed, Parallel, and Cluster Computing