Score: 1

On the Limits of Robust Control Under Adversarial Disturbances

Published: July 4, 2025 | arXiv ID: 2507.03630v1

By: Paul Trodden, José M. Maestre, Hideaki Ishii

Potential Business Impact:

Shows when a machine cannot be kept within safe operating limits by any controller once disturbances grow large enough.

Business Areas:
Embedded Systems Hardware, Science and Engineering, Software

This paper addresses a fundamental and important question in control: under what conditions does there fail to exist a robust control policy that keeps the state of a constrained linear system within a target set, despite bounded disturbances? This question has practical implications for actuator and sensor specification, feasibility analysis for reference tracking, and the design of adversarial attacks in cyber-physical systems. While prior research has predominantly focused on using optimization to compute control-invariant sets to ensure feasible operation, our work complements these approaches by characterizing explicit sufficient conditions under which robust control is fundamentally infeasible. Specifically, we derive novel closed-form, algebraic expressions that relate the size of a disturbance set -- modelled as a scaled version of a basic shape -- to the system's spectral properties and the geometry of the constraint sets.

Country of Origin
🇬🇧 🇯🇵 🇪🇸 United Kingdom, Japan, Spain

Page Count
19 pages

Category
Electrical Engineering and Systems Science: Systems and Control