On the Limits of Robust Control Under Adversarial Disturbances
By: Paul Trodden, José M. Maestre, Hideaki Ishii
Potential Business Impact:
Shows when no controller can keep a machine within safe limits.
This paper addresses a fundamental and important question in control: under what conditions does there fail to exist a robust control policy that keeps the state of a constrained linear system within a target set, despite bounded disturbances? This question has practical implications for actuator and sensor specification, feasibility analysis for reference tracking, and the design of adversarial attacks in cyber-physical systems. While prior research has predominantly focused on using optimization to compute control-invariant sets to ensure feasible operation, our work complements these approaches by characterizing explicit sufficient conditions under which robust control is fundamentally infeasible. Specifically, we derive novel closed-form, algebraic expressions that relate the size of a disturbance set -- modelled as a scaled version of a basic shape -- to the system's spectral properties and the geometry of the constraint sets.
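To make the flavor of such conditions concrete, here is a minimal scalar sketch; it is illustrative only and not taken from the paper, and the names max_rci_halfwidth, a, u_max, alpha, and x_max are hypothetical. Consider x+ = a*x + u + w with |x| <= x_max, |u| <= u_max, and disturbance set alpha*[-1, 1]. A robust control invariant interval [-c, c] must satisfy c >= alpha and (|a| - 1)*c <= u_max - alpha, so for an unstable pole (|a| > 1) no such interval exists once alpha > u_max/|a|: a closed-form infeasibility condition linking disturbance scale, the spectrum, and the constraint geometry.

# Illustrative scalar sketch (not the paper's result): largest robust
# control invariant interval [-c, c] inside the state constraint set
# [-x_max, x_max] for x+ = a*x + u + w, |u| <= u_max, |w| <= alpha.

def max_rci_halfwidth(a, u_max, alpha, x_max):
    """Return the largest c such that [-c, c] is robust control invariant
    inside [-x_max, x_max], or None if no such interval exists."""
    if alpha > x_max:
        return None  # disturbance set larger than the state constraint set
    if abs(a) > 1:
        if abs(a) * alpha > u_max:
            return None  # closed-form infeasibility: alpha exceeds u_max/|a|
        return min(x_max, (u_max - alpha) / (abs(a) - 1))
    # Stable or marginally stable pole: the binding requirement is
    # alpha <= u_max + (1 - |a|) * x_max, met by c = x_max itself.
    return x_max if alpha <= u_max + (1 - abs(a)) * x_max else None

if __name__ == "__main__":
    print(max_rci_halfwidth(a=1.5, u_max=0.6, alpha=0.5, x_max=2.0))  # None: infeasible
    print(max_rci_halfwidth(a=1.5, u_max=0.6, alpha=0.2, x_max=2.0))  # 0.8

The paper's results generalize this idea well beyond the scalar case, giving algebraic conditions on the disturbance scaling in terms of the system's eigenstructure and the geometry of polytopic state and input constraint sets; the sketch above only mirrors the one-dimensional intuition.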
Similar Papers
A New Approach to Controlling Linear Dynamical Systems
Systems and Control
Makes robots learn faster, even when things go wrong.
On the Boundary of the Robust Admissible Set in State and Input Constrained Nonlinear Systems
Optimization and Control
Keeps self-driving cars safe from unexpected problems.
A Set-Theoretic Robust Control Approach for Linear Quadratic Games with Unknown Counterparts
Systems and Control
Helps robots learn to work safely with people.