System Identification from Partial Observations under Adversarial Attacks
By: Jihun Kim, Javad Lavaei
Potential Business Impact:
Keeps models of computer-controlled systems accurate even when measurements are corrupted by sneaky attacks.
This paper is concerned with partially observed linear system identification, where the goal is to obtain a reasonably accurate estimate of the balanced truncation of the true system up to order $k$ from output measurements. We consider the challenging case of system identification under adversarial attacks, where an attack occurs at each time with probability $\Theta(1/k)$ while the value of the attack is arbitrary. We first show that the $\ell_1$-norm estimator exactly identifies the true Markov parameter matrix for nilpotent systems under any type of attack. We then extend this result to general systems and show that the estimation error decays exponentially as $k$ grows. The estimated balanced truncation model accordingly exhibits an exponentially decaying error in identifying the true system up to a similarity transformation. This work is the first to provide an input-output analysis of partially observed systems under arbitrary attacks.
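To make the estimator concrete, here is a minimal sketch (not the authors' code): it recovers the Markov parameters of a finite-impulse-response (nilpotent-like) single-input single-output system via least-absolute-deviations ($\ell_1$-norm) regression when each output measurement is attacked with probability roughly $1/k$. The dimensions, the SciPy-based LP formulation, and all variable names are illustrative assumptions.

```python
# Minimal sketch: l1-norm (least-absolute-deviations) recovery of Markov
# parameters from outputs corrupted by sparse, arbitrary attacks.
# All choices below (dimensions, attack model, solver) are assumptions.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

k = 8     # number of Markov parameters to identify (FIR / nilpotent-like system)
T = 400   # number of output measurements
true_markov = rng.normal(size=k)          # true Markov parameters G_0, ..., G_{k-1}

u = rng.normal(size=T + k)                # random excitation input
# Row j of X holds u[t], u[t-1], ..., u[t-k+1] for output time t = j + k - 1.
X = np.column_stack([u[k - 1 - i : k - 1 - i + T] for i in range(k)])
y_clean = X @ true_markov

# Adversarial attacks: each measurement is hit with probability ~ 1/k,
# and the attack value is arbitrary (here, large and heavy-tailed).
attacked = rng.random(T) < 1.0 / k
y = y_clean + attacked * rng.standard_cauchy(T) * 10.0

# l1-norm estimator: min_beta sum_t |y_t - x_t^T beta|, solved as a linear
# program with slack variables s >= |y - X beta|.
m, n = X.shape
c = np.concatenate([np.zeros(n), np.ones(m)])            # minimize the sum of slacks
A_ub = np.block([[X, -np.eye(m)], [-X, -np.eye(m)]])      # encodes |X beta - y| <= s
b_ub = np.concatenate([y, -y])
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * n + [(0, None)] * m, method="highs")
beta_l1 = res.x[:n]

beta_ls = np.linalg.lstsq(X, y, rcond=None)[0]            # least squares, for contrast
print("l1 estimate error :", np.linalg.norm(beta_l1 - true_markov))
print("LS estimate error :", np.linalg.norm(beta_ls - true_markov))
```

In a fuller pipeline, the estimated Markov parameters would then feed a Ho-Kalman-style Hankel factorization to produce the order-$k$ balanced-truncation model discussed in the abstract; that step is omitted from this sketch.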
Similar Papers
Bridging Batch and Streaming Estimations to System Identification under Adversarial Attacks
Optimization and Control
Protects machines from sneaky attacks.
On the Sharp Input-Output Analysis of Nonlinear Systems under Adversarial Attacks
Optimization and Control
Makes computers learn from messy, tricky information.
Minimal Order Recovery through Rank-adaptive Identification
Systems and Control
Finds hidden patterns in messy data.