Local Observability of a Class of Feedforward Neural Networks

Published: August 28, 2025 | arXiv ID: 2508.20544v1

By: Yi Yang, Victor G. Lopez, Matthias A. Müller

Potential Business Impact:

Shows when a neural network's weights can be uniquely recovered from its inputs and outputs, enabling training via state estimation instead of gradient descent.

Business Areas:
Smart Cities, Real Estate

Beyond the traditional neural network training methods based on gradient descent and its variants, state estimation techniques have been proposed to determine a set of ideal weights from a control-theoretic perspective. Hence, the concept of observability becomes relevant in neural network training. In this paper, we investigate local observability of a class of two-layer feedforward neural networks (FNNs) with rectified linear unit (ReLU) activation functions. We analyze local observability of FNNs by evaluating an observability rank condition with respect to the weight matrix and the input sequence. First, we show that, in general, the weights of FNNs are not locally observable. Then, we provide sufficient conditions on the network structures and the weights that lead to local observability. Moreover, we propose an input design approach to render the weights distinguishable, and show that this input also excites other weights inside a neighborhood. Finally, we validate our results through a numerical example.
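The paper's exact rank condition is not reproduced here, but the idea can be sketched numerically. Assuming a scalar-output two-layer ReLU network y = W2·ReLU(W1·x) (a hypothetical instance of the network class; the function names below are illustrative, not from the paper), one can stack the Jacobian of the output sequence with respect to the weight vector and check its rank. A rank deficiency, as this sketch exhibits, is consistent with the abstract's claim that FNN weights are in general not locally observable:

```python
import numpy as np

# Illustrative sketch (not the paper's construction): a scalar-output
# two-layer ReLU network y(w; x) = W2 @ relu(W1 @ x).  Local weight
# observability is probed via the rank of the Jacobian of the stacked
# output sequence with respect to the weight vector w.

def relu(z):
    return np.maximum(z, 0.0)

def output(w, x, n_hidden, n_in):
    """Network output for the weight vector w = [vec(W1); vec(W2)]."""
    W1 = w[: n_hidden * n_in].reshape(n_hidden, n_in)
    W2 = w[n_hidden * n_in:].reshape(1, n_hidden)
    return (W2 @ relu(W1 @ x)).ravel()

def observability_rank(w, X, n_hidden, n_in, eps=1e-6):
    """Rank of the finite-difference Jacobian of all outputs w.r.t. w."""
    rows = []
    for x in X:
        base = output(w, x, n_hidden, n_in)
        J = np.empty((base.size, w.size))
        for j in range(w.size):
            wp = w.copy()
            wp[j] += eps                      # perturb one weight
            J[:, j] = (output(wp, x, n_hidden, n_in) - base) / eps
        rows.append(J)
    return np.linalg.matrix_rank(np.vstack(rows), tol=1e-4)

rng = np.random.default_rng(0)
n_in, n_hidden = 2, 3
w = rng.standard_normal(n_hidden * n_in + n_hidden)   # 9 parameters
X = rng.standard_normal((20, n_in))                   # input sequence

r = observability_rank(w, X, n_hidden, n_in)
# ReLU's positive homogeneity (scaling row i of W1 by a > 0 and W2[i]
# by 1/a leaves every output unchanged) puts one direction per hidden
# unit in the Jacobian's nullspace, so r < w.size: the weights are not
# locally observable for generic weights and inputs.
print(f"Jacobian rank {r} of {w.size} parameters")
```

The finite-difference Jacobian is exact up to rounding here because the output is piecewise linear along each weight coordinate; the paper's own analysis works with the rank condition analytically rather than numerically.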

Country of Origin
🇩🇪 Germany

Page Count
6 pages

Category
Electrical Engineering and Systems Science: Systems and Control