Fisher information flow in artificial neural networks

Published: September 2, 2025 | arXiv ID: 2509.02407v2

By: Maximilian Weimar, Lukas M. Rachbauer, Ilya Starshynov, and more

Potential Business Impact:

Enables model-free early stopping for neural-network training by tracking how Fisher information flows through the network, removing the need for a separate validation dataset.

Business Areas:
Intelligent Systems, Artificial Intelligence, Data and Analytics, Science and Engineering

The estimation of continuous parameters from measured data plays a central role in many fields of physics. A key tool in understanding and improving such estimation processes is the concept of Fisher information, which quantifies how information about unknown parameters propagates through a physical system and determines the ultimate limits of precision. With Artificial Neural Networks (ANNs) gradually becoming an integral part of many measurement systems, it is essential to understand how they process and transmit parameter-relevant information internally. Here, we present a method to monitor the flow of Fisher information through an ANN performing a parameter estimation task, tracking it from the input to the output layer. We show that optimal estimation performance corresponds to the maximal transmission of Fisher information, and that training beyond this point results in information loss due to overfitting. This provides a model-free stopping criterion for network training, eliminating the need for a separate validation dataset. To demonstrate the practical relevance of our approach, we apply it to a network trained on data from an imaging experiment, highlighting its effectiveness in a realistic physical setting.
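The abstract does not give implementation details, so the sketch below is only one plausible way to monitor layer-wise Fisher information about a scalar parameter theta. It assumes Gaussian activation statistics (F = dmu/dtheta^T Sigma^{-1} dmu/dtheta), a toy fixed-weight network, additive Gaussian measurement noise, and a finite-difference derivative; the network and all names are illustrative stand-ins, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical, fixed two-layer estimator network: it maps 4 noisy readings
# of the parameter theta to a single estimate (illustrative only).
W1, b1 = rng.normal(size=(4, 4)), rng.normal(size=4)
W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)

def layer_activations(theta, noise):
    """Forward pass on simulated measurements x = theta + noise.
    Returns the activations of every layer, from input to output."""
    x = theta + noise                     # (n_samples, 4) noisy measurements
    h = np.tanh(x @ W1.T + b1)            # hidden layer
    y = h @ W2.T + b2                     # output layer (parameter estimate)
    return [x, h, y]

def layerwise_fisher(theta, n_samples=4000, noise_std=0.1, dtheta=1e-3):
    """Fisher information about theta at every layer, under a Gaussian
    approximation of the activation statistics:
        F = (dmu/dtheta)^T Sigma^{-1} (dmu/dtheta)."""
    noise = noise_std * rng.normal(size=(n_samples, 4))  # common random numbers
    acts_lo = layer_activations(theta - dtheta / 2, noise)
    acts_hi = layer_activations(theta + dtheta / 2, noise)
    fisher = []
    for a_lo, a_hi in zip(acts_lo, acts_hi):
        dmu = (a_hi.mean(axis=0) - a_lo.mean(axis=0)) / dtheta  # d(mean)/d(theta)
        cov = np.atleast_2d(np.cov(a_lo.T)) + 1e-9 * np.eye(a_lo.shape[1])
        fisher.append(float(dmu @ np.linalg.solve(cov, dmu)))
    return fisher

# Data processing implies the true Fisher information cannot grow from layer
# to layer; a well-trained estimator transmits as much of it as possible.
print(layerwise_fisher(theta=0.5))
```

Used as a stopping criterion in the spirit of the abstract, one would recompute the output-layer value after each training epoch and stop once it no longer increases, since the paper reports that training beyond that point loses Fisher information to overfitting.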

Country of Origin
🇦🇹 Austria

Page Count
17 pages

Category
Computer Science:
Machine Learning (CS)