Data-Efficient Neural Training with Dynamic Connectomes
By: Yutong Wu, Peilin He, Tananun Songdechakraiwut
Potential Business Impact:
Watches a neural network's evolving activity patterns to tell when it has finished learning, enabling earlier stopping and cheaper training.
The study of dynamic functional connectomes has provided valuable insights into how patterns of brain activity change over time. Neural networks process information through artificial neurons, conceptually inspired by patterns of activation in the brain. However, their hierarchical structure and high-dimensional parameter space pose challenges for understanding and controlling training dynamics. In this study, we introduce a novel approach to characterize training dynamics in neural networks by representing evolving neural activations as functional connectomes and extracting dynamic signatures of activity throughout training. Our results show that these signatures effectively capture key transitions in the functional organization of the network. Building on this analysis, we propose the use of a time series of functional connectomes as an intrinsic indicator of learning progress, enabling a principled early stopping criterion. Our framework performs robustly across benchmarks and provides new insights into neural network training dynamics.
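The abstract's core idea, representing a layer's activations as a functional connectome (a correlation matrix over neurons) and stopping training once that connectome stabilizes, can be illustrated with a minimal sketch. This is not the authors' implementation; the threshold, the Frobenius-norm change measure, and the stand-in activations are all illustrative assumptions.

```python
import numpy as np

def functional_connectome(acts):
    """Correlation matrix over neurons.

    acts: (n_samples, n_neurons) array of a layer's activations.
    Returns an (n_neurons, n_neurons) functional connectome.
    """
    return np.corrcoef(acts, rowvar=False)

def connectome_change(c_prev, c_curr):
    """Frobenius-norm distance between successive connectomes
    (one simple choice of dynamic signature; the paper's measure
    may differ)."""
    return np.linalg.norm(c_curr - c_prev)

# Hypothetical training loop: track the time series of connectomes
# and stop once the functional organization stabilizes.
rng = np.random.default_rng(0)
weights = rng.normal(size=(16, 8))
prev = None
for epoch in range(50):
    # Stand-in for a training step: weight updates shrink over time.
    weights += 0.1 * rng.normal(size=weights.shape) / (epoch + 1)
    acts = np.tanh(rng.normal(size=(200, 16)) @ weights)
    curr = functional_connectome(acts)
    if prev is not None and connectome_change(prev, curr) < 1.0:
        print(f"connectome stabilized at epoch {epoch}")
        break
    prev = curr
```

The stopping rule here is intrinsic to the network's activity, as the abstract describes: it never consults a held-out validation loss, only successive snapshots of the connectome itself.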
Similar Papers
NeuroPathNet: Dynamic Path Trajectory Learning for Brain Functional Connectivity Analysis
Machine Learning (CS)
Tracks how brain connections change to find diseases.
Functional Connectivity Graph Neural Networks
Neural and Evolutionary Computing
Helps computers understand complex patterns in networks.
Connectome-Guided Automatic Learning Rates for Deep Networks
Neural and Evolutionary Computing
Adjusts a deep network's learning rate using brain-connectivity-inspired signals.