Hierarchical temporal receptive windows and zero-shot timescale generalization in biologically constrained scale-invariant deep networks
By: Aakash Sarkar, Marc W. Howard
Potential Business Impact:
Helps computers learn faster by understanding time.
Human cognition integrates information across nested timescales. While the cortex exhibits hierarchical Temporal Receptive Windows (TRWs), local circuits often display heterogeneous time constants. To reconcile this, we trained biologically constrained deep networks, based on scale-invariant hippocampal time cells, on a language classification task mimicking the hierarchical structure of language (e.g., 'letters' forming 'words'). First, using a feedforward model (SITHCon), we found that a hierarchy of TRWs emerged naturally across layers, even though every layer contained an identical spectrum of time constants. We then distilled these inductive priors into a biologically plausible recurrent architecture, SITH-RNN. Training a sequence of architectures, ranging from generic RNNs to this restricted subset, showed that the scale-invariant SITH-RNN learned faster with orders of magnitude fewer parameters and generalized zero-shot to out-of-distribution timescales. These results suggest the brain employs scale-invariant, sequential priors that code "what" happened "when", making recurrent networks with such priors particularly well-suited to describe human cognition.
Similar Papers
Learning Time-Scale Invariant Population-Level Neural Representations
Machine Learning (CS)
Makes brain-reading tools work better with different data.
Astrocyte-mediated hierarchical modulation enables learning-to-learn in recurrent spiking networks
Neural and Evolutionary Computing
Teaches computers to learn new things faster.
On Biologically Plausible Learning in Continuous Time
Machine Learning (CS)
Helps brain-like networks learn using the timing of signals.