Hierarchical temporal receptive windows and zero-shot timescale generalization in biologically constrained scale-invariant deep networks

Published: January 6, 2026 | arXiv ID: 2601.02618v1

By: Aakash Sarkar, Marc W. Howard

Potential Business Impact:

Lets AI models learn faster, with far fewer parameters, and generalize to new timescales without retraining.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Human cognition integrates information across nested timescales. While the cortex exhibits hierarchical Temporal Receptive Windows (TRWs), local circuits often display heterogeneous time constants. To reconcile this, we trained biologically constrained deep networks, based on scale-invariant hippocampal time cells, on a language classification task mimicking the hierarchical structure of language (e.g., 'letters' forming 'words'). First, using a feedforward model (SITHCon), we found that a hierarchy of TRWs emerged naturally across layers, despite the network having an identical spectrum of time constants within layers. We then distilled these inductive priors into a biologically plausible recurrent architecture, SITH-RNN. Training a sequence of architectures ranging from generic RNNs to this restricted subset showed that the scale-invariant SITH-RNN learned faster with orders-of-magnitude fewer parameters, and generalized zero-shot to out-of-distribution timescales. These results suggest the brain employs scale-invariant, sequential priors that code "what" happened "when," making recurrent networks with such priors particularly well-suited to describe human cognition.
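The core mechanism the abstract invokes, a scale-invariant spectrum of time constants shared across layers, can be illustrated with a minimal sketch (our own assumption of the setup, not the authors' code): a bank of leaky integrators with log-spaced decay rates, as in Laplace-transform models of hippocampal time cells. Rescaling the input in time then merely shifts the pattern along the log-spaced axis, which is the property underlying zero-shot timescale generalization.

```python
import numpy as np

def encode_impulse(tau, s_values, dt=1e-3):
    """Euler-integrate dF_k/dt = -s_k * F_k + f(t) for a unit impulse at t=0,
    returning the state F after a delay tau (so F_k ~ exp(-s_k * tau))."""
    F = np.ones_like(s_values)           # the impulse sets each unit to 1
    for _ in range(int(round(tau / dt))):
        F = F + dt * (-s_values * F)     # exponential decay at rate s_k
    return F

# Log-spaced ("scale-invariant") spectrum of decay rates; bounds and count
# here are illustrative choices, identical within every layer.
s_values = np.geomspace(0.1, 10.0, 16)
c = s_values[1] / s_values[0]            # ratio between adjacent rates

F1 = encode_impulse(1.0, s_values)       # memory of an event 1.0 s ago
F2 = encode_impulse(c * 1.0, s_values)   # same event, time rescaled by c

# Rescaling time shifts the pattern one unit along the log-spaced axis:
# F2[k] ~= F1[k+1], so a downstream readout trained at one timescale
# still applies, up to a shift, at another.
```

The design choice to log-space the decay rates is what makes the code "what happened when" covariant under temporal rescaling; a linearly spaced spectrum would not have this shift property.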

Page Count
20 pages

Category
Quantitative Biology:
Neurons and Cognition