Score: 2

Subjective Depth and Timescale Transformers: Learning Where and When to Compute

Published: November 26, 2025 | arXiv ID: 2511.21408v1

By: Frederico Wieser, Martin Benfeghoul, Haitham Bou Ammar, and more

BigTech Affiliations: Huawei

Potential Business Impact:

Makes AI models cheaper to run by letting them skip computation on the predictable, unsurprising parts of their input.

Business Areas:
Semantic Search, Internet Services

The rigid, uniform allocation of computation in standard Transformer (TF) architectures can limit their efficiency and scalability, particularly for large-scale models and long sequences. Addressing this, we introduce Subjective Depth Transformers (SDT) and Subjective Timescale Transformers (STT), two distinct architectures that leverage Bayesian surprise signals to dynamically route computation, learning where and when to compute within decoder-only TFs. SDT augments a decoder-only stack with alternating Decision and Dynamic layers: a Decision layer computes a full block 'posterior' and a lightweight 'prior,' while a Dynamic layer employs fixed-capacity Top-K routing based on Bayesian surprise (Expected and Unexpected Change), maintaining a static compute graph. STT extends this conditional computation to the temporal domain: a transition network predicts residual updates, forming a temporal 'change hypothesis' that informs a router which dynamically executes or bypasses TF blocks for each token and manages their KV-cache contributions. Both architectures exhibit the predicted shift from novelty-driven to prediction-driven gating over training, suggesting alignment with surprise-based principles. While operating at reduced capacity, they offer preliminary insights into the compute-accuracy trade-offs of conditional computation. The proposed architectures establish a flexible framework for efficiency, reducing self-attention computation by 75% and KV-cache requirements by 50% within each compute-skipping layer, paving the way for more efficient models.
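
To make the routing idea concrete, here is a minimal sketch (not the authors' code) of the kind of fixed-capacity, surprise-driven Top-K routing the abstract describes for a Dynamic layer. The `SurpriseRoutedLayer` class, the `capacity` parameter, and the use of the prior-vs-input gap as the surprise score are all illustrative assumptions; the paper's Expected and Unexpected Change signals are richer than this stand-in.

```python
# Minimal sketch, assuming a decoder layer where a lightweight 'prior' predicts the
# block's output and only the K most surprising tokens get the full computation.
import torch
import torch.nn as nn


class SurpriseRoutedLayer(nn.Module):
    """Hypothetical Dynamic layer: routes only the top-K most surprising tokens
    through the expensive block; all other tokens keep the cheap prior estimate."""

    def __init__(self, d_model: int, full_block: nn.Module, capacity: float = 0.25):
        super().__init__()
        self.full_block = full_block               # expensive attention + MLP block
        self.prior = nn.Linear(d_model, d_model)   # lightweight 'prior' predictor
        self.capacity = capacity                   # fixed fraction of tokens recomputed

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        bsz, seq_len, d_model = x.shape
        prior_out = self.prior(x)                  # cheap residual prediction per token

        # Surprise proxy: how much the prior expects this token to change.
        surprise = (prior_out - x).norm(dim=-1)    # (batch, seq_len)

        # Fixed-capacity Top-K selection keeps the compute graph static.
        k = max(1, int(self.capacity * seq_len))
        topk_idx = surprise.topk(k, dim=-1).indices           # (batch, k)
        gather_idx = topk_idx.unsqueeze(-1).expand(-1, -1, d_model)

        # Run the full block only on the selected (surprising) tokens.
        selected = torch.gather(x, 1, gather_idx)              # (batch, k, d_model)
        refined = self.full_block(selected)                    # full 'posterior' compute

        # Scatter refined outputs back; unselected tokens keep the prior estimate.
        out = prior_out.clone()
        out.scatter_(1, gather_idx, refined)
        return out
```

Because `capacity` fixes how many tokens the heavy block sees each step, the layer's cost is constant regardless of input, which is the static-compute-graph property the abstract highlights; the actual fraction skipped and the handling of KV-cache entries for bypassed tokens follow the paper's design rather than this simplified sketch.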

Country of Origin
🇬🇧 🇨🇳 United Kingdom, China

Page Count
14 pages

Category
Computer Science:
Machine Learning (CS)