Dynamic Subspace Composition: Efficient Adaptation via Contractive Basis Expansion
By: Vladimer Khasia
Mixture of Experts (MoE) models scale capacity but often suffer from representation collapse and gradient instability. We propose Dynamic Subspace Composition (DSC), a framework that approximates context-dependent weights via a state-dependent, sparse expansion of a shared basis bank. Formally, DSC models the weight update as a residual trajectory within a Star-Shaped Domain, employing a Magnitude-Gated Simplex Interpolation to ensure continuity at the identity. Unlike standard Mixture-of-LoRAs, which incurs O(Mrd) parameter complexity by retrieving independent rank-r matrices, DSC constructs a compositional rank-K approximation from decoupled unit-norm basis vectors. This reduces parameter complexity to O(Md) and memory traffic to O(Kd), while Frame-Theoretic regularization and spectral constraints provide rigorous worst-case bounds on the dynamic update. The code is available at https://github.com/VladimerKhasia/DSC
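To make the abstract's construction concrete, below is a minimal sketch of a DSC-style layer in PyTorch. It is an illustrative reading of the abstract, not the released implementation: the class name `DSCLinear`, the top-K routing, the softplus magnitude gate, and all hyperparameter names are assumptions. It shows the key ideas stated above: a shared bank of M decoupled unit-norm basis vectors (O(Md) parameters), a per-token simplex interpolation over K selected bases gated by a magnitude term that vanishes at the identity, and a rank-K update applied without materializing a full weight matrix (O(Kd) memory traffic).

```python
# Minimal sketch of a Dynamic Subspace Composition (DSC) layer, assuming a
# PyTorch implementation. Names and exact gating/routing choices are
# illustrative assumptions, not the authors' released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DSCLinear(nn.Module):
    def __init__(self, d_in, d_out, num_bases=64, top_k=4):
        super().__init__()
        self.top_k = top_k
        self.base = nn.Linear(d_in, d_out, bias=False)   # shared base weight W0
        # Decoupled basis banks: M input-side and M output-side vectors,
        # giving O(M d) parameters instead of O(M r d) for M rank-r adapters.
        self.U = nn.Parameter(torch.randn(num_bases, d_out))
        self.V = nn.Parameter(torch.randn(num_bases, d_in))
        self.router = nn.Linear(d_in, num_bases)          # per-token basis scores
        self.gate = nn.Linear(d_in, 1)                    # magnitude gate

    def forward(self, x):                                 # x: (..., d_in)
        # Unit-normalize bases so the dynamic update stays spectrally bounded.
        U = F.normalize(self.U, dim=-1)
        V = F.normalize(self.V, dim=-1)
        scores = self.router(x)                           # (..., M)
        top_val, top_idx = scores.topk(self.top_k, dim=-1)
        # Simplex interpolation over the K selected bases ...
        alpha = F.softmax(top_val, dim=-1)                # (..., K), sums to 1
        # ... gated by a non-negative magnitude; at zero magnitude the layer
        # reduces to the base map, preserving continuity at the identity.
        m = F.softplus(self.gate(x))                      # (..., 1)
        # Compose the rank-K update without materializing a full matrix:
        # delta = sum_k alpha_k * m * (v_k . x) * u_k  -> O(K d) memory traffic.
        v_sel = V[top_idx]                                # (..., K, d_in)
        u_sel = U[top_idx]                                # (..., K, d_out)
        coeff = alpha * m * torch.einsum('...kd,...d->...k', v_sel, x)
        delta = torch.einsum('...k,...ko->...o', coeff, u_sel)
        return self.base(x) + delta

# Usage example with assumed dimensions.
layer = DSCLinear(d_in=256, d_out=256)
y = layer(torch.randn(8, 16, 256))
print(y.shape)  # torch.Size([8, 16, 256])
```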