Hierarchical Mamba Meets Hyperbolic Geometry: A New Paradigm for Structured Language Embeddings
By: Sarang Patil, Ashish Parmanand Pandey, Ioannis Koutis, and more
Potential Business Impact:
Helps computers understand language's hidden layers.
Selective state-space models have achieved great success in long-sequence modeling. However, their capacity for language representation, especially in complex hierarchical reasoning tasks, remains underexplored. Most large language models rely on flat Euclidean embeddings, limiting their ability to capture latent hierarchies. To address this limitation, we propose Hierarchical Mamba (HiM), which integrates the efficient Mamba2 architecture with the exponential growth and curved nature of hyperbolic geometry to learn hierarchy-aware language embeddings for deeper linguistic understanding. Mamba2-processed sequences are projected to the Poincaré ball (via a tangent-based mapping) or the Lorentzian manifold (via a hyperbolic cosine- and sine-based mapping) with learnable curvature, and optimized with a combined hyperbolic loss. HiM captures relational distances across hierarchical levels, enabling effective long-range reasoning, which makes it well suited for tasks such as mixed-hop prediction and multi-hop inference in hierarchical classification. We evaluated HiM on four linguistic and medical datasets for mixed-hop prediction and multi-hop inference. Experimental results demonstrate that: 1) both HiM variants effectively capture hierarchical relationships across the four ontological datasets, surpassing Euclidean baselines; and 2) HiM-Poincaré captures fine-grained semantic distinctions with higher h-norms, while HiM-Lorentz yields more stable, compact, hierarchy-preserving embeddings, favoring robustness over detail.
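To make the two projections concrete, here is a minimal NumPy sketch of the standard exponential maps at the origin for the Poincaré ball and the Lorentz (hyperboloid) model with curvature `-c`. This is an illustration of the general mappings the abstract names, not the paper's implementation; the function names, the fixed curvature value, and the choice of the origin as base point are assumptions for the example.

```python
import numpy as np

def exp_map_poincare(v, c=1.0):
    # Tangent vector v at the origin -> point in the Poincare ball of
    # curvature -c (tanh-based mapping; result has norm < 1/sqrt(c)).
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return np.zeros_like(v)
    sc = np.sqrt(c)
    return np.tanh(sc * norm) * v / (sc * norm)

def exp_map_lorentz(v, c=1.0):
    # Tangent vector v -> point on the hyperboloid {x : <x, x>_L = -1/c},
    # where <x, y>_L = -x0*y0 + sum_i xi*yi, via the cosh/sinh lift.
    norm = np.linalg.norm(v)
    sc = np.sqrt(c)
    if norm < 1e-12:
        return np.concatenate(([1.0 / sc], np.zeros_like(v)))
    x0 = np.cosh(sc * norm) / sc
    xs = np.sinh(sc * norm) * v / (sc * norm)
    return np.concatenate(([x0], xs))
```

In the paper the curvature is learnable, so `c` would be a trainable parameter rather than a constant; the sanity checks are that Poincaré outputs stay strictly inside the unit ball (for c = 1) and Lorentz outputs satisfy the Minkowski constraint ⟨x, x⟩_L = −1/c.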
Similar Papers
HMamba: Hyperbolic Mamba for Sequential Recommendation
Information Retrieval
Helps websites show you what you'll like next.
DH-Mamba: Exploring Dual-domain Hierarchical State Space Models for MRI Reconstruction
Image and Video Processing
Makes blurry MRI scans clear faster.
HELM: Hyperbolic Large Language Models via Mixture-of-Curvature Experts
Machine Learning (CS)
Makes AI understand words and ideas better.