Attention on the Sphere

Published: May 16, 2025 | arXiv ID: 2505.11157v1

By: Boris Bonev, Max Rietmann, Andrea Paris, and more

BigTech Affiliations: NVIDIA

Potential Business Impact:

Enables Transformer models to work directly on spherical data, improving weather simulation, cosmological analysis, and robotic perception.

Business Areas:
Semantic Search, Internet Services

We introduce a generalized attention mechanism for spherical domains, enabling Transformer architectures to natively process data defined on the two-dimensional sphere, a critical need in fields such as atmospheric physics, cosmology, and robotics, where preserving spherical symmetries and topology is essential for physical accuracy. By integrating numerical quadrature weights into the attention mechanism, we obtain a geometrically faithful spherical attention that is approximately rotationally equivariant, providing strong inductive biases and leading to better performance than Cartesian approaches. To further enhance both scalability and model performance, we propose neighborhood attention on the sphere, which confines interactions to geodesic neighborhoods. This approach reduces computational complexity and introduces an additional inductive bias toward locality, while retaining the symmetry properties of our method. We provide optimized CUDA kernels and memory-efficient implementations to ensure practical applicability. The method is validated on three diverse tasks: simulating the shallow water equations on the rotating sphere, spherical image segmentation, and spherical depth estimation. Across all tasks, our spherical Transformers consistently outperform their planar counterparts, highlighting the advantage of geometric priors for learning on spherical domains.
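To make the core idea concrete, here is a minimal PyTorch sketch of quadrature-weighted attention. It is not the authors' optimized CUDA implementation; the cell-centered equiangular grid, the sin(θ)-based weights, and the function names are assumptions made purely for illustration. The key step is folding the quadrature weights w_j into the softmax, so attention approximates an integral over the sphere's surface measure rather than a plain sum over grid points.

```python
import torch

def equiangular_grid_weights(nlat: int, nlon: int) -> torch.Tensor:
    """Quadrature weights w ~ sin(theta) * dtheta * dphi on a cell-centered
    equiangular grid, flattened to shape (nlat * nlon,). A crude stand-in
    for a proper quadrature rule (assumption for this sketch)."""
    theta = (torch.arange(nlat) + 0.5) * torch.pi / nlat      # colatitudes
    w_lat = torch.sin(theta) * (torch.pi / nlat) * (2 * torch.pi / nlon)
    return w_lat.repeat_interleave(nlon)                      # latitude-major

def spherical_attention(q, k, v, quad_w):
    """Quadrature-weighted attention: alpha_ij ∝ w_j exp(q_i · k_j / sqrt(d)),
    a discretization of continuous attention over the spherical measure.

    q, k, v : (npoints, dim) tensors, one row per grid point
    quad_w  : (npoints,) positive quadrature weights
    """
    scale = q.shape[-1] ** -0.5
    logits = (q @ k.transpose(-2, -1)) * scale                # (n, n) scores
    logits = logits + quad_w.log()                            # fold in w_j
    return torch.softmax(logits, dim=-1) @ v

# Toy usage on a 16 x 32 grid with 8-dimensional features.
nlat, nlon, dim = 16, 32, 8
n = nlat * nlon
w = equiangular_grid_weights(nlat, nlon)
q, k, v = (torch.randn(n, dim) for _ in range(3))
out = spherical_attention(q, k, v, w)                         # (n, dim)
```

The neighborhood variant described in the abstract would additionally mask logits outside a geodesic neighborhood of each query point before the softmax, trading global interactions for locality and lower computational cost.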

Country of Origin
🇺🇸 United States

Page Count
18 pages

Category
Computer Science:
Machine Learning (CS)