Score: 1

Towards Test-time Efficient Visual Place Recognition via Asymmetric Query Processing

Published: December 15, 2025 | arXiv ID: 2512.13055v1

By: Jaeyoon Kim, Yoonki Cho, Sung-Eui Yoon

Potential Business Impact:

Lets phones find places using less power.

Business Areas:
Visual Search, Internet Services

Visual Place Recognition (VPR) has advanced significantly with high-capacity foundation models such as DINOv2, achieving remarkable performance. Nonetheless, their substantial computational cost makes deployment on resource-constrained devices impractical. In this paper, we introduce an efficient asymmetric VPR framework that pairs a high-capacity gallery model for offline feature extraction with a lightweight query network for online processing. A key challenge in this setting is ensuring compatibility between these heterogeneous networks, which conventional approaches address through computationally expensive k-NN-based compatible training. To overcome this, we propose a geographical memory bank that structures gallery features using the geolocation metadata inherent in VPR databases, eliminating the need for exhaustive k-NN computations. Additionally, we introduce an implicit embedding augmentation technique that enables the query network to model feature variations despite its limited capacity. Extensive experiments demonstrate that our method not only significantly reduces computational cost but also outperforms existing asymmetric retrieval techniques, establishing a new direction for VPR in resource-limited environments. The code is available at https://github.com/jaeyoon1603/AsymVPR
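To make the core idea concrete, here is a minimal sketch of how a geolocation-keyed memory bank could replace k-NN mining when training a lightweight query network against frozen gallery features. This is not the paper's implementation: the class and function names, the `cell_size` bucketing of UTM coordinates, and the contrastive compatibility loss are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

class GeoMemoryBank:
    """Illustrative geolocation-keyed bank of frozen gallery features."""

    def __init__(self, gallery_feats, gallery_utm, cell_size=25.0):
        # gallery_feats: (N, D) features from the frozen high-capacity
        # gallery model; gallery_utm: (N, 2) UTM coordinates taken from the
        # VPR database metadata. cell_size (meters) is an assumed value.
        self.feats = F.normalize(gallery_feats, dim=1)
        self.cell_size = cell_size
        self.cells = (gallery_utm / cell_size).floor().long()  # (N, 2)

    def positives_for(self, query_utm):
        # Indices of gallery images whose geographic cell matches the
        # query's cell; geolocation replaces exhaustive k-NN search.
        q_cell = (query_utm / self.cell_size).floor().long()   # (2,)
        mask = (self.cells == q_cell).all(dim=1)
        return mask.nonzero(as_tuple=True)[0]


def compatibility_loss(query_net, images, query_utm, bank, tau=0.07):
    """Contrastive loss pulling lightweight query embeddings toward gallery
    features of the same place and away from features of other places."""
    q = F.normalize(query_net(images), dim=1)        # (B, D)
    logits = q @ bank.feats.T / tau                  # (B, N)
    losses = []
    for i in range(q.size(0)):
        pos = bank.positives_for(query_utm[i])
        if pos.numel() == 0:
            continue  # no gallery image shares this query's cell
        log_prob = logits[i].log_softmax(dim=0)
        losses.append(-log_prob[pos].mean())
    if not losses:
        return q.new_zeros(())
    return torch.stack(losses).mean()
```

In this sketch, compatibility is enforced by keeping the gallery features fixed and optimizing only the small query network, so query-time cost stays low while retrieval remains consistent with the offline gallery index.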

Repos / Data Links
https://github.com/jaeyoon1603/AsymVPR

Page Count
9 pages

Category
Computer Science:
Computer Vision and Pattern Recognition