Quantization for Vector Search under Streaming Updates
By: Ishaq Aden-Ali, Hakan Ferhatosmanoglu, Alexander Greaves-Tunnell, and more
Large-scale vector databases for approximate nearest neighbor (ANN) search typically store a quantized dataset in main memory for fast access, and full precision data on remote disk. State-of-the-art ANN quantization methods are highly data-dependent, rendering them unable to handle point insertions and deletions. This either leads to degraded search quality over time, or forces costly global rebuilds of the entire search index. In this paper, we formally study data-dependent quantization under streaming dataset updates. We formulate a computation model of limited remote disk access and define a dynamic consistency property that guarantees freshness under updates. We use it to obtain the following results: Theoretically, we prove that static data-dependent quantization can be made dynamic with bounded disk I/O per update while retaining formal accuracy guarantees for ANN search. Algorithmically, we develop a practical data-dependent quantization method which is provably dynamically consistent, adapting itself to the dataset as it evolves over time. Our experiments show that the method outperforms baselines in large-scale nearest neighbor search quantization under streaming updates.
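The core idea in the abstract, keeping a data-dependent quantizer fresh under point insertions and deletions while reading only a bounded number of full-precision vectors per update, can be made concrete with a toy sketch. The code below is not the paper's algorithm: it uses a simple per-dimension min/max scalar quantizer and an assumed per-update re-encoding budget (`BATCH`), and the class and names are illustrative only.

```python
# Toy sketch (not the paper's method): a data-dependent scalar quantizer whose
# codebook depends on running per-dimension min/max statistics. Each update
# refreshes the statistics and re-encodes at most BATCH stored points, standing
# in for a bounded budget of full-precision ("remote disk") reads per update.

import numpy as np

BITS = 8    # bits per dimension in the quantized codes
BATCH = 4   # assumed cap on full-precision reads per update

class StreamingScalarQuantizer:
    def __init__(self, dim):
        self.dim = dim
        self.full = {}    # id -> full-precision vector (stand-in for remote disk)
        self.codes = {}   # id -> uint8 codes (what would live in main memory)
        self.lo = np.full(dim, np.inf)
        self.hi = np.full(dim, -np.inf)
        self.stale = []   # ids whose codes were built under old statistics

    def _encode(self, x):
        span = np.maximum(self.hi - self.lo, 1e-12)
        q = np.round((x - self.lo) / span * (2**BITS - 1))
        return np.clip(q, 0, 2**BITS - 1).astype(np.uint8)

    def insert(self, pid, x):
        self.full[pid] = x
        new_lo, new_hi = np.minimum(self.lo, x), np.maximum(self.hi, x)
        if not (np.allclose(new_lo, self.lo) and np.allclose(new_hi, self.hi)):
            # Statistics changed: previously stored codes are now stale.
            self.lo, self.hi = new_lo, new_hi
            self.stale = list(self.codes.keys())
        self.codes[pid] = self._encode(x)
        self._refresh()

    def delete(self, pid):
        # A real data-dependent quantizer would also shrink its statistics here.
        self.full.pop(pid, None)
        self.codes.pop(pid, None)
        self._refresh()

    def _refresh(self):
        # Re-encode at most BATCH stale points per update (bounded I/O budget).
        for _ in range(min(BATCH, len(self.stale))):
            pid = self.stale.pop()
            if pid in self.full:
                self.codes[pid] = self._encode(self.full[pid])
```

In this sketch, stale codes are repaired lazily across subsequent updates rather than in one global rebuild, which is the flavor of freshness-with-bounded-work the abstract refers to; the paper's dynamic consistency guarantee is a formal version of this idea, not this heuristic.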
Similar Papers
Approximate Nearest Neighbor Search of Large Scale Vectors on Distributed Storage
Databases
Finds similar items in huge online lists faster.
B+ANN: A Fast Billion-Scale Disk-based Nearest-Neighbor Index
Databases
Finds information faster using smarter computer memory.
SAQ: Pushing the Limits of Vector Quantization through Code Adjustment and Dimension Segmentation
Databases
Finds similar things much faster and more accurately.