The Case for Instance-Optimized LLMs in OLAP Databases

Published: July 7, 2025 | arXiv ID: 2507.04967v1

By: Bardia Mohammadi, Laurent Bindschaedler

Potential Business Impact:

Makes AI-enhanced database queries faster and cheaper to run at scale.

Plain English Summary

Imagine getting super-smart insights from your company's data, like understanding customer trends or finding problems, without needing a team of data scientists. This new system makes that possible by using AI to quickly sort, clean, and explain massive amounts of information, making it much cheaper and faster. This means businesses can get valuable answers from their data much more easily and affordably, leading to better decisions and new discoveries.

Large Language Models (LLMs) can enhance analytics systems with powerful data summarization, cleaning, and semantic transformation capabilities. However, deploying LLMs at scale (processing millions to billions of rows) remains prohibitively expensive in computation and memory. We present IOLM-DB, a novel system that makes LLM-enhanced database queries practical through query-specific model optimization. Instead of using general-purpose LLMs, IOLM-DB generates lightweight, specialized models tailored to each query's specific needs using representative data samples. IOLM-DB reduces model footprints by up to 76% and increases throughput by up to 3.31× while maintaining accuracy through aggressive compression techniques, including quantization, sparsification, and structural pruning. We further show how our approach enables higher parallelism on existing hardware and seamlessly supports caching and batching strategies to reduce overheads. Our prototype demonstrates that leveraging LLM queries inside analytics systems is feasible at scale, opening new possibilities for future OLAP applications.
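The paper itself does not publish IOLM-DB's compression pipeline in this summary, but the three techniques the abstract names can be illustrated on a single weight matrix. The sketch below is a minimal, hypothetical NumPy approximation: the matrix size, the 60% sparsity target, and the 25% row-pruning ratio are illustrative assumptions, not numbers from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for one weight matrix of a general-purpose model (float32).
W = rng.normal(size=(64, 64)).astype(np.float32)

# Sparsification: zero out the smallest-magnitude weights (60% here, an
# illustrative target, not a figure from the paper).
threshold = np.quantile(np.abs(W), 0.60)
W_sparse = np.where(np.abs(W) >= threshold, W, np.float32(0.0))

# Quantization: symmetric round-to-nearest int8 with a single scale factor.
scale = np.abs(W_sparse).max() / 127.0
W_q = np.clip(np.round(W_sparse / scale), -127, 127).astype(np.int8)

# Structural pruning: drop whole output rows with the smallest L2 norms
# (25% here), shrinking the matrix shape rather than just zeroing entries.
norms = np.linalg.norm(W_sparse, axis=1)
keep = norms >= np.quantile(norms, 0.25)
W_pruned = W_q[keep]

sparsity = float((W_sparse == 0).mean())          # fraction of zeroed weights
footprint = W_pruned.nbytes / W.nbytes            # int8 + fewer rows vs float32
```

In this toy setting the combined footprint ratio comes mostly from the int8 storage (4× smaller than float32) multiplied by the fraction of rows kept; in a real system the query-specific calibration on representative data samples would decide how aggressive each step can be while preserving accuracy.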

Repos / Data Links

Page Count
5 pages

Category
Computer Science:
Databases