You Had One Job: Per-Task Quantization Using LLMs' Hidden Representations
By: Amit LeVi, Raz Lapid, Rom Himelstein, and more
Potential Business Impact:
Makes big AI models smaller and faster to run.
Large Language Models (LLMs) excel across diverse tasks, yet many applications require only limited capabilities, making large variants inefficient in memory and latency. Existing approaches often combine distillation and quantization, but most post-training quantization (PTQ) methods are task-agnostic, ignoring how task-specific signals are distributed across layers. In this work, we propose using the hidden representations that encode task-salient signals to guide quantization. To that end, we compare two new task-aware PTQ methods: Task-Aware Quantization (TAQ), which allocates bitwidths using task-conditioned statistics from hidden activations, and TAQO, which allocates precision based on direct layer-sensitivity tests. From a small calibration set, both approaches identify task-relevant layers, preserving their precision while aggressively quantizing the rest. This yields stable task-sensitivity profiles and efficient task-specialized models. Across models, TAQ and TAQO outperform the baselines; TAQ leads on Phi-4, while TAQO leads on Llama-3.1, Qwen3, and Qwen2.5. For instance, on Phi-4, TAQ achieves 42.33 EM / 50.81 F1, far surpassing Activation-aware Weight Quantization (AWQ) (2.25 / 7.07), while remaining within 1.0% of the original accuracy at lower average precision.
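To make the allocation idea concrete, below is a minimal sketch (not the authors' code) of task-aware bitwidth assignment from a small calibration set. It assumes per-layer hidden activations have already been collected, and it uses the variance of pooled, task-conditioned activations as a stand-in salience statistic; the paper's actual statistic and the `layer_salience` / `allocate_bits` helpers are illustrative assumptions.

```python
# Minimal sketch of task-aware bitwidth allocation (illustrative, not the paper's implementation).
# Assumption: hidden_states[i] holds layer i's activations on a small task-specific
# calibration set; mean feature variance across samples stands in for the task-salience score.
import torch


def layer_salience(hidden_states: list[torch.Tensor]) -> torch.Tensor:
    """hidden_states[i]: (num_calib_samples, seq_len, d_model) activations of layer i.
    Returns one scalar salience score per layer (here: mean feature variance)."""
    scores = []
    for h in hidden_states:
        # Pool over tokens, then measure how much the representation varies
        # across calibration examples -- a proxy for task-specific signal.
        pooled = h.mean(dim=1)                   # (num_samples, d_model)
        scores.append(pooled.var(dim=0).mean())  # scalar per layer
    return torch.stack(scores)


def allocate_bits(scores: torch.Tensor, keep_frac: float = 0.25,
                  high_bits: int = 8, low_bits: int = 3) -> list[int]:
    """Keep the top `keep_frac` most task-salient layers at `high_bits`
    and aggressively quantize the rest to `low_bits`."""
    k = max(1, int(round(keep_frac * len(scores))))
    top = set(torch.topk(scores, k).indices.tolist())
    return [high_bits if i in top else low_bits for i in range(len(scores))]


if __name__ == "__main__":
    # Synthetic stand-in for hidden states of a 12-layer model on 16 calibration samples.
    fake_hidden = [torch.randn(16, 32, 64) * (1.0 + 0.2 * i) for i in range(12)]
    bits = allocate_bits(layer_salience(fake_hidden))
    print(bits)  # later (higher-variance) layers are kept at higher precision
```

The per-layer bit assignments would then be fed to whatever quantizer is in use; the paper's TAQO variant instead scores layers by directly measuring task accuracy degradation when each layer is quantized.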
Similar Papers
Task-Circuit Quantization: Leveraging Knowledge Localization and Interpretability for Compression
Machine Learning (CS)
Keeps AI smart while using less computer memory.
Scaling Laws for Task-Stratified Knowledge in Post-Training Quantized Large Language Models
Computation and Language
Makes big AI models smaller without losing smarts.