MoPEQ: Mixture of Mixed Precision Quantized Experts
By: Krishna Teja Chitty-Venkata, Jie Ye, Murali Emani
Potential Business Impact:
Makes big AI models smaller, faster, and cheaper.
Large language and vision models built on a Mixture-of-Experts (MoE) architecture pose significant deployment challenges due to their computational and memory demands. Mixed-precision quantization assigns different precisions to different layers of an LLM/VLM based on each layer's sensitivity and importance within the model. In this work, we propose MoPEQ, a post-training quantization (PTQ) algorithm that assigns an optimal bit width to each expert. Our method balances accuracy and model size by analyzing each expert's sensitivity via Hessian trace approximation rather than relying on the expert's activation frequency. This per-expert granularity clusters similar experts together to maintain model performance while reducing memory requirements. Experimental results on VLMEvalKit benchmark datasets with state-of-the-art VLMs (DeepSeek-VL2-tiny, -small, and -base, and MolmoE) demonstrate that our mixed-precision quantized MoEs achieve competitive accuracy with substantial reductions in memory footprint compared to uniform-precision baselines. We also conduct a comprehensive study of expert activation frequency and Hessian-trace sensitivity under both layer-wise and model-wide expert precision allocations of 2, 3, and 4 bits, providing a thorough understanding of mixed-precision quantization for VLM-MoEs.
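To make the core idea concrete, here is a minimal sketch of the two ingredients the abstract describes: estimating each expert's sensitivity with a Hessian trace approximation (Hutchinson's estimator, a standard choice for this) and then grouping experts of similar sensitivity to assign 2-, 3-, or 4-bit precisions. The rank-based binning used for "clustering" and the toy diagonal Hessians are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def hutchinson_trace(hvp, dim, n_samples=64, rng=None):
    """Estimate trace(H) via Hutchinson's method: E[v^T H v] with
    Rademacher-distributed v. `hvp` is a Hessian-vector-product callable
    (assumed available for each expert's loss)."""
    rng = rng or np.random.default_rng(0)
    est = 0.0
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=dim)
        est += v @ hvp(v)
    return est / n_samples

def assign_bits(traces, bit_choices=(2, 3, 4)):
    """Group experts by sensitivity and map groups to bit widths, so the
    most sensitive experts keep the highest precision. Rank-based binning
    is an illustrative stand-in for the paper's clustering step."""
    traces = np.asarray(traces, dtype=float)
    order = np.argsort(traces)                     # ascending sensitivity
    bits = np.empty(len(traces), dtype=int)
    for group, b in zip(np.array_split(order, len(bit_choices)),
                        sorted(bit_choices)):
        bits[group] = b                            # low trace -> low bit width
    return bits

# Toy demo: 6 experts whose Hessians are diagonal with known traces.
diags = [np.full(8, t) for t in (0.1, 0.2, 1.0, 1.1, 5.0, 6.0)]
traces = [hutchinson_trace(lambda v, d=d: d * v, dim=8) for d in diags]
bits = assign_bits(traces)
print(bits)  # -> [2 2 3 3 4 4]: most sensitive experts get 4 bits
```

For diagonal Hessians the Rademacher estimate is exact (since each v_i^2 = 1), which keeps the demo deterministic; in practice the Hessian-vector products would come from backpropagation through each expert on calibration data.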
Similar Papers
MxMoE: Mixed-precision Quantization for MoE with Accuracy and Performance Co-Design
Machine Learning (CS)
Makes smart computer brains run much faster.
MoEQuant: Enhancing Quantization for Mixture-of-Experts Large Language Models via Expert-Balanced Sampling and Affinity Guidance
Machine Learning (CS)
Makes smart computer brains smaller and faster.
MoQE: Improve Quantization Model performance via Mixture of Quantization Experts
Machine Learning (CS)
Helps AI work better on small devices.