Distilled-3DGS: Distilled 3D Gaussian Splatting

Published: August 19, 2025 | arXiv ID: 2508.14037v1

By: Lintao Xiang, Xinkai Chen, Jianhuang Lai, and more

Potential Business Impact:

Reduces the memory and storage needed to render high-quality 3D scenes.

Business Areas:
Image Recognition, Data and Analytics, Software

3D Gaussian Splatting (3DGS) has exhibited remarkable efficacy in novel view synthesis (NVS). However, it suffers from a significant drawback: achieving high-fidelity rendering typically necessitates a large number of 3D Gaussians, resulting in substantial memory consumption and storage requirements. To address this challenge, we propose the first knowledge distillation framework for 3DGS, featuring various teacher models, including vanilla 3DGS, noise-augmented variants, and dropout-regularized versions. The outputs of these teachers are aggregated to guide the optimization of a lightweight student model. To distill the hidden geometric structure, we propose a structural similarity loss to boost the consistency of spatial geometric distributions between the student and teacher models. Through comprehensive quantitative and qualitative evaluations across diverse datasets, the proposed Distilled-3DGS, a simple yet effective framework without bells and whistles, achieves promising results in both rendering quality and storage efficiency compared to state-of-the-art methods. Project page: https://distilled3dgs.github.io. Code: https://github.com/lt-xiang/Distilled-3DGS.
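The abstract's recipe — aggregate the renderings of several teacher models into a supervision target for a lightweight student, plus a structural loss aligning the spatial distributions of Gaussian centers — can be sketched as below. This is an illustrative guess at the loss forms (the function names, the mean-aggregation, and the Chamfer-style proxy for the structural similarity loss are assumptions, not the paper's actual implementation):

```python
import numpy as np

def distillation_loss(student_img, teacher_imgs, w_photo=1.0):
    """Photometric distillation term (hypothetical form): match the
    student's rendering to the mean rendering of the teacher ensemble."""
    target = np.mean(teacher_imgs, axis=0)  # aggregate teacher outputs
    return w_photo * np.mean(np.abs(student_img - target))

def structural_similarity_loss(student_pts, teacher_pts, k=3):
    """Hypothetical structural loss: encourage the student's Gaussian
    centers to follow the teacher's spatial distribution by comparing
    mean k-nearest-neighbor distances in both directions (a
    Chamfer-distance-style proxy, not the paper's exact loss)."""
    def knn_mean(a, b, k):
        # for each point in a, mean distance to its k nearest points in b
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        d.sort(axis=1)
        return d[:, :k].mean()
    return knn_mean(student_pts, teacher_pts, k) + knn_mean(teacher_pts, student_pts, k)
```

In practice the images would come from differentiable 3DGS rasterization of each model's Gaussians, and both terms would be summed into one objective for the student's optimization.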

Page Count
12 pages

Category
Computer Science:
Computer Vision and Pattern Recognition