Score: 1

Detail Across Scales: Multi-Scale Enhancement for Full Spectrum Neural Representations

Published: September 19, 2025 | arXiv ID: 2509.15494v1

By: Yuan Ni, Zhantao Chen, Cheng Peng, and more

BigTech Affiliations: Stanford University

Potential Business Impact:

Stores detailed scientific images and data using less storage space.

Business Areas:
Visual Search, Internet Services

Implicit neural representations (INRs) have emerged as a compact and parametric alternative to discrete array-based data representations, encoding information directly in neural network weights to enable resolution-independent representation and memory efficiency. However, existing INR approaches, when constrained to compact network sizes, struggle to faithfully represent the multi-scale structures, high-frequency information, and fine textures that characterize the majority of scientific datasets. To address this limitation, we propose WIEN-INR, a wavelet-informed implicit neural representation that distributes modeling across different resolution scales and employs a specialized kernel network at the finest scale to recover subtle details. This multi-scale architecture allows smaller networks to retain the full spectrum of information while preserving training efficiency and reducing storage cost. Through extensive experiments on diverse scientific datasets spanning different scales and structural complexities, WIEN-INR achieves superior reconstruction fidelity while maintaining a compact model size. These results demonstrate WIEN-INR as a practical neural representation framework for high-fidelity scientific data encoding, extending the applicability of INRs to domains where efficient preservation of fine detail is essential.
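
To make the multi-scale idea concrete, here is a minimal PyTorch sketch of an INR that splits modeling across resolution scales: one small MLP per scale, with an extra small "detail" network for the finest scale. This is not the paper's WIEN-INR implementation; the wavelet decomposition and the specialized kernel network are only approximated here by frequency-banded Fourier features and an added detail MLP, and all module names (MultiScaleINR, ScaleMLP) and hyperparameters are illustrative assumptions.

```python
# Minimal sketch of a multi-scale implicit neural representation (INR).
# NOT the WIEN-INR architecture from the paper; it only illustrates the idea of
# distributing modeling across resolution scales, with a dedicated small
# network for fine detail. Names and hyperparameters are assumptions.

import torch
import torch.nn as nn


def fourier_features(coords: torch.Tensor, num_bands: int, max_freq: float) -> torch.Tensor:
    """Encode coordinates with sin/cos features up to max_freq (assumed stand-in for wavelets)."""
    freqs = torch.linspace(1.0, max_freq, num_bands, device=coords.device)
    angles = coords.unsqueeze(-1) * freqs * torch.pi          # (N, d, B)
    feats = torch.cat([angles.sin(), angles.cos()], dim=-1)   # (N, d, 2B)
    return feats.flatten(start_dim=1)                         # (N, d * 2B)


class ScaleMLP(nn.Module):
    """Small MLP responsible for one resolution scale."""
    def __init__(self, in_dim: int, hidden: int = 64, out_dim: int = 1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class MultiScaleINR(nn.Module):
    """Sum of per-scale MLPs; the finest scale gets an extra detail network."""
    def __init__(self, coord_dim: int = 2, num_scales: int = 3, num_bands: int = 8):
        super().__init__()
        self.num_bands = num_bands
        # Coarse-to-fine: each scale sees encodings up to a larger max frequency.
        self.max_freqs = [2.0 ** (s + 2) for s in range(num_scales)]
        in_dim = coord_dim * num_bands * 2
        self.scale_nets = nn.ModuleList(ScaleMLP(in_dim) for _ in range(num_scales))
        # Extra small network operating at the finest frequency band.
        self.detail_net = ScaleMLP(in_dim, hidden=32)

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        out = 0.0
        for net, max_freq in zip(self.scale_nets, self.max_freqs):
            out = out + net(fourier_features(coords, self.num_bands, max_freq))
        out = out + self.detail_net(fourier_features(coords, self.num_bands, self.max_freqs[-1]))
        return out


if __name__ == "__main__":
    # Fit a toy 2D signal to show that the model trains end to end.
    model = MultiScaleINR()
    coords = torch.rand(4096, 2) * 2 - 1                       # coordinates in [-1, 1]^2
    target = torch.sin(8 * torch.pi * coords[:, :1]) * torch.cos(3 * torch.pi * coords[:, 1:])
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(200):
        loss = nn.functional.mse_loss(model(coords), target)
        opt.zero_grad(); loss.backward(); opt.step()
    print(f"final MSE: {loss.item():.4f}")
```

Because each scale is handled by its own compact MLP, the total parameter count stays small, which mirrors the abstract's claim that the multi-scale split lets smaller networks cover the full frequency spectrum.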

Country of Origin
🇺🇸 United States

Page Count
11 pages

Category
Computer Science:
Machine Learning (CS)