Multidimensional Distributional Neural Network Output Demonstrated in Super-Resolution of Surface Wind Speed

Published: August 21, 2025 | arXiv ID: 2508.16686v1

By: Harrison J. Goldwyn, Mitchell Krock, Johann Rudi, and more

Potential Business Impact:

Helps neural networks make predictions that come with a clear estimate of how confident they are.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Accurate quantification of uncertainty in neural network predictions remains a central challenge for scientific applications involving high-dimensional, correlated data. While existing methods capture either aleatoric or epistemic uncertainty, few offer closed-form, multidimensional distributions that preserve spatial correlation while remaining computationally tractable. In this work, we present a framework for training neural networks with a multidimensional Gaussian loss, generating closed-form predictive distributions over outputs with non-identically distributed and heteroscedastic structure. Our approach captures aleatoric uncertainty by iteratively estimating the means and covariance matrices, and is demonstrated on a super-resolution example. We leverage a Fourier representation of the covariance matrix to stabilize network training and preserve spatial correlation. We introduce a novel regularization strategy -- referred to as information sharing -- that interpolates between image-specific and global covariance estimates, enabling convergence of the super-resolution downscaling network trained on image-specific distributional loss functions. This framework allows for efficient sampling, explicit correlation modeling, and extensions to more complex distribution families, all without disrupting prediction performance. We demonstrate the method on a surface wind speed downscaling task and discuss its broader applicability to uncertainty-aware prediction in scientific models.
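
To make the abstract's two central ingredients concrete, the sketch below shows a multivariate Gaussian negative log-likelihood loss together with an information-sharing style interpolation between an image-specific and a global covariance estimate. This is a minimal illustration, not the authors' released code: the function names, the mixing weight `alpha`, and the toy shapes are assumptions introduced here for clarity.

```python
# Minimal sketch (illustrative only) of a multidimensional Gaussian loss
# with "information sharing" between per-image and global covariances.
import torch


def gaussian_nll_loss(y_true, mean, cov):
    """Negative log-likelihood of y_true under N(mean, cov).

    y_true, mean: (batch, d) flattened high-resolution fields
    cov:          (batch, d, d) covariance matrices
    """
    dist = torch.distributions.MultivariateNormal(loc=mean, covariance_matrix=cov)
    return -dist.log_prob(y_true).mean()


def information_shared_cov(per_image_cov, global_cov, alpha=0.5):
    """Interpolate between image-specific and global covariance estimates.

    alpha = 1 keeps only the per-image estimate; alpha = 0 keeps only the
    global one. Intermediate values regularize the per-image estimate
    toward the shared global structure.
    """
    return alpha * per_image_cov + (1.0 - alpha) * global_cov


if __name__ == "__main__":
    # Toy usage: 4 flattened patches of dimension 16 (shapes are assumptions).
    batch, d = 4, 16
    y = torch.randn(batch, d)
    mean = torch.randn(batch, d)
    eye = torch.eye(d)
    per_image_cov = eye.expand(batch, d, d) * 1.5   # image-specific estimate
    global_cov = eye.expand(batch, d, d)            # shared global estimate
    cov = information_shared_cov(per_image_cov, global_cov, alpha=0.3)
    loss = gaussian_nll_loss(y, mean, cov)
    print(loss.item())
```

In the paper's framework the covariance is additionally parameterized in a Fourier basis to stabilize training and preserve spatial correlation; that detail is omitted from this sketch for brevity.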

Page Count
20 pages

Category
Computer Science:
Machine Learning (CS)