Self-supervised Synthetic Pretraining for Inference of Stellar Mass Embedded in Dense Gas

Published: October 28, 2025 | arXiv ID: 2510.24159v1

By: Keiya Hirashima, Shingo Nozaki, Naoto Harada

Potential Business Impact:

Estimates the masses of young stars hidden inside dense gas clouds.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Stellar mass is a fundamental quantity that determines the properties and evolution of stars. However, estimating stellar masses in star-forming regions is challenging because young stars are obscured by dense gas and the regions are highly inhomogeneous, making spherical dynamical estimates unreliable. Supervised machine learning could link such complex structures to stellar mass, but it requires large, high-quality labeled datasets from high-resolution magneto-hydrodynamical (MHD) simulations, which are computationally expensive. We address this by pretraining a vision transformer on one million synthetic fractal images using the self-supervised framework DINOv2, and then applying the frozen model to a limited set of high-resolution MHD simulations. Our results demonstrate that synthetic pretraining improves stellar-mass predictions from frozen-feature regression, with the pretrained model performing slightly better than a supervised model trained on the same limited simulations. Principal component analysis of the extracted features further reveals semantically meaningful structures, suggesting that the model enables unsupervised segmentation of star-forming regions without the need for labeled data or fine-tuning.
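The pipeline in the abstract — synthesize fractal images, pass them through a frozen encoder, and fit a lightweight regression head on the resulting features — can be sketched in miniature with NumPy. This is a toy illustration, not the authors' code: fractal images are generated by power-law filtering of white noise (a common stand-in for fractal cloud structure; the paper's generator may differ), the frozen DINOv2 encoder is replaced by a PCA projection, and the regression head is closed-form ridge regression. All names and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fractal_image(n=32, beta=3.0, rng=rng):
    """Spectral synthesis of a fractal-like field: filter white noise
    so its power spectrum falls off as k^(-beta)."""
    noise = np.fft.fft2(rng.standard_normal((n, n)))
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k = np.sqrt(kx**2 + ky**2)
    k[0, 0] = 1.0  # avoid division by zero at the DC mode
    img = np.real(np.fft.ifft2(noise * k ** (-beta / 2.0)))
    return (img - img.mean()) / img.std()  # normalize each image

# Toy dataset: the "label" is the spectral slope beta, standing in for
# the stellar mass that the real model regresses from simulation maps.
betas = rng.uniform(2.0, 4.0, 200)
X = np.stack([fractal_image(beta=b, rng=rng).ravel() for b in betas])
y = betas

# "Frozen features": project onto the top 16 principal components.
# (The paper uses frozen DINOv2 ViT embeddings instead of PCA.)
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
feats = Xc @ Vt[:16].T            # shape (200, 16)

# Ridge-regression head on the frozen features (closed form).
lam = 1e-2
A = feats.T @ feats + lam * np.eye(16)
w = np.linalg.solve(A, feats.T @ (y - y.mean()))
pred = feats @ w + y.mean()       # predicted labels, shape (200,)
```

Because the encoder is frozen, only the small linear head is fit to the scarce labeled data — the design choice the abstract credits for matching or beating a fully supervised model trained on the same limited simulations.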

Country of Origin
🇯🇵 Japan

Page Count
6 pages

Category
Astrophysics:
Astrophysics of Galaxies