Self-supervised Synthetic Pretraining for Inference of Stellar Mass Embedded in Dense Gas
By: Keiya Hirashima, Shingo Nozaki, Naoto Harada
Potential Business Impact:
Finds star masses hidden in gas clouds.
Stellar mass is a fundamental quantity that determines the properties and evolution of stars. However, estimating stellar masses in star-forming regions is challenging because young stars are obscured by dense gas and the regions are highly inhomogeneous, making simple spherical dynamical estimates unreliable. Supervised machine learning could link such complex structures to stellar mass, but it requires large, high-quality labeled datasets from high-resolution magneto-hydrodynamical (MHD) simulations, which are computationally expensive. We address this by pretraining a vision transformer on one million synthetic fractal images using the self-supervised framework DINOv2, and then applying the frozen model to a limited set of high-resolution MHD simulations. Our results demonstrate that synthetic pretraining improves stellar mass predictions from frozen-feature regression, with the pretrained model performing slightly better than a supervised model trained on the same limited simulations. Principal component analysis of the extracted features further reveals semantically meaningful structures, suggesting that the model enables unsupervised segmentation of star-forming regions without the need for labeled data or fine-tuning.
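To make the pipeline concrete, here is a minimal sketch of the frozen-feature stage the abstract describes: extract features from a frozen DINOv2 vision transformer, fit a lightweight regressor for stellar mass, and run PCA over patch features. The public DINOv2 ViT-S/14 weights stand in for the authors' fractal-pretrained backbone, and the random input arrays, ridge regressor, and all variable names are illustrative assumptions, not the paper's actual data or code.

```python
# Hypothetical sketch of frozen-feature regression + PCA; the paper's own
# backbone is pretrained on synthetic fractals, not the hub weights used here.
import torch
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.decomposition import PCA

# Load a DINOv2 ViT-S/14 backbone via torch.hub and keep it frozen.
model = torch.hub.load("facebookresearch/dinov2", "dinov2_vits14")
model.eval()

# Placeholder batch: 8 projected density maps rendered as 224x224 RGB images.
images = torch.rand(8, 3, 224, 224)

with torch.no_grad():
    out = model.forward_features(images)
    cls_feats = out["x_norm_clstoken"]        # (8, 384) image-level features
    patch_feats = out["x_norm_patchtokens"]   # (8, 256, 384) patch features

# Frozen-feature regression: only this lightweight head ever sees the
# (placeholder) stellar mass labels; the backbone is never fine-tuned.
X = cls_feats.numpy()
y = np.random.rand(8)  # stand-in for simulated stellar masses
reg = Ridge(alpha=1.0).fit(X, y)
print("predicted masses:", reg.predict(X))

# PCA over patch features: inspecting the leading components per patch is
# the kind of projection that can reveal segmentation-like structure.
pca = PCA(n_components=3)
components = pca.fit_transform(patch_feats.reshape(-1, 384).numpy())
print("explained variance:", pca.explained_variance_ratio_)
```

In the paper itself, the backbone is first pretrained from scratch on one million synthetic fractal images with the DINOv2 objective; the hub weights above merely make this sketch runnable end to end.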
Similar Papers
From Simulations to Surveys: Domain Adaptation for Galaxy Observations
Astrophysics of Galaxies
Helps computers tell galaxy shapes from pictures.
SatDINO: A Deep Dive into Self-Supervised Pretraining for Remote Sensing
CV and Pattern Recognition
Helps satellites better understand Earth from space.
Diffusion for Fusion: Designing Stellarators with Generative AI
Machine Learning (CS)
Designs fusion power plants faster using AI.