Omni-Weather: Unified Multimodal Foundation Model for Weather Generation and Understanding
By: Zhiwang Zhou, Yuandong Pu, Xuming He, and more
Potential Business Impact:
One model that both predicts the weather and explains its predictions.
Weather modeling requires both accurate prediction and mechanistic interpretation, yet existing methods treat these goals in isolation, separating generation from understanding. To address this gap, we present Omni-Weather, the first multimodal foundation model to unify weather generation and understanding within a single architecture. Omni-Weather integrates a radar encoder for weather generation tasks, and the resulting radar tokens are processed jointly with text through a shared self-attention mechanism. We further construct a Chain-of-Thought dataset for causal reasoning in weather generation, enabling interpretable outputs and improved perceptual quality. Extensive experiments show that Omni-Weather achieves state-of-the-art performance in both weather generation and understanding. Our findings also indicate that generative and understanding tasks in the weather domain reinforce each other, demonstrating the feasibility and value of unifying them in a single model.
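To make the architectural idea concrete, here is a minimal sketch (not the authors' released code) of the unified design the abstract describes: a radar encoder turns a radar field into tokens, which are concatenated with text tokens and processed by one shared self-attention backbone. All module names, dimensions, and the patch-embedding choice for the radar encoder are illustrative assumptions.

```python
# Hypothetical sketch of a unified radar+text backbone with shared
# self-attention; sizes and module names are assumptions, not Omni-Weather's.
import torch
import torch.nn as nn

class RadarEncoder(nn.Module):
    """Tokenizes a radar field (B, 1, H, W) into patch embeddings."""
    def __init__(self, dim=512, patch=16):
        super().__init__()
        self.proj = nn.Conv2d(1, dim, kernel_size=patch, stride=patch)

    def forward(self, radar):
        x = self.proj(radar)                  # (B, dim, H/patch, W/patch)
        return x.flatten(2).transpose(1, 2)   # (B, N_radar, dim)

class UnifiedWeatherSketch(nn.Module):
    """Shared self-attention over concatenated radar and text tokens."""
    def __init__(self, vocab=32000, dim=512, depth=4, heads=8):
        super().__init__()
        self.radar_enc = RadarEncoder(dim)
        self.text_emb = nn.Embedding(vocab, dim)
        layer = nn.TransformerEncoderLayer(dim, heads, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, depth)

    def forward(self, radar, text_ids):
        # One token stream serves both generation and understanding tasks.
        tokens = torch.cat([self.radar_enc(radar),
                            self.text_emb(text_ids)], dim=1)
        return self.backbone(tokens)

model = UnifiedWeatherSketch()
out = model(torch.randn(2, 1, 128, 128), torch.randint(0, 32000, (2, 16)))
print(out.shape)  # torch.Size([2, 80, 512]): 64 radar tokens + 16 text tokens
```

The key design point the paper argues for is the shared backbone: because radar and text tokens attend to each other in the same layers, improvements learned for generation can inform understanding, and vice versa.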
Similar Papers
A Physics-guided Multimodal Transformer Path to Weather and Climate Sciences
Machine Learning (CS)
AI predicts weather better using different data.
WeatherPrompt: Multi-modality Representation Learning for All-Weather Drone Visual Geo-Localization
CV and Pattern Recognition
Helps drones see where they are in bad weather.