arXiv:2512.21643

Omni-Weather: Unified Multimodal Foundation Model for Weather Generation and Understanding

Published on Dec 25 · Submitted by YuandongPu on Dec 29

Abstract

AI-generated summary: Omni-Weather is a multimodal foundation model that integrates weather generation and understanding using a shared self-attention mechanism and a Chain-of-Thought dataset to enable interpretable, high-quality outputs.

Weather modeling requires both accurate prediction and mechanistic interpretation, yet existing methods treat these goals in isolation, separating generation from understanding. To address this gap, we present Omni-Weather, the first multimodal foundation model that unifies weather generation and understanding within a single architecture. Omni-Weather pairs a radar encoder for weather generation tasks with unified processing through a shared self-attention mechanism. We further construct a Chain-of-Thought dataset for causal reasoning in weather generation, enabling interpretable outputs and improved perceptual quality. Extensive experiments show that Omni-Weather achieves state-of-the-art performance in both weather generation and understanding, and that generative and understanding tasks in the weather domain mutually enhance each other, demonstrating the feasibility and value of unifying them.
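
The architecture described above suggests a simple pattern: encode radar fields into tokens, embed text, and let one self-attention stack process the joint sequence. Below is a minimal PyTorch sketch of that pattern; the module names, dimensions, and two-block depth are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of "radar encoder + shared self-attention" from the
# abstract. All names, sizes, and the fusion layout are hypothetical.
import torch
import torch.nn as nn

class UnifiedBlock(nn.Module):
    """One transformer block whose self-attention is shared across
    radar and text tokens (the 'shared self-attention' in the abstract)."""
    def __init__(self, dim: int = 256, heads: int = 4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(),
                                 nn.Linear(4 * dim, dim))

    def forward(self, x):
        h = self.norm1(x)
        x = x + self.attn(h, h, h, need_weights=False)[0]
        return x + self.mlp(self.norm2(x))

class OmniWeatherSketch(nn.Module):
    def __init__(self, vocab: int = 32000, dim: int = 256,
                 patch: int = 16, radar_channels: int = 1):
        super().__init__()
        # Hypothetical radar encoder: non-overlapping patches -> tokens.
        self.radar_patch = nn.Conv2d(radar_channels, dim, patch, stride=patch)
        self.text_embed = nn.Embedding(vocab, dim)
        self.blocks = nn.ModuleList([UnifiedBlock(dim) for _ in range(2)])

    def forward(self, radar, text_ids):
        # radar: (B, C, H, W); text_ids: (B, T)
        r = self.radar_patch(radar).flatten(2).transpose(1, 2)  # (B, N, dim)
        t = self.text_embed(text_ids)                           # (B, T, dim)
        x = torch.cat([r, t], dim=1)  # one sequence -> attention is shared
        for blk in self.blocks:
            x = blk(x)
        return x

model = OmniWeatherSketch()
out = model(torch.randn(2, 1, 64, 64), torch.randint(0, 32000, (2, 8)))
print(out.shape)  # torch.Size([2, 24, 256]): 16 radar + 8 text tokens
```

Concatenating both modalities into a single sequence is what lets generation and understanding share the same attention parameters, which, if this reading is right, is the property the abstract credits for the two task families reinforcing each other.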


Models citing this paper: 0
Datasets citing this paper: 0
Spaces citing this paper: 0
Collections including this paper: 1

To link a model, dataset, or Space to this paper, cite arxiv.org/abs/2512.21643 in its README.md.