Monday, 29 January 2024: 8:45 AM
345/346 (The Baltimore Convention Center)
Machine learning has quickly become a relevant method for weather forecasting across scales, from nowcasts to decadal climate-scale projections. Pure AI forecasting methods require fewer computational resources, run significantly faster, and are potentially more accurate than traditional numerical weather prediction. However, current AI forecasting approaches depend on initial conditions generated by traditional data assimilation (DA). DA ingests observations from hundreds of data sources to estimate the “true” state of the system and is the most computationally expensive component of numerical weather prediction. The DA process takes between 1.5 and 5 hours for regional and global systems and must be completed before AI forecasting can be applied. Operational DA systems at NOAA and ECMWF use 3D/4D-Var to ingest observations into a background state produced by a physics-based forecast, which is known to contribute significant uncertainty to both short- and long-term forecasts. We argue that spatio-temporal generative models can emulate both the DA and forecasting steps in near real-time while consuming fewer computational resources. We show that our approach can emulate NOAA’s High-Resolution Rapid Refresh (HRRR) from geostationary satellite imagery and a coarse-resolution analysis in ~2 minutes, representing nearly a 40x speedup in DA. We use these initial conditions to produce LENS-Cast, a novel short-term forecast of 20+ meteorological variables refreshed every 15 minutes, and present results against station-level observations. Zeus AI operates LENS-Cast in near real-time on a single GPU, and its output is actively ingested into downstream commercial applications. The efficiency demonstrated by emulating DA supports the development of hyper-local, city-scale weather forecasts.

