7B.4 “Big Data Assimilation” for 30-second-update 100-m-mesh Numerical Weather Prediction

Tuesday, 8 November 2016: 2:15 PM
Pavilion Ballroom West (Hilton Portland)
Takemasa Miyoshi, RIKEN, Kobe, Japan; and G. Y. Lien, M. Kunii, J. J. Ruiz, Y. Maejima, S. Otsuka, K. Kondo, H. Seko, S. Satoh, T. Ushio, K. Bessho, H. Tomita, S. Nishizawa, T. Yamaura, and Y. Ishikawa

The typical lifetime of a single cumulonimbus is less than an hour, and radar observations often show substantial changes within only five minutes. For precise prediction of such rapidly changing local severe storms, we have developed what we call a “Big Data Assimilation” (BDA) system that performs 30-second-update data assimilation cycles at 100-m grid spacing. The concept is similar to that of NOAA’s Warn-on-Forecast (WoF), in which rapidly updated, high-resolution NWP plays a central role in issuing severe-storm warnings even only minutes in advance. The 100-m resolution and 30-second update frequency are a leap beyond typical recent research settings, made possible by the fortunate combination of Japan’s most advanced supercomputing and sensing technologies: the 10-petaflops K computer and the Phased Array Weather Radar (PAWR). The X-band PAWR performs a dense three-dimensional volume scan at 100-m range resolution, with 100 elevation angles and 300 azimuth angles out to a 60-km range, every 30 seconds. The PAWR data show a temporally smooth evolution of convective rainstorms, which gives us hope that a Gaussian error distribution can be assumed for 30-second forecasts, before strongly nonlinear dynamics distort the error distribution of rapidly changing convective storms. With this in mind, we apply the Local Ensemble Transform Kalman Filter (LETKF), which explicitly accounts for flow-dependent error covariance under the Gaussian-error assumption (see the equations sketched below). Flow dependence is expected to be particularly important in rapidly changing convective weather. Using a 100-member ensemble at 100-m resolution, we have tested the Big Data Assimilation system in real-world cases of sudden local rainstorms and obtained promising results. However, real-time application remains a major challenge: a single cycle currently takes 10 minutes. We are exploring approaches to accelerating the computation, such as using single-precision arrays in the model computation (illustrated in the sketch below) and developing efficient I/O middleware to pass the large volumes of data between the model and the data assimilation as quickly as possible. In this presentation, we will report the most up-to-date progress of our Big Data Assimilation research.
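
For reference, the abstract does not spell out the analysis equations; the following is the standard LETKF formulation of Hunt et al. (2007), which the assimilation step described above follows in essence (details of the actual implementation may differ). With m ensemble members (m = 100 here), background perturbation matrices X^b = [x^b_1 - \bar{x}^b, ..., x^b_m - \bar{x}^b] and Y^b = [H(x^b_1) - \bar{y}^b, ..., H(x^b_m) - \bar{y}^b], observation operator H, observation-error covariance R, local observations y^o, and multiplicative inflation \rho, each local analysis is

\tilde{P}^a = [ (m-1) I / \rho + (Y^b)^T R^{-1} Y^b ]^{-1},
\bar{w}^a = \tilde{P}^a (Y^b)^T R^{-1} ( y^o - \bar{y}^b ),
\bar{x}^a = \bar{x}^b + X^b \bar{w}^a,
X^a = X^b [ (m-1) \tilde{P}^a ]^{1/2},

where the flow-dependent background error covariance enters implicitly as P^b = X^b (X^b)^T / (m-1), estimated from the 30-second forecast ensemble, and the computation is repeated independently for every local patch of grid points.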
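
To make the single-precision point concrete, here is a minimal, hypothetical Python/NumPy sketch (not the authors' code; array shapes are illustrative only) showing that casting model state arrays to single precision halves the memory footprint and I/O volume while introducing only about 1e-7 relative rounding error, far smaller than typical radar observation errors.

import numpy as np

# Hypothetical ensemble state with axes (member, z, y, x); sizes are toy values.
state_f64 = np.random.default_rng(0).standard_normal((4, 30, 64, 64))
state_f32 = state_f64.astype(np.float32)  # single precision: half the bytes

print("double precision:", state_f64.nbytes / 2**20, "MiB")
print("single precision:", state_f32.nbytes / 2**20, "MiB")
print("max relative rounding error:",
      np.max(np.abs(state_f32.astype(np.float64) - state_f64) / np.abs(state_f64)))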