J65.4 Distributed Workflow for WRF Processes and Visualization using WRF-Python and Dask

Thursday, 16 January 2020: 11:15 AM
258C (Boston Convention and Exhibition Center)
Robert C Fritzen, Northern Illinois University, DeKalb, IL; and V. A. Gensini, S. Collis, and R. Jackson

The Weather Research and Forecasting (WRF) model is a complex application that can ingest a variety of input data sources and exposes numerous parameterization options for the physical properties of the atmosphere. Running the WRF model involves collecting input data, setting up parameterization and control files, and sequentially running a series of applications to generate NetCDF output files. These model tasks have numerous controls and can vary across machines and data sources based on several factors. A Python package was developed to automate WRF end-to-end (i.e., from data collection and pre-processing through post-processing of model output). This package is highly flexible and offers simple text-based control files for WRF parameters and post-processing visualization. For most cases, the package performs data collection, submits jobs to high-performance computing (HPC) clusters, monitors the progress of those jobs so that each subsequent job is submitted automatically, and post-processes the model output files either with Python (distributed processing using Dask and wrf-python) or with the Unified Post Processor (UPP) application. Two end-to-end use cases on Argonne National Laboratory's Theta Cray XC-40 cluster are presented to demonstrate different data sources: first, the 26 May 2019 convective event in Chicago, IL, using the North American Regional Reanalysis (NARR) dataset as input for WRF; second, the 27 April 2011 tornado outbreak in the southeastern United States, using the Climate Forecast System version 2 (CFSv2) dataset as input for WRF.
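The distributed post-processing pattern described above can be sketched with Dask's delayed interface. This is a minimal illustration, not the package's actual API: `extract_max_field` is a hypothetical stand-in for a per-file wrf-python extraction, and the `wrfout` filenames are illustrative. The docstring shows (as an assumption) how a real task might call `wrf.getvar`; the placeholder body lets the scheduling pattern run without model output on disk.

```python
# Sketch: fan per-file WRF post-processing out across Dask workers.
from dask import delayed, compute

def extract_max_field(path):
    """Hypothetical stand-in for a wrf-python extraction, e.g.:

        from netCDF4 import Dataset
        from wrf import getvar, to_np
        return float(to_np(getvar(Dataset(path), "slp")).max())

    Here we return a placeholder value so the sketch runs standalone.
    """
    return len(path)  # placeholder; a real task would read the NetCDF file

# One wrfout file per model output time (illustrative names).
wrfout_files = [f"wrfout_d01_2019-05-26_{h:02d}:00:00" for h in range(3)]

# Build one lazy task per file, then execute them in parallel.
tasks = [delayed(extract_max_field)(p) for p in wrfout_files]
results = compute(*tasks, scheduler="threads")
print(results)
```

On a cluster, the same task graph could instead be sent to a `dask.distributed.Client`, which is how distributed post-processing of many wrfout files typically scales out.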