Monday, 4 June 2018: 11:00 AM
Colorado B (Grand Hyatt Denver)
Downscaling simulations from climate model output and reforecasts from reanalysis datasets are common methods for understanding weather and climate impacts at local scales. The Weather Research and Forecasting (WRF) model is a community-standard tool for producing these products, and fully sampling these datasets typically requires a large volume of WRF simulations. The NSF Big Weather Web project recently demonstrated how containerization of the WRF model codes allows WRF to be run at scale in distributed cloud-based computing environments. However, for these specific downscaling and reforecasting tasks, preparation of input data for WRF remains a challenge, as WRF has typically accepted only GRIB-formatted datasets of specific structures. We discuss the development of a containerized module that uses open-source Python libraries to process common climate and reanalysis datasets from the widely used netCDF format directly into WRF-readable files. We further extend this capability to interface with Data Access Protocol (DAP) services that increasingly serve climate and reanalysis datasets in an on-demand manner. We show how using DAP services together with this containerized processing chain allows the rapid production of WRF simulations, quickly generating large datasets of downscaled weather without the need to download and reformat multiple complete datasets. We demonstrate this workflow with a precipitation downscaling experiment over the southeastern United States, which provides input to hydrologic models to evaluate changes in flooding potential under climate change.
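The on-demand DAP access step described above can be illustrated with a minimal Python sketch using xarray, one of the open-source libraries commonly used for this purpose. The dataset URL, variable names, and domain bounds below are illustrative assumptions rather than the authors' actual configuration, and the final conversion to WRF-readable files is only indicated in a comment.

```python
import xarray as xr

# Hypothetical OPeNDAP (DAP) endpoint serving a reanalysis dataset as netCDF
DAP_URL = "https://example.org/thredds/dodsC/reanalysis/pressure_levels.nc"

def fetch_wrf_inputs(url=DAP_URL):
    """Open a remote dataset lazily over DAP and pull only the subdomain
    and time window needed to initialize a WRF downscaling run."""
    ds = xr.open_dataset(url)  # lazy open: no data transferred yet

    # Subset to the southeastern United States and a single analysis time;
    # only these slices are actually requested from the DAP server.
    subset = ds.sel(
        lat=slice(24.0, 38.0),
        lon=slice(-92.0, -75.0),
        time="2017-09-10T00:00:00",
    )

    # Fields a WRF preprocessing step would typically require
    # (names are placeholders and vary by dataset).
    fields = subset[["air_temperature", "geopotential_height",
                     "specific_humidity", "u_wind", "v_wind"]]
    return fields.load()  # trigger the remote read for the subset only

if __name__ == "__main__":
    fields = fetch_wrf_inputs()
    # A subsequent step would write these fields into the intermediate
    # format WRF preprocessing expects; that conversion is omitted here.
    print(fields)
```

Because the dataset is opened lazily, only the selected spatial and temporal subset crosses the network, which is the key advantage of driving WRF input preparation from DAP services rather than from complete downloaded archives.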