Tuesday, 8 January 2019: 11:30 AM
North 131C (Phoenix Convention Center - West and North Buildings)
Terabytes of weather data are generated every day by gridded model simulations and by in-situ and remotely-sensed observations. With the accelerating accumulation of weather data, efficient computational solutions are needed to process, archive, and analyze these massive data sets. This work demonstrates how we use object-based storage technology to archive multiple years of the High-Resolution Rapid Refresh (HRRR) model run operationally by the Environmental Modeling Center of the National Centers for Environmental Prediction, and how we use the Open Science Grid (OSG) to compute large sets of empirical cumulative distributions from the archived hourly gridded analyses. The OSG is a consortium of computing resources around the United States that makes idle computer resources available to researchers in diverse scientific disciplines. The OSG is designed for high-throughput computing, i.e., many parallel, independent computational tasks. These preliminary cumulative distributions derived from a three-year HRRR archive are computed for seven variables, over 1.9 million grid points, and each hour of the calendar year. The cumulative distributions are used to evaluate techniques that may be appropriate to discriminate between typical and atypical atmospheric conditions in a historical context for situational awareness of hazardous weather conditions. With case studies of recent wildland fires, we illustrate how the techniques developed could be applied to assess future fire weather conditions.
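The abstract describes ranking current conditions against per-grid-point, per-hour empirical cumulative distributions built from the multi-year archive. A minimal sketch of that idea is below; the gamma-distributed sample standing in for three years of hourly HRRR values at one grid point is purely hypothetical, as are the function and variable names.

```python
import numpy as np

def empirical_cdf_percentile(samples, value):
    """Return the empirical-CDF percentile of `value` within `samples`.

    A result near 1.0 means the value sits at the upper extreme of the
    historical record for this grid point and hour, flagging atypical
    conditions for situational awareness.
    """
    samples = np.sort(np.asarray(samples, dtype=float))
    # Fraction of archived values less than or equal to the new value.
    return np.searchsorted(samples, value, side="right") / samples.size

# Hypothetical stand-in: three years of hourly 10-m wind speeds (m/s)
# at a single grid point for one fixed hour of the calendar year.
rng = np.random.default_rng(0)
archive = rng.gamma(shape=2.0, scale=3.0, size=3 * 365)

today = 25.0  # an unusually strong wind value for this climatology
pct = empirical_cdf_percentile(archive, today)
```

In practice this ranking would run independently for each of the seven variables and 1.9 million grid points, which is exactly the many-small-independent-tasks pattern the OSG's high-throughput model is suited to.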