Wednesday, 6 June 2018: 5:00 PM
Colorado B (Grand Hyatt Denver)
The random forest (RF) algorithm is implemented to develop skillful, calibrated contiguous United States (CONUS)-wide probabilistic forecasts of severe weather at Days 1-5. Models are trained with predictands highly analogous to those used in the generation of Storm Prediction Center (SPC) convective outlooks (COs): 40 km neighborhood probabilities of tornado, severe hail, and severe convective wind are trained in separate models for the current day, while all severe weather is treated collectively in a single model for Day 2 and beyond. CONUS is partitioned into three regions—the West, Central, and East—and separate models are developed for each combination of region and forecast lead time. Forecasts are produced for each forecast point on a coarse (~0.5°) grid, each day in an 11-year historical period of record spanning 2003-2013. Predictor data used to generate forecast probabilities come from an assortment of simulated atmospheric fields taken from a record of NOAA's Second Generation Global Ensemble Forecast System Reforecast (GEFS/R) 11-member ensemble system, including instability measures, wind profiles and shear, and moisture metrics. For each field used, model forecast data are taken relative to each forecast point in space, and at each output time step over the given 24-hour forecast interval. Trained models are evaluated to investigate what the RFs reveal about effectively using a global ensemble to predict severe weather in the medium range. Forecasts are generated from the RFs in a quasi-real-time setting over an extended period and evaluated alongside operational SPC COs. Overall, the methodology exhibits considerable capability to predict severe weather given the relatively long lead times involved, and at longer lead times the forecasts are competitive with SPC COs.
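The training setup described above—one RF per region and lead time, with neighborhood severe-weather occurrence as a binary predictand and ensemble-derived fields as predictors—could be sketched as follows. This is a minimal illustration with synthetic data; the variable names and hyperparameters are assumptions, not the authors' actual configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for GEFS/R-derived predictors at one forecast point:
# e.g., instability, shear, and moisture fields sampled at each output time
# step of the 24-hour forecast interval. (Hypothetical feature count.)
n_days, n_features = 500, 12
X = rng.normal(size=(n_days, n_features))

# Synthetic binary predictand: severe weather observed within the 40 km
# neighborhood of the forecast point on that day.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_days)) > 0.5

# In the abstract's design, a separate model like this would be fit for
# each (region, lead time) combination.
rf = RandomForestClassifier(n_estimators=200, min_samples_leaf=5,
                            random_state=0)
rf.fit(X, y)

# Probabilistic forecast: the fraction of trees voting for the severe class.
probs = rf.predict_proba(X)[:, 1]
```

The `predict_proba` output is what would then be evaluated for calibration and skill alongside the operational SPC outlook probabilities.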