The DTC objective evaluation during the HWT 2010 Spring Experiment (SE2010) complemented the subjective evaluation that has traditionally taken place. With the addition of probabilistic verification capabilities to the DTC's Model Evaluation Tools (MET), both probabilistic products and deterministic forecasts were evaluated this year. The DTC evaluated output from the CAPS Storm Scale Ensemble Forecast (SSEF), the NOAA/ESRL/GSD High Resolution Rapid Refresh (HRRR), and the North American Mesoscale (NAM) model and Short Range Ensemble Forecast (SREF) system, the latter two produced by NOAA/NCEP/EMC. The evaluation focused on products derived from the simulated reflectivity and quantitative precipitation forecast (QPF) fields.
It is anticipated that both the subjective and objective evaluations performed in near-real time during SE2010 will eventually lead to greater use of the latest convection-allowing model forecasts by the NOAA/NWS Storm Prediction Center, the NOAA/NWS Hydrometeorological Prediction Center (HPC), and the NOAA/NWS Aviation Weather Center (AWC). This talk will describe the DTC objective evaluation performed during the 2010 Spring Experiment, highlight key results, and outline anticipated future work. A companion paper (Harrold et al. 2010) will explore the evaluation of the simulated reflectivity field using the MET spatial verification tool, the Method for Object-Based Diagnostic Evaluation (MODE).