The spatial and temporal consistency of long-term data sets is particularly important for modeling and is challenging to achieve for many reasons. We have gage-only mean areal precipitation (MAP) estimates back to the 1950s, but our gridded, radar-based MAPX estimates did not begin until the late 1990s. While radar-based, gridded estimates provide greater spatial and temporal coverage by filling the gaps between precipitation gages, other errors in the gridded fields can cause consistency problems. For example, analysis at MARFC shows that cumulative MAPX estimates are often lower than cumulative MAP estimates at locations farther from the radar, particularly during the cold season, primarily because the radar beam overshoots shallow precipitation at longer ranges. Enhancements to gridded precipitation software over time also introduce temporal inconsistencies in our data records.
The Analysis of Record for Calibration (AORC), recently made available to River Forecast Centers (RFCs) by the NWS Office of Water Prediction, was developed to meet the long-term consistency requirements of hydrologic model calibration. This high-resolution dataset spans 1979 to near-present and integrates radar, gage, and satellite data sources at varying spatial and temporal resolutions. Here we use double-mass analysis to examine the spatial and temporal consistency of AORC, MAPX, and MAP precipitation estimates, as well as AORC and mean areal temperature (MAT) estimates from multiple sources. We then selectively combine the products into a merged ‘best’ historical time series set to force our hydrologic models. The merged time series provide MAPs and MATs with improved temporal consistency at forecast points throughout the Mid-Atlantic hydrologic service area. Using the merged series, we can generate improved inputs for our ensemble forecasts and improved parameters when we recalibrate our hydrologic models. Our presentation will include examples of these improvements.
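As an illustrative sketch only (not the operational MARFC procedure), the Python snippet below shows how a double-mass comparison and a period-based splice of candidate products might be computed for one basin; the series names, helper functions, and date ranges in the usage comments are hypothetical.

```python
import pandas as pd

def double_mass(series_a: pd.Series, series_b: pd.Series) -> pd.DataFrame:
    """Cumulative sums of two co-located precipitation series on a common index.

    A break in slope when cum_a is plotted against cum_b suggests a temporal
    inconsistency between the two records (e.g., a change in the radar-based
    estimates relative to the gage-only estimates).
    """
    common = pd.concat([series_a, series_b], axis=1, join="inner").dropna()
    common.columns = ["a", "b"]
    return common.cumsum().rename(columns={"a": "cum_a", "b": "cum_b"})

def merge_by_period(sources: dict[str, pd.Series],
                    periods: list[tuple[str, str, str]]) -> pd.Series:
    """Splice a 'best' series from several sources, each used over a chosen era.

    `periods` is a list of (source_name, start, end) tuples naming the product
    judged most consistent for each portion of the record.
    """
    pieces = [sources[name].loc[start:end] for name, start, end in periods]
    return pd.concat(pieces).sort_index()

# Hypothetical usage with basin-average 6-hourly precipitation series:
# dm = double_mass(mapx_basin, map_basin)
# dm.plot(x="cum_b", y="cum_a")            # inspect for slope breaks
# best = merge_by_period(
#     {"MAP": map_basin, "MAPX": mapx_basin, "AORC": aorc_basin},
#     [("MAP",  "1950-01-01", "1978-12-31"),   # example split dates only
#      ("AORC", "1979-01-01", "1996-12-31"),
#      ("MAPX", "1997-01-01", "2020-12-31")],
# )
```

In practice the split dates and the product chosen for each era would follow from the double-mass results at each forecast point rather than being fixed in advance.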