Daily precipitation reports from two station networks over Korea and China are used to examine the sensitivity of long-term trend detection accuracy to changes in the station network. First, the accuracy of gauge-based analyses in quantifying the temporal variations and frequency of heavy precipitation is examined using a three-year (2005-2007) data set of daily precipitation provided by the Korea Meteorological Administration (KMA). Analyzed fields of daily precipitation are computed by interpolating gauge reports from 1,000 combinations of stations sub-sampled from all available stations, mimicking the network density over various global land areas and over different historical periods. The resulting analyses over a 0.25° lat/lon grid box around Seoul are compared against the 'ground truth', defined as the mean of station reports from all 28 gauges inside the box. The results confirm the assumption of earlier work that gauge analysis error variance is proportional to precipitation intensity and inversely proportional to the local station network density. In particular, uncertainty increases greatly when estimating both the overall quantity and the frequency of heavy rainfall from gauge-based analyses over 0.25° lat/lon grid boxes with no reporting stations inside.
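The sub-sampling experiment can be sketched as follows. This is a minimal illustration only: the synthetic rainfall, the number of days, and the use of a plain grid-box average as the "analysis" are assumptions for demonstration, not the study's actual KMA data or interpolation scheme; only the 28-gauge grid box is taken from the abstract.

```python
import random
import statistics

random.seed(0)

N_GAUGES = 28          # gauges inside the 0.25° grid box (from the abstract)
N_DAYS = 365           # one year of synthetic daily reports (illustrative)

# Synthetic daily precipitation: mostly dry days, occasional heavy events.
daily_reports = [
    [max(0.0, random.expovariate(0.2) - 4.0) for _ in range(N_GAUGES)]
    for _ in range(N_DAYS)
]

def analysis_error(n_stations, n_combos=200):
    """RMS error of a grid-box analysis built from random sub-networks of
    n_stations gauges, relative to the 'ground truth' all-gauge mean."""
    sq_errs = []
    for _ in range(n_combos):
        subset = random.sample(range(N_GAUGES), n_stations)
        for day in daily_reports:
            truth = statistics.fmean(day)                      # all 28 gauges
            est = statistics.fmean(day[i] for i in subset)     # sub-network
            sq_errs.append((est - truth) ** 2)
    return statistics.fmean(sq_errs) ** 0.5

# Error shrinks as the sub-network densifies toward the full 28 gauges.
for k in (1, 4, 14):
    print(f"{k:2d} stations: RMSE = {analysis_error(k):.2f} mm/day")
```

Even this toy version reproduces the qualitative result: the analysis error grows sharply as the number of reporting gauges in the box drops toward zero.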
We then extend the sensitivity tests using daily precipitation reports from a relatively dense network over China for the 60-year period from 1951 to 2010. Gauge-based analyses over China are constructed using reports from all 2,500 stations and from 100 combinations of sub-sampled stations to simulate the impact of station network changes on the detection of long-term trends in the resulting analyses. Inter-comparisons are performed among the analyses built from all available gauges, from fixed networks, and from changing networks. Uncertainties in the gauge-based analyses are quantified in association with the gauge network density and its changes over the data period. Details will be reported at the conference.
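The fixed-versus-changing network comparison can be sketched in the same spirit. Everything here is a hypothetical stand-in (station count, imposed trend, bias and noise magnitudes, the mid-record network switch); the point is only to show the mechanism by which a network change can alias station biases into an apparent trend.

```python
import random
import statistics

random.seed(1)

N_STATIONS, N_YEARS = 50, 60     # stand-ins for the 2,500-station, 1951-2010 record
TRUE_TREND = 0.5                 # mm/yr, imposed regional trend (assumption)

# Each station sees the regional signal plus its own fixed bias and noise.
bias = [random.gauss(0.0, 20.0) for _ in range(N_STATIONS)]
obs = [[TRUE_TREND * yr + bias[s] + random.gauss(0.0, 5.0)
        for s in range(N_STATIONS)] for yr in range(N_YEARS)]

def trend(series):
    """Ordinary least-squares slope of an annual series."""
    years = range(len(series))
    ybar, sbar = statistics.fmean(years), statistics.fmean(series)
    num = sum((y - ybar) * (v - sbar) for y, v in zip(years, series))
    den = sum((y - ybar) ** 2 for y in years)
    return num / den

# Fixed network: the same 10 stations for all 60 years.
fixed = random.sample(range(N_STATIONS), 10)
fixed_series = [statistics.fmean(obs[yr][s] for s in fixed)
                for yr in range(N_YEARS)]

# Changing network: a different 10 stations after year 30, so the
# difference in mean station bias appears as a step in the series.
early = random.sample(range(N_STATIONS), 10)
late = random.sample(range(N_STATIONS), 10)
changing_series = [statistics.fmean(obs[yr][s] for s in (early if yr < 30 else late))
                   for yr in range(N_YEARS)]

print(f"true trend      : {TRUE_TREND:.2f} mm/yr")
print(f"fixed network   : {trend(fixed_series):.2f} mm/yr")
print(f"changing network: {trend(changing_series):.2f} mm/yr")
```

A fixed network carries a constant bias that cancels out of the slope, while a network change can inject a spurious step and hence a spurious trend component, which is the effect the 100 sub-sampled network combinations are designed to quantify.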