Thursday, 23 June 2005: 10:30 AM
South Ballroom (Hilton DeSoto)
The ways in which climate data are being collected and used are changing rapidly. Data are increasingly collected by automated electronic systems, on a wider variety of platforms, at higher temporal resolution, at more locations, and in more difficult and remote environments. Climate observations are increasingly disseminated over near-real-time communication networks, for more applications, to fill a wider range of needs, in an increasingly automated and numerical world. The development of the ASOS (Automated Surface Observing System), SNOTEL (Snowpack Telemetry), RAWS (Remote Automated Weather Station), Agrimet, innumerable mesonets, and the prospect of COOP modernization all reflect the growing importance of electronic sensors, remote environments, and automated, real-time data delivery systems now and in the future. These shifts in the climate observation landscape present a number of fundamental challenges for traditional QC systems, which were designed to work with data collected in populated areas by human observers with non-electronic devices and disseminated over a period of months. This paper will demonstrate and explore some of these challenges, and attempt to form a new paradigm for data QC systems that meets today's needs.
Specific issues that will be discussed include the following: (1) errors from electronic measurement systems are more often manifested as continuous drift rather than categorical mistakes, requiring that measures of data validity be numerically continuous rather than purely deterministic (i.e., flagging); (2) observations from remote platforms in mountainous environments, such as SNOTEL, require a QC system that recognizes and accounts for high spatial complexity in the environment and uses all available data sources in its assessment; (3) computer models that use climate observations as input require quantitative estimates of observational uncertainty that are now largely unavailable; (4) the range of applications for climate data, and hence tolerance for outliers, is increasing rapidly, requiring a transparent QC process with probabilistic information from which a decision of validity can be made by the user; (5) data generated by automated electronic systems are often more voluminous (e.g., shorter time step) and disseminated in a more timely manner than those from manual systems, favoring automated QC methods over those involving manual inspection. The first generation of a spatial QC system recently developed for USDA-NRCS SNOTEL temperature data will be presented as an example of a system that addresses some of the issues described above. Problems unique to the use of spatial methods in data QC will be discussed, such as the problem of assessing the quality of an observation by using observations of unknown quality at nearby stations.
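The spatial QC approach described above can be illustrated with a minimal sketch. The code below is not the authors' SNOTEL system; it is a hypothetical example, assuming a simple inverse-distance-weighted estimate from neighboring stations and a standardized residual as the continuous validity score (issue 1), which the user thresholds according to their own outlier tolerance (issue 4) rather than receiving a binary flag. The station values, distances, and the uncertainty `sigma` are invented for illustration.

```python
import math

def neighbor_estimate(neighbors, power=2.0):
    """Inverse-distance-weighted estimate of a station's value from
    surrounding stations. `neighbors` is a list of (distance_km, value)
    pairs; nearer stations receive more weight."""
    num = den = 0.0
    for dist, value in neighbors:
        w = 1.0 / (dist ** power)
        num += w * value
        den += w
    return num / den

def validity_score(observed, estimate, sigma):
    """Continuous measure of (in)validity: the standardized residual
    between the observation and its spatial estimate. `sigma` stands in
    for an assumed local uncertainty; a real system would estimate it
    from the data. Note the caveat raised in the abstract: the neighbor
    observations used in the estimate are themselves of unknown quality."""
    return abs(observed - estimate) / sigma

# Hypothetical temperature check at a SNOTEL-like site (degrees C)
obs = -3.0
est = neighbor_estimate([(5.0, -2.5), (12.0, -1.8), (20.0, -2.0)])
score = validity_score(obs, est, sigma=1.5)  # small score -> plausible
```

A downstream user can then apply whatever cutoff suits their application (e.g., accept if `score < 2`), keeping the QC decision transparent and probabilistic rather than hard-coded into the archive.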
Supplementary URL: http://www.ocs.oregonstate.edu/prism/