Radiosondes collect high-resolution measurements of the atmosphere from the ground to approximately 30 kilometers. Carried aloft by a large helium-filled balloon, they measure pressure, temperature, relative humidity, and GPS-derived winds at one-second intervals during ascent. Like dropsondes, radiosondes are subject to a unique set of problems that can affect data quality, including: sensor-arm heating at the surface prior to launch; artificial dry spikes caused by slow ascent and inadequate ventilation of the sensors; balloon descent caused by icing or severe downdrafts; system malfunction caused by a weakening radiosonde signal; and offsets in one of the two hygrometer measurements.
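Two of the problems above lend themselves to simple automated screening. The sketch below, in Python, flags balloon descent (pressure rising between consecutive one-second samples) and spike-shaped relative-humidity drops; the thresholds and the spike heuristic are illustrative assumptions, not the procedures actually used in the QC processing described here.

```python
# Minimal sketch of two radiosonde QC checks:
# (a) balloon descent: pressure increasing with time (e.g. from icing),
# (b) artificial dry spikes: an abrupt RH drop that recovers on the
#     next sample, suggesting poor sensor ventilation rather than a
#     real dry layer.
# Thresholds and field layout are illustrative assumptions only.

def flag_descent(pressures_hpa):
    """Return indices where pressure rises between consecutive
    one-second samples, indicating the balloon is descending."""
    return [i for i in range(1, len(pressures_hpa))
            if pressures_hpa[i] > pressures_hpa[i - 1]]

def flag_dry_spikes(rh_percent, drop_threshold=30.0):
    """Return indices where RH falls by more than drop_threshold
    points for a single sample and recovers on the next one."""
    spikes = []
    for i in range(1, len(rh_percent) - 1):
        if (rh_percent[i - 1] - rh_percent[i] > drop_threshold and
                rh_percent[i + 1] - rh_percent[i] > drop_threshold):
            spikes.append(i)
    return spikes

profile_p = [1000.0, 995.0, 991.0, 993.0, 990.0]   # hPa
profile_rh = [80.0, 78.0, 30.0, 77.0, 75.0]        # percent
print(flag_descent(profile_p))      # [3]
print(flag_dry_spikes(profile_rh))  # [2]
```

In practice such flags would mark samples for human review rather than delete them outright, since a real downdraft or dry layer can mimic either signature.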
In recent years, advances in atmospheric research, technology, and data assimilation techniques have driven the need for high-quality, high-resolution radiosonde and dropsonde data. These data are a valuable resource for initializing numerical prediction models, for calibrating and validating satellite retrievals of atmospheric profiles, and for climatological research. Each year the Earth Observing Laboratory (EOL) at the National Center for Atmospheric Research (NCAR) deploys its radiosonde and dropsonde sounding systems in numerous scientific field campaigns. Over the last twenty-one years, EOL has collected 17,426 radiosonde profiles and 8,077 dropsonde profiles during 104 field campaigns. All of these soundings have undergone internal quality control (QC) processing at NCAR, and over the years we have developed a unique set of QC procedures that ensure research-quality data. The QC scheme is an extensive, multi-step process that includes, but is not limited to: (1) individual examination and correction of raw data profiles; (2) processing of the data through the Atmospheric Sounding Processing Environment (ASPEN), a QC software package developed at NCAR; (3) evaluation of the data products using skew-T diagrams, histograms, time series plots, and other visualization tools and statistical methods; and (4) application of further corrections where necessary. These measures enable us to identify, characterize, and in many cases correct significant errors that could otherwise affect research and analyses performed with these data.
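The statistical screening in step (3) can be illustrated with a small sketch: flag samples whose temperature deviates strongly from their neighbors using a robust z-score built from the median and median absolute deviation over a sliding window. The window size and threshold are assumptions for illustration; this is not the ASPEN algorithm itself.

```python
# Illustrative outlier screen for a temperature profile, in the spirit
# of the statistical checks in step (3). Window size and z-score
# threshold are assumed values, not those used operationally.
import statistics

def flag_outliers(temps_c, window=5, z_max=5.0):
    """Return indices whose temperature is far from the local median,
    measured in robust (MAD-based) standard deviations."""
    flagged = []
    half = window // 2
    for i, t in enumerate(temps_c):
        lo, hi = max(0, i - half), min(len(temps_c), i + half + 1)
        neighbors = temps_c[lo:i] + temps_c[i + 1:hi]
        med = statistics.median(neighbors)
        mad = statistics.median(abs(x - med) for x in neighbors)
        # 1.4826 * MAD approximates a standard deviation for normally
        # distributed data; guard against a zero MAD.
        scale = 1.4826 * mad or 1e-6
        if abs(t - med) / scale > z_max:
            flagged.append(i)
    return flagged

temps = [15.0, 14.8, 14.6, 40.0, 14.2, 14.0, 13.8]  # deg C
print(flag_outliers(temps))  # [3]
```

A median/MAD screen is preferred over a mean/standard-deviation one here because a single bad sample would otherwise inflate the statistics it is being tested against.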