11th Conference on Satellite Meteorology and Oceanography

P2.32

Standard Errors of the Estimated Trend in Channel 2 of the Microwave Sounding Unit

David S. Crosby, NOAA/NESDIS/ORA, Camp Springs, MD; and M. D. Goldberg, T. Mo, and Z. Cheng

Data from the Microwave Sounding Unit (MSU) instruments on the NOAA polar-orbiting environmental satellites have been used to estimate trends in global temperatures. A new, improved calibration for this instrument has been developed and tested, and a data set of MSU observations based on this calibration is now available for study. A subject of primary interest is the standard error of the estimated slope of a temperature trend line. To obtain an estimate of the standard error of the estimated trend and to simplify the analysis, a subset of the data has been studied. This set consists of monthly global averages of the MSU channel 2 temperatures from the NOAA-10 and NOAA-12 satellites over a 12-year period covering November 1986 to October 1998. These satellites are both morning satellites and have little drift in their equator crossing time. We use only nadir and near-nadir measurements, and we use a three-month overlap period between the two satellites, together with a stable five-year period of overlap of NOAA-10 and NOAA-12 with NOAA-11, for making satellite-to-satellite offset adjustments. These features eliminate many of the difficulties that have complicated the analysis of the errors in the estimated trends.
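The overlap-based offset adjustment can be illustrated with a minimal sketch: the mean inter-satellite difference over the common months is removed from one series before the records are merged. The series names, overlap length, and values below are hypothetical, not the actual MSU data:

```python
import numpy as np

# Hypothetical monthly brightness-temperature anomalies (K) for two
# satellites whose records share a 3-month overlap (values illustrative).
sat_a = np.array([0.10, 0.12, 0.08, 0.11, 0.09])        # months 1-5
sat_b = np.array([0.31, 0.29, 0.30, 0.27, 0.33, 0.28])  # months 3-8

overlap = 3  # number of common months

# Offset = mean inter-satellite difference over the overlap period.
offset = np.mean(sat_b[:overlap] - sat_a[-overlap:])
sat_b_adjusted = sat_b - offset

# Merge: keep sat_a, then append the adjusted non-overlapping months.
merged = np.concatenate([sat_a, sat_b_adjusted[overlap:]])
```

After adjustment, the two satellites agree in the mean over the overlap months, so the merged series carries no artificial step at the transition.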

The natural signals in the deseasonalized time series from volcanic, ENSO, and solar forcings are partially removed by a multiple regression procedure. Analysis of the residuals from the multiple regression shows that the errors are not independent, which implies that the usual statistical tests and estimated standard errors are incorrect. The noise in the time series is modeled as a first-order autoregressive process, and the parameters are estimated with the Hildreth-Lu procedure. The trend for this 12-year period is found to be 0.067 °C per decade with a standard error of 0.062 °C per decade. This gives an approximate 95 percent confidence interval for the trend of (−0.058, +0.192) °C per decade. The trend is not statistically significantly different from 0.0 at any reasonable level of significance. However, a simple analysis of the errors shows that if a 20-year time series of this quality were available, a linear trend of the order of 0.1 °C per decade should be detectable.
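The Hildreth-Lu fit described above can be sketched as a grid search over the AR(1) parameter: for each candidate ρ, both series are quasi-differenced, OLS is run on the transformed data, and the ρ minimizing the residual sum of squares is kept, with the slope's standard error taken from that transformed regression. The synthetic series below is a stand-in for the real deseasonalized MSU anomalies, not the paper's data:

```python
import numpy as np

def hildreth_lu_trend(y, x):
    """Fit y = a + b*x assuming AR(1) errors, via a Hildreth-Lu grid search.

    For each candidate rho, quasi-difference both series
    (z_t -> z_t - rho*z_{t-1}), run OLS on the transformed data, and keep
    the rho that minimizes the residual sum of squares.  The slope's
    standard error comes from the transformed regression, whose errors
    are approximately independent.
    """
    best = None
    for rho in np.linspace(-0.95, 0.95, 191):
        ys = y[1:] - rho * y[:-1]
        X = np.column_stack([(1.0 - rho) * np.ones(len(ys)),  # intercept
                             x[1:] - rho * x[:-1]])           # trend term
        beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
        resid = ys - X @ beta
        sse = resid @ resid
        if best is None or sse < best["sse"]:
            cov = sse / (len(ys) - 2) * np.linalg.inv(X.T @ X)
            best = {"sse": sse, "rho": rho,
                    "slope": beta[1], "slope_se": np.sqrt(cov[1, 1])}
    return best

# Synthetic 12 years of monthly anomalies: 0.1 C/decade trend + AR(1) noise.
rng = np.random.default_rng(42)
t = np.arange(144) / 120.0                       # time in decades
e = np.zeros(144)
for i in range(1, 144):
    e[i] = 0.6 * e[i - 1] + rng.normal(0.0, 0.05)
fit = hildreth_lu_trend(0.1 * t + e, t)

# Approximate 95% interval: slope +/- 2 * slope_se, as in the abstract.
ci = (fit["slope"] - 2 * fit["slope_se"], fit["slope"] + 2 * fit["slope_se"])
```

With a series this short and noise this red, the interval is wide, which mirrors the abstract's point that a 12-year record cannot distinguish a trend of this size from zero.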

Poster Session 2, Climatology and Long-term Satellite Studies
Monday, 15 October 2001, 2:15 PM-4:00 PM
