Wednesday, 14 January 2004: 8:45 AM
On the relationship between tropical mean radiation and SST
Room 608
This study compares the observed decadal Earth Radiation Budget Satellite (ERBS) nonscanner radiation anomaly with theoretical estimates of the radiation changes expected from sea surface temperature (SST) anomalies under blackbody emission, radiative-convective models, and the Iris hypothesis.
The regression slope of ERBS LW with SST is about 4.6 W/m^2/K, similar to the theoretically predicted LW increase due to blackbody emission (~4 W/m^2/K) and much larger than the predictions from 1D radiative-convective models with fixed relative humidity (~2.3 W/m^2/K). When the decadal LW anomaly is removed, the regression slope drops to ~2.1 W/m^2/K, indicating that the tropical climate system may be close to radiative-convective equilibrium on seasonal to interannual time scales.
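For context, the quoted blackbody sensitivity is consistent with differentiating the Stefan-Boltzmann law at an effective tropical emission temperature near 260 K (equivalent to a tropical-mean outgoing LW flux of roughly 255 W/m^2; these are representative values assumed here for illustration, not stated in the abstract, and the emission temperature is assumed to vary one-for-one with SST):

dF/dT = 4 sigma T^3 ~ 4 x (5.67x10^-8 W/m^2/K^4) x (260 K)^3 ~ 4.0 W/m^2/K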
The ERBS decadal LW anomaly (3.05 W/m^2) is much larger than the calculated change in blackbody emission (~0.58 W/m^2) resulting from the small decadal variation of tropical mean SST (0.144 K). The observed LW changes are also larger than those predicted by the radiative-convective equilibrium models (~0.33 W/m^2). Most of the observed LW anomaly, however, is balanced by shortwave (SW) radiation (2.4 W/m^2), bringing the net outgoing radiation anomaly (0.65 W/m^2) closer to the theoretical values. On the decadal time scale, the ERBS measurements differ significantly from the tropical mean radiative flux anomalies predicted by the Iris hypothesis and do not support the strong negative feedback of the Iris effect.
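The quoted theoretical anomalies follow from a simple linear scaling of the sensitivities above by the decadal SST change (a linearization assumed here, not an independent calculation):

Blackbody emission: ~4 W/m^2/K x 0.144 K ~ 0.58 W/m^2
Fixed-relative-humidity radiative-convective: ~2.3 W/m^2/K x 0.144 K ~ 0.33 W/m^2

both several times smaller than the observed 3.05 W/m^2 LW anomaly.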