The 14th Conference on Hydrology

4B.6
ASSIMILATION OF REMOTELY-SENSED SOIL MOISTURE ESTIMATES IN A DISTRIBUTED SURFACE FLUX-HYDROLOGY MODEL USING A KALMAN FILTER

William L. Crosson, NASA/USRA/GHCC, Huntsville, AL; and C. A. Laymon, R. Inguva, and M. P. Schamschula

One of the key state variables in land surface-atmosphere interactions is soil moisture, which affects surface energy fluxes, runoff and the radiation balance. Soil moisture modeling is firmly grounded in theory but relies on parameters that cannot be adequately measured at the necessarily fine model scales. Hence, model soil moisture estimates are imperfect and often drift away from reality over the course of a simulation. Because of its spatial and temporal coverage, remote sensing holds great promise for soil moisture estimation. Some success has been attained over the past quarter century in estimating soil moisture using passive and active microwave sensors, but progress has been slow. One reason for this is the scale disparity among the remote sensing data resolution, the hydrologic process scale, and the model grid scale. Other factors impeding progress include vegetation cover and the limited penetration depth of microwave radiation. As a result, there is currently no comprehensive method for assimilating remote soil moisture observations within a surface hydrology model at watershed or larger scales.

This paper describes the design of a measurement-modeling system for estimating the three-dimensional soil moisture distribution that incorporates remote microwave observations, a surface hydrology model, a radiative transfer model, scaling techniques, and Kalman filtering. The method recognizes that the appropriate time scales for soil hydrologic processes are depth-dependent. Upper-layer (surface to 5-10 cm) soil moisture responds rapidly to external forcing, and the uncertainties associated with this layer are greater than for the deep soil; it is for this layer only that remote sensing is of potential benefit.
The procedure is described as follows:
* A distributed land surface flux-hydrology model is initialized with measured or climatological soil moisture and temperature profiles and driven by meteorological observations, estimating the vertical and lateral distribution of water.
* A radiative transfer model is applied on the model grid to estimate microwave brightness temperature (TB) at the frequency (typically L-band) at which remote observations are available.
* A statistical method is used to disaggregate intermittent remotely-sensed TB to the higher resolution of the model. The disaggregation scheme utilizes spatial information on near-surface soil moisture from the model and in situ moisture observations, if available. The assumption is that the model can capture the high-resolution spatial pattern of soil moisture but is subject to biases during long simulations, while microwave remote-sensing data, though coarse in spatial resolution, are useful for removing these biases. A minimal illustrative sketch of this step follows the list.
* A Kalman filter is applied, using model and observed brightness temperatures, to update model estimates of near-surface soil moisture. The Kalman filter serves to nudge the model estimates toward the remote measurements. Because of the limited emitting depth of microwave remote sensing, only the upper soil layers are adjusted directly by the Kalman update; the remainder of the profile adjusts to the updated surface value through the model's soil water dynamics. A scalar sketch of this update also follows the list.
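To make the disaggregation step concrete, the following is a minimal sketch of one plausible bias-correction approach, assuming a coarse observation footprint that covers a block of fine model pixels. The function name disaggregate_tb and the purely additive adjustment are illustrative assumptions, not the statistical scheme actually used in the study.

```python
import numpy as np

def disaggregate_tb(tb_obs_coarse, tb_model_fine):
    """Map one coarse-footprint TB observation onto the fine model pixels.

    tb_obs_coarse : float       observed brightness temperature (K) for the footprint
    tb_model_fine : np.ndarray  model-derived TB (K) at the fine pixels inside it

    The model supplies the high-resolution spatial pattern; the observation
    supplies the footprint-mean value.  The footprint-scale bias (observation
    minus model mean) is added uniformly to every fine pixel, preserving the
    model's pattern while removing its long-simulation bias.
    """
    bias = tb_obs_coarse - float(np.mean(tb_model_fine))
    return tb_model_fine + bias

# Example: a 4-pixel footprint whose model TB runs 3 K too warm on average.
tb_fine = np.array([268.0, 265.0, 271.0, 264.0])
tb_adjusted = disaggregate_tb(tb_obs_coarse=264.0, tb_model_fine=tb_fine)
# tb_adjusted -> [265., 262., 268., 261.]
```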
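The Kalman update in the final step can be illustrated with a scalar sketch under strong simplifying assumptions: the observation operator is taken as a linear soil-moisture-to-emissivity relation (a crude stand-in for the radiative transfer model), and only a single near-surface layer is updated. The coefficients E_DRY and DE_DM, the error variances, and the numbers in the example are hypothetical.

```python
# Crude forward (radiative transfer) stand-in: emissivity decreases linearly
# with near-surface volumetric soil moisture.  Coefficients are assumed.
E_DRY = 0.95    # emissivity of very dry soil (assumed)
DE_DM = -0.6    # emissivity change per unit volumetric soil moisture (assumed)

def tb_forward(sm, t_eff):
    """Brightness temperature (K) from soil moisture (m3/m3) and effective
    emitting-layer temperature (K)."""
    return (E_DRY + DE_DM * sm) * t_eff

def kalman_update(sm_prior, p_prior, tb_obs, r_obs, t_eff):
    """Scalar Kalman update of near-surface soil moisture from one TB observation.

    sm_prior : model (prior) soil moisture estimate, m3/m3
    p_prior  : prior error variance of sm_prior
    tb_obs   : (disaggregated) observed brightness temperature, K
    r_obs    : observation error variance, K^2
    t_eff    : effective emitting-layer temperature, K
    """
    h = DE_DM * t_eff                              # linearized observation operator dTB/dsm
    innovation = tb_obs - tb_forward(sm_prior, t_eff)
    gain = p_prior * h / (h * p_prior * h + r_obs)
    sm_post = sm_prior + gain * innovation         # nudge model toward the observation
    p_post = (1.0 - gain * h) * p_prior            # reduced posterior uncertainty
    return sm_post, p_post

# Example: an observed TB lower than the model-predicted TB implies a wetter
# surface than modeled, so the update raises soil moisture (~0.15 -> ~0.198).
sm_new, p_new = kalman_update(sm_prior=0.15, p_prior=0.004,
                              tb_obs=245.0, r_obs=4.0, t_eff=295.0)
```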
A description of the conceptual framework will be presented along with initial results. The results will illustrate application of the radiative transfer model and the Kalman filter to point data (modeled soil moisture and temperature, and ground-based remote sensing observations) from the Southern Great Plains '97 field experiment.
