87th AMS Annual Meeting

Thursday, 18 January 2007: 11:15 AM
Improved Land Products from Multiple Satellite Data through Data Assimilation
212B (Henry B. Gonzalez Convention Center)
Shunlin Liang, University of Maryland, College Park, MD; and H. Fang, J. Townshend, and R. E. Dickinson
Land surface models for climate studies require extensive data for initialization, assimilation, and validation, which must be generated from remote sensing. Current remote sensing land products are often discontinuous in both space and time, their uncertainties have not been well characterized, and some products that land surface models critically need are not produced routinely.

Estimating environmental variables from satellite observations is often an ill-posed inversion problem. We will present a new procedure to improve the estimation of land surface variables through data assimilation. The general idea is to develop a forward model, based largely on radiative transfer, whose parameters are adjusted to optimally reproduce the multispectral and multiangular radiances observed by multiple satellite sensors. The adjustment begins with reasonably close “first guesses” for the model parameters, taken from the current land products, and then determines statistically optimal parameter estimates by weighting the first guesses against the error increments needed to bring the model into agreement with the observations.
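The weighting of first guesses against observational increments described above is the standard variational form of optimal estimation. As a minimal sketch (not the authors' implementation), the following Python example minimizes a cost function that penalizes departure from a background state (the first guess, weighted by its error covariance B) and misfit to observations (weighted by their error covariance R). The forward model here is a hypothetical linear map standing in for a radiative transfer calculation; all states, covariances, and dimensions are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical linear forward model mapping 2 surface parameters
# (e.g., a leaf area index and an albedo scale) to 4 simulated
# radiances. A real system would use a radiative transfer code.
H = np.array([[0.8, 0.2],
              [0.5, 0.5],
              [0.3, 0.9],
              [0.6, 0.4]])

def forward(x):
    """Simulate sensor radiances from the surface state x."""
    return H @ x

x_true = np.array([3.0, 0.25])     # "true" surface state (illustrative)
x_b = np.array([2.5, 0.30])        # first guess from current land products
B = np.diag([0.5**2, 0.05**2])     # background-error covariance (assumed)
R = np.diag([0.02**2] * 4)         # observation-error covariance (assumed)

# Noisy multispectral observations consistent with R.
rng = np.random.default_rng(0)
y = forward(x_true) + rng.normal(0.0, 0.02, size=4)

def cost(x):
    """Variational cost: weighted misfit to the first guess plus
    weighted misfit to the observations."""
    dxb = x - x_b
    dy = y - forward(x)
    return dxb @ np.linalg.solve(B, dxb) + dy @ np.linalg.solve(R, dy)

# The analysis: the statistically optimal compromise between the
# first guess and the observations.
x_a = minimize(cost, x_b).x
```

Because the observation errors here are much smaller than the background errors, the analysis `x_a` is drawn strongly toward the state that reproduces the observed radiances, while the background term keeps the inversion well-posed even when the observations alone would not constrain every parameter.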

In this study, we 1) produce spatially and temporally continuous land surface parameter fields (e.g., albedo, leaf area index); 2) generate climatologies of these variables based on plant functional types, which are themselves improved in this study through evidential reasoning; 3) produce new products, such as broadband emissivity, incident solar radiation, and new narrowband albedos (e.g., UV albedo); and 4) characterize the uncertainties of these products through extensive validation against ground truth and through product intercomparisons.
