Monday, 3 August 2015
Back Bay Ballroom (Sheraton Boston)
It has been well recognized that using a Gaussian function with a synoptic-scale de-correlation length to model the background error covariance in data assimilation can adversely (or even severely) hamper the ability of the analysis to assimilate mesoscale (or small-scale) structures. As a remedy, a superposition of Gaussians has been used for operational data assimilation at NCEP at increased computational cost, but mesoscale features are still overly smoothed and inadequately resolved in the analyzed incremental fields, even in areas covered by remotely sensed high-resolution observations (such as those from operational weather radars). This raises the challenging issue of how to optimally assimilate high-resolution observations on the mesoscale and storm scale. Ideally and theoretically, if the background error covariance were exactly known and perfectly modeled in data assimilation, then all different types of observations could be optimally analyzed in a single batch at a single step. However, since the background error covariance is usually largely unknown and often crudely modeled, a multi-step approach can be more effective and efficient than the single-step approach for assimilating various types of observations (including remotely sensed high-resolution observations) into a regional or mesoscale model. In this study, such a multi-step approach is explored based on Bayesian estimation theory, first for idealized cases and then as designed for real-data applications with variational data assimilation and ensemble data assimilation.
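The contrast described above can be sketched with a one-dimensional optimal-interpolation toy problem: a single analysis using a broad Gaussian covariance smooths out the mesoscale signal, while a second step that re-analyzes the residual with a short de-correlation length recovers it. This is a minimal illustrative sketch, not the authors' implementation; the length scales, variances, and the particular two-step residual analysis are all assumptions chosen for the demonstration.

```python
import numpy as np

def gaussian_cov(x, L, sigma2=1.0):
    # Gaussian covariance model: B_ij = sigma2 * exp(-(x_i - x_j)^2 / (2 L^2)),
    # where L is the de-correlation length.
    d = x[:, None] - x[None, :]
    return sigma2 * np.exp(-0.5 * (d / L) ** 2)

def oi_analysis(xb, y, B, R):
    # Optimal interpolation with H = I (obs at every grid point):
    # x_a = x_b + B (B + R)^{-1} (y - x_b)
    K = B @ np.linalg.inv(B + R)
    return xb + K @ (y - xb)

rng = np.random.default_rng(0)
n = 200
x = np.linspace(0.0, 1.0, n)
# Truth = synoptic-scale wave + mesoscale wave (wavelength 0.05).
truth = np.sin(2 * np.pi * x) + 0.3 * np.sin(40 * np.pi * x)
xb = np.zeros(n)                                # background (zero)
y = truth + 0.05 * rng.standard_normal(n)       # dense, accurate observations
R = 0.05 ** 2 * np.eye(n)                       # obs error covariance

# Single-step analysis with a synoptic-scale length: the mesoscale wave
# projects onto essentially null directions of B and is smoothed away.
xa_single = oi_analysis(xb, y, gaussian_cov(x, L=0.2), R)

# Two-step analysis: broad scales first, then the residual is re-analyzed
# with a short de-correlation length (both scales are assumed values).
xa_step1 = oi_analysis(xb, y, gaussian_cov(x, L=0.2), R)
xa_two = oi_analysis(xa_step1, y, gaussian_cov(x, L=0.02, sigma2=0.3), R)

rmse = lambda a: np.sqrt(np.mean((a - truth) ** 2))
print(f"single-step RMSE: {rmse(xa_single):.3f}")
print(f"two-step RMSE:    {rmse(xa_two):.3f}")
```

In this toy setting the two-step analysis fits the mesoscale component that the single broad-scale analysis filters out, so its RMSE against the truth is markedly lower; the same idea underlies assimilating dense radar observations in a second, short-length-scale step.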