112 Uncertainty in Microphysical Models and the Development of a Bayesian Approach for Statistical-Physical Parameterization of Warm Microphysics

Monday, 9 July 2018
Regency A/B/C (Hyatt Regency Vancouver)
Hugh Morrison, NCAR, Boulder, CO; and M. van Lier-Walqui, M. R. Kumjian, and O. P. Prat

Current microphysical models are hampered by uncertainty in process rate formulations and parameter settings, as well as by numerical challenges. This is particularly true for ice microphysics owing to the complexity of ice particle shapes and types, although liquid microphysics also remains uncertain (e.g., drop collision-coalescence and breakup). Increasing model complexity has generally not reduced this uncertainty; recent model intercomparison studies (vanZanten et al. 2011, JAMES; Xue et al. 2017, MWR) have shown that simulation spread using different bin microphysics schemes is comparable to or even greater than the spread using different bulk schemes. This has precluded the development of benchmark models that could serve as a “ground truth” for developing and testing microphysics schemes, unlike for other physics parameterizations such as line-by-line radiation models. Since there is limited theoretical understanding of many microphysical processes, cloud and precipitation observations are essential for developing and improving these schemes. There is now a wealth of observations available for model constraint, but these observations generally measure cloud and precipitation characteristics that evolve through the net effect of several microphysical and dynamical processes, rather than measuring the process rates themselves. Thus, model developers face the challenge of combining limited process-level theoretical knowledge with indirect observational information in order to improve microphysics schemes.

In the field of statistics, the incorporation of new observational information into an existing model with some level of uncertainty is posed as a probabilistic problem, where each piece of information is described by a probability density function. When posed in this way, Bayes' theorem defines the solution. We suggest that the parameterization of microphysics should be viewed as a Bayesian problem. In other words, given the fundamental process-level uncertainty associated with microphysics combined with the considerable amount of observational data now available, we argue it is worthwhile to embed elements from statistics explicitly within microphysics schemes in order to rigorously constrain them with observations. Based on this idea, we have developed a novel bulk microphysics scheme to simulate warm precipitating clouds, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS). Unlike other schemes, BOSS treats parameters and scheme structure alike as flexible and subject to constraint by observations or prior knowledge. No assumptions are made regarding the functional form of the drop size distribution (DSD) or the mathematical form of the process rates, which are instead generalized as sums of power laws. This flexibility allows the complexity of BOSS to be tailored to any set of observational constraints and underlying theoretical knowledge. For example, BOSS can systematically vary the number and choice of prognostic DSD moments (single-, double-, triple-moment, etc.), and can also vary the number of power-law terms used to model microphysical process rates. Here we investigate constraint of BOSS via synthetic polarimetric radar observations within an idealized one-dimensional rain shaft model. Special attention is given to comparing versions of BOSS with varying levels of complexity, and to the extent to which estimates of uncertainty in BOSS capture errors associated with inadequate microphysical models.
Broader implications for using uncertain schemes to investigate microphysical processes within a rain shaft will be discussed.
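The two ingredients described above, process rates generalized as sums of power laws of prognostic DSD moments and Bayesian constraint of their coefficients by observations, can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the actual BOSS formulation: the single-term "true" rate, the coefficient values, the noise level, and the grid-based posterior over only the prefactor (with exponents held fixed) are all chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def process_rate(moments, log_a, b):
    """Generalized process rate as a sum of power-law terms:
        rate = sum_j exp(log_a[j]) * prod_k moments[k] ** b[j][k]
    (hypothetical two-moment form; all coefficients illustrative)."""
    M = np.asarray(moments, dtype=float)
    return sum(np.exp(la) * np.prod(M ** bj) for la, bj in zip(log_a, b))

# Synthetic "truth": a single power-law term linking two DSD moments to a rate.
true_log_a = np.log(2.0e-3)
true_b = np.array([1.1, 0.4])
M_obs = rng.uniform(1.0, 10.0, size=(20, 2))   # 20 synthetic moment pairs
rate_obs = np.array([process_rate(m, [true_log_a], [true_b]) for m in M_obs])
rate_obs *= np.exp(0.05 * rng.standard_normal(20))  # 5% multiplicative noise

# Bayes' theorem on a grid for the prefactor, exponents fixed:
#   posterior(log a | obs)  ∝  likelihood(obs | log a) × prior(log a)
log_a_grid = np.linspace(np.log(1e-4), np.log(1e-1), 501)
log_like = np.array([
    -0.5 * np.sum(
        (np.log(rate_obs)
         - np.log([process_rate(m, [la], [true_b]) for m in M_obs])) ** 2
        / 0.05 ** 2
    )
    for la in log_a_grid
])
log_prior = np.zeros_like(log_a_grid)          # flat prior in log a
log_post = log_like + log_prior
post = np.exp(log_post - log_post.max())       # unnormalized posterior
map_log_a = log_a_grid[np.argmax(post)]        # posterior mode recovers log a
```

Working in log space for both the prefactor and the rates keeps the power-law likelihood well conditioned; in a real application the exponents and the number of terms would be uncertain as well, and the posterior would be explored with Markov chain Monte Carlo rather than a one-dimensional grid.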
