J5A.2 Towards Well-Calibrated Stochastic Neural Network Convective Parameterizations

Tuesday, 30 January 2024: 8:45 AM
345/346 (The Baltimore Convention Center)
Jerry Lin, University of California, Irvine, Irvine, CA; and E. Wong-Toi, S. Mandt, and M. Pritchard

Neural network parameterizations hold tremendous potential for circumventing the intractable computational cost of explicitly resolving subgrid processes in climate models. However, this promise is jeopardized by questionable reliability on out-of-distribution data. Because neural networks are data-driven, unphysical extrapolation resulting from learned spurious correlations may be inevitable. Unfortunately, deterministic neural networks that yield point estimates can only be diagnosed for unphysical extrapolation ex post facto, against a more computationally expensive reference simulation, which is a non-starter if they are to be used operationally. While stochastic neural network parameterizations have been proposed to account for the chaotic nature of the dynamical systems they are designed to replace, comparatively little attention has been given to using their distributional predictions to detect unphysical extrapolation a priori. In this work, we investigate a variety of methods for ensuring that these stochastic predictions are well-calibrated; that is, wider predicted distributions should correspond to inference on out-of-distribution data, and narrower ones to in-distribution data. We show that this need not come at the cost of competitive offline fits, and we lay the groundwork for using this approach to circumvent cases of online instability.
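The calibration notion above can be illustrated with a probability-integral-transform (PIT) check, a standard diagnostic for distributional predictions. The abstract does not specify the authors' method, so the Gaussian predictive form, the function names, and the synthetic data below are illustrative assumptions: a well-calibrated predictor yields roughly uniform PIT values, while an overconfident one (predicted spread too narrow) piles PIT mass at the extremes.

```python
import numpy as np
from math import erf, sqrt

def gaussian_cdf(y, mu, sigma):
    # CDF of N(mu, sigma^2) evaluated at y.
    return 0.5 * (1.0 + erf((y - mu) / (sigma * sqrt(2.0))))

def pit_values(y, mu, sigma):
    # Probability integral transform: each observation passed through
    # its own predicted CDF. Calibrated predictions give uniform PITs.
    return np.array([gaussian_cdf(yi, mi, si)
                     for yi, mi, si in zip(y, mu, sigma)])

def calibration_error(pit, n_bins=10):
    # Mean absolute deviation of the empirical PIT histogram
    # from the uniform histogram (1/n_bins per bin).
    hist, _ = np.histogram(pit, bins=n_bins, range=(0.0, 1.0))
    freq = hist / len(pit)
    return np.abs(freq - 1.0 / n_bins).mean()

# Synthetic predictions: true observation noise has standard deviation 1.
rng = np.random.default_rng(0)
mu = rng.normal(size=5000)
y = mu + rng.normal(size=5000)

# Calibrated predictor (sigma matches the noise) vs. an overconfident one.
well = calibration_error(pit_values(y, mu, np.ones(5000)))
over = calibration_error(pit_values(y, mu, 0.3 * np.ones(5000)))
print(well, over)
```

Run on this synthetic data, the overconfident predictor shows a much larger calibration error than the matched one, which is the property the abstract asks of a usable a priori extrapolation signal.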