83rd Annual Meeting

Monday, 10 February 2003: 9:45 AM
Neural networks for nonlinear post-processing of model output
Caren Marzban, CAPS/Univ. of Oklahoma, Norman, OK and Univ. of Washington, Seattle, WA
Many numerical models employ some form of statistical post-processing to improve their performance. This post-processing can be carried out in several ways; two relatively common methods in the atmospheric sciences are known as Perfect-Prog and Model Output Statistics (MOS). Although each has specific pros and cons, the latter is better suited to some of the more contemporary approaches. In MOS, one typically derives a set of linear regression equations that relate the output of the model at some time to actual observations of the corresponding quantities at that time.
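A MOS-style correction of the kind described above can be sketched as an ordinary least-squares fit relating model output to contemporaneous observations. The data below are synthetic stand-ins (no real model output is used), and the single-predictor setup is a deliberate simplification of operational MOS, which typically uses many predictors:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Synthetic "raw" model temperature forecasts (hypothetical data).
model_temp = rng.normal(15.0, 5.0, n)
# Synthetic observations: the model is assumed biased and mis-scaled.
obs_temp = 0.8 * model_temp + 2.0 + rng.normal(0.0, 1.0, n)

# MOS step: least-squares fit of  obs ≈ a * model + b  on a training sample.
X = np.column_stack([model_temp, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, obs_temp, rcond=None)

# Post-processed (corrected) forecasts.
corrected = X @ coef
```

Applied to independent data, the fitted equation replaces each raw forecast with its regression-corrected value, which removes systematic bias and scale errors.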

Recently, nonlinear methods have also been used for post-processing. Neural networks (NNs) constitute one such method. Although the choice of nonlinear method is not unique, NNs are generally useful because, in addition to being able to approximate a large class of functions, they are less inclined to overfit data than some other nonlinear methods, given certain mild conditions. The latter is a consequence of the *linear* growth of the number of NN parameters with the number of independent variables; by contrast, the number of parameters in polynomial regression grows exponentially with the number of independent variables. This is not to imply that NNs are a panacea; they are simply one of the more "convenient" nonlinear models.
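The parameter-count contrast can be made concrete with two small counting functions. Both are illustrative sketches under stated assumptions: the NN is taken to have a single hidden layer of fixed size, and the polynomial is taken to include all multiplicative interaction terms among the predictors (one coefficient per subset of variables), which is one way the count becomes exponential in the number of inputs:

```python
def nn_params(d: int, hidden: int = 10) -> int:
    """Parameters in a one-hidden-layer NN with d inputs: hidden*(d+1)
    weights and biases into the hidden layer, plus hidden+1 for a single
    linear output node. Linear in d for fixed hidden-layer size."""
    return hidden * (d + 1) + (hidden + 1)

def interaction_terms(d: int) -> int:
    """Coefficients in a multilinear polynomial containing every
    cross-product (interaction) term of d variables: one per subset
    of the variables, i.e. 2**d. Exponential in d."""
    return 2 ** d
```

For example, going from 10 to 20 predictors adds a fixed number of NN weights, while the full-interaction polynomial's coefficient count multiplies by 2 with every added predictor.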

In this talk, some general approaches to model post-processing will be reviewed. The review will be followed by a detailed and pedagogical example. The model underlying the example is the Advanced Regional Prediction System (ARPS), whose temperature forecasts are post-processed by NNs. Specifically, 31 stations are considered, and an NN is developed for each. It is shown that the model temperature forecasts are improved in terms of a variety of performance measures: an average 40% reduction in mean-squared error across all stations is accompanied by average reductions in bias and variance of 70% and 20%, respectively.
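The performance measures quoted above are linked by a standard decomposition: the mean-squared error of a forecast equals the squared bias of the errors plus their variance, so reducing bias and error variance reduces MSE. A minimal sketch of this identity on synthetic numbers (not the ARPS/station data from the talk):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic observations and a forecast assumed to be biased and noisy.
obs = rng.normal(10.0, 3.0, 500)
forecast = obs + 1.5 + rng.normal(0.0, 2.0, 500)

err = forecast - obs
bias = err.mean()        # mean error (bias)
var = err.var()          # variance of the error (population form, ddof=0)
mse = np.mean(err ** 2)  # mean-squared error

# Identity: MSE = bias**2 + variance of the error.
assert np.isclose(mse, bias ** 2 + var)
```

With the population variance (ddof=0) the identity is exact, which is why an average 70% bias reduction and 20% variance reduction translate directly into a sizeable MSE reduction.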
