Monday, 11 January 2016: 1:30 PM
Room 242 (New Orleans Ernest N. Morial Convention Center)
Model benchmarking has typically been conducted using tools that describe the fidelity of a model's outputs compared with observations, that is, by measuring how close the model's residual error comes to zero. However, a model with small residual error in an output may not properly represent the internal process couplings, and as a result may lack generality or suffer from excessive complexity. Models exhibit emergent functional structure that may not match the intended design; for example, a land surface flux model may or may not need to be run "online" with full bidirectional coupling to the atmosphere, depending on the particulars. An alternative means of benchmarking model performance is to compare a model's internal functional structure with observations, as represented by an Information Flow Dynamical Process Network (DPN), which is an application of Bayesian statistics. Using this technique, we anticipate that a functionally adequate model DPN compares favorably with the observed DPN in five respects: (1) presence of information flow couplings between components, (2) information content of each coupling, (3) best DPN matches at realistic and field-measured parameter values, (4) switching on and off of DPN couplings at realistic forcing values, and (5) minimization of residual error where the DPN matches best. Model structure and parameterization can then be chosen based on the DPN rather than on a traditional benchmark, or by balancing the two. Additionally, a modeled DPN can be explored to evaluate the range of parameters and forcings over which the model remains functionally valid, and the thresholds at which the model's emergent functional structure fundamentally changes. This technique provides a fundamentally different way to benchmark and diagnose model performance.
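Information flow between paired time series in process networks is commonly quantified with transfer entropy; the following is a minimal sketch under that assumption, not the authors' implementation. The function names, the plug-in histogram estimator, the bin count, and the fixed edge threshold are illustrative choices; in practice, significance thresholds for DPN edges are usually derived from shuffled-surrogate tests rather than a fixed cutoff.

```python
import numpy as np

def transfer_entropy(x, y, bins=8, lag=1):
    """Plug-in estimate of transfer entropy T(X -> Y) in bits:
    T = sum p(y_f, y_p, x_p) * log2[ p(y_f | y_p, x_p) / p(y_f | y_p) ]."""
    yf, yp, xp = y[lag:], y[:-lag], x[:-lag]  # future of Y, past of Y, past of X

    def disc(v):
        # Equal-width discretization into integer bin labels 0..bins-1
        edges = np.linspace(v.min(), v.max() + 1e-12, bins + 1)
        return np.clip(np.digitize(v, edges) - 1, 0, bins - 1)

    yf, yp, xp = disc(yf), disc(yp), disc(xp)
    # Joint probability p(y_future, y_past, x_past)
    p_xyz, _ = np.histogramdd(np.column_stack([yf, yp, xp]),
                              bins=(bins, bins, bins),
                              range=[(0, bins)] * 3)
    p_xyz /= p_xyz.sum()
    p_yz = p_xyz.sum(axis=2)       # p(y_future, y_past)
    p_zx = p_xyz.sum(axis=0)       # p(y_past, x_past)
    p_z = p_xyz.sum(axis=(0, 2))   # p(y_past)
    te = 0.0
    for i in range(bins):
        for j in range(bins):
            for k in range(bins):
                p = p_xyz[i, j, k]
                if p > 0:  # marginals are then also nonzero
                    te += p * np.log2(p * p_z[j] / (p_yz[i, j] * p_zx[j, k]))
    return te

def dpn_adjacency(series, threshold=0.1, **kw):
    """Directed coupling matrix: edge i -> j kept if TE exceeds threshold.
    (A fixed threshold is a simplification; surrogates are the usual test.)"""
    n = len(series)
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                te = transfer_entropy(series[i], series[j], **kw)
                A[i, j] = te if te > threshold else 0.0
    return A

# Synthetic demo: y is driven by x at lag 1, so the estimated DPN
# should show a strong x -> y coupling and a much weaker y -> x one.
rng = np.random.default_rng(0)
x = rng.normal(size=4000)
y = np.empty_like(x)
y[0] = 0.0
y[1:] = x[:-1] + 0.1 * rng.normal(size=x.size - 1)
A = dpn_adjacency([x, y])
```

In this framing, benchmarking amounts to computing `A` from both model output and observations and comparing which couplings are present and how much information each carries, rather than comparing the outputs themselves.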