92nd American Meteorological Society Annual Meeting (January 22-26, 2012)

Tuesday, 24 January 2012: 4:30 PM
A Proving Ground As a Foundation for Testbed Activities - A Path to Operations for Coastal Oceanographic Models and Their Components
Room 337 (New Orleans Convention Center)
Douglas R. Levin, Washington College, Chestertown, MD

The foundation for a “Proving Ground” that provides a path for adopting improved models and their requisite components into federal operational centers is offered for consideration. The Proving Ground (PG) concept includes a modeling Testbed that evaluates and validates advances in physical and ecologically focused coastal oceanographic models, considers their coupling with atmospheric models, assesses model components, and ultimately facilitates their transition into federal operational centers. A mature PG coordinates a formal process for advancing a model, or its components, through a structured, well-defined, timely, and coordinated certification process.

The PG concept was derived from a review of the inaugural year of the U.S. IOOS/SURA modeling Testbed, whose overriding objective is to improve coastal oceanographic models, where “improved” has multiple connotations. For that Testbed, four teams were organized: three topical teams addressing the issues of concern perceived to have the highest probability of benefiting from the Testbed (Coastal Inundation, Shelf Hypoxia, and Estuarine Hypoxia), and a fourth, Cyberinfrastructure, designed to help the topical teams streamline their modeling process through data standardization, conversion, and skill assessment.

Below, the components of a Proving Ground are described, defined, and offered for review and comment. The PG is divided into four teams: Oversight, Standardization, Evaluation, and Validation. A model or component in the PG is considered transitioned into operations once it has been “certified”. This design is conceptual and is offered for discussion and refinement.

The Oversight Team coordinates all aspects of the Proving Ground. This team is the portal through which all work orders are generated and through which progress reports and certifications are communicated. Oversight includes synchronization, customer service, and the linking of model results to societal needs. It also encompasses URL centralization; training, education, and outreach; lessons learned; results messaging (both internal and external); programmatic organization; and advocacy, and it represents the gateway for academic innovation that may prove advantageous to the federal operational centers.

The Standardization Team creates common formats so that models, their components, and their outputs are universally usable and interchangeable. Candidate responsibilities for this team include standardizing the criteria for certification of model components, covering, but not limited to, sensor data, observations, data formats, metadata, model output and visualization, term usage, interoperability, benchmarking, and archiving. A sketch of one such automated conformance check follows.
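As an illustration of the kind of automated check the Standardization Team might certify, the following is a minimal sketch, in Python, of a metadata conformance test for NetCDF model output. The required-attribute lists and the file name are hypothetical placeholders rather than an endorsed standard, and the netCDF4 library is assumed to be available.

    from netCDF4 import Dataset

    # Hypothetical attribute requirements; a real Standardization Team would
    # publish the certified lists (e.g., drawn from the CF conventions).
    REQUIRED_GLOBAL_ATTRS = ["title", "institution", "source", "Conventions"]
    REQUIRED_VARIABLE_ATTRS = ["units", "standard_name"]

    def check_metadata(path):
        # Collect missing global and per-variable attributes in a NetCDF file.
        problems = []
        with Dataset(path) as nc:
            for attr in REQUIRED_GLOBAL_ATTRS:
                if attr not in nc.ncattrs():
                    problems.append("missing global attribute: " + attr)
            for name, var in nc.variables.items():
                for attr in REQUIRED_VARIABLE_ATTRS:
                    if attr not in var.ncattrs():
                        problems.append("variable " + name + ": missing " + attr)
        return problems

    if __name__ == "__main__":
        for issue in check_metadata("model_output.nc"):  # hypothetical file name
            print(issue)

A check of this kind could run automatically whenever a model or component is submitted, giving the downstream teams a consistent starting point.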
The Evaluation Team evaluates models or model components for transition to operations. Evaluation might include sensor evaluation, data analysis, the coupling of observations with sound science, and a determination of the resources needed to maintain and sustain the Proving Ground. Once a model or component has been evaluated and its improvements are deemed significant enough for advancement, it moves into the Validation process.

The Validation Team “validates” models or model components deemed ready for certification. A model that has previously been certified for operations may be improved by, for example, an adjustment of its algorithm, the inclusion of new data, or other means; the updated model version will require validation before it is recertified for operations.

The following items span both Evaluation and Validation: Model Uncertainty, Process Coupling, Model Coupling, Stability, Economics of Operation (Cost/Benefit), Cybertoolbox, Hardware Synergy, Inter- and Intra-Model Comparisons, HPC Capacity, and Skill Assessment (a sketch of a simple skill-assessment computation appears below).

Timeline and streamlining considerations cut across all aspects of the Proving Ground organization. At regular intervals, to be determined by the Standardization Team, each aspect of the Proving Ground should be independently reviewed to determine whether contraction, expansion, or redesign can accelerate the movement of a model or model component into operations. Streamlining may be recommended by any team within the Proving Ground and is coordinated by the Oversight Team. Streamlining should always be accompanied by a cost/benefit analysis demonstrating the economic benefit of the improvement.

Internal and external independent review groups will be established with the authority to guide and facilitate each of the PG components. An internal review is an introspective look at the processes carried out within the four teams (Oversight, Standardization, Evaluation, Validation). An external review recruits independent subject-matter experts who periodically review nominated issues and the general workings of the Proving Ground. Reviews may include the discovery, and recommended investigation, of perceived gaps whose closure could improve or streamline the Proving Ground process.

The accompanying figure shows the team and sub-component organization of the Proving Ground.
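As an illustration of the Skill Assessment component, the following is a minimal sketch, in Python, of two common model-observation comparison metrics: root-mean-square error and the Willmott (1981) index of agreement. The choice of metrics and the sample values are illustrative assumptions, not a description of the Testbed's actual skill-assessment suite.

    import numpy as np

    def rmse(model, obs):
        # Root-mean-square error between paired model and observation values.
        return float(np.sqrt(np.mean((model - obs) ** 2)))

    def willmott_skill(model, obs):
        # Willmott (1981) index of agreement: 1.0 is perfect, 0.0 is no skill.
        obs_mean = obs.mean()
        num = np.sum((model - obs) ** 2)
        den = np.sum((np.abs(model - obs_mean) + np.abs(obs - obs_mean)) ** 2)
        return float(1.0 - num / den)

    # Illustrative values only; a real comparison would use co-located,
    # time-matched model output and observations.
    obs = np.array([1.0, 1.4, 1.9, 2.3, 2.0, 1.5])
    model = np.array([1.1, 1.5, 1.7, 2.4, 2.1, 1.3])
    print("RMSE =", round(rmse(model, obs), 3))
    print("Willmott skill =", round(willmott_skill(model, obs), 3))

Agreed metrics of this kind would allow inter- and intra-model comparisons to be reported on a common scale across the Proving Ground.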
