To explore strategies for overcoming some of these limitations, the National Science Foundation in 2003 funded a five-year, $11.25M project known as Linked Environments for Atmospheric Discovery (LEAD). Involving more than 100 researchers across nine institutions, LEAD has created an integrated, scalable system that allows meteorological resources, and their associated cyberinfrastructure, to operate as dynamically adaptive, on-demand, grid-enabled systems that can a) change configuration rapidly and automatically in response to weather; b) respond to decision-driven inputs from users; c) initiate other processes automatically; d) steer remote observing technologies, such as Doppler radars, to optimize data collection for the problem at hand; and e) provide the fault tolerance necessary to achieve required levels of performance.
In this paper we present results from idealized numerical simulations of deep convective storms in which we pose the following question: Given fixed computational resources and a given weather scenario, what configuration (e.g., grid spacing, number of nested grids, domain size, start and stop times, observations, physics options) of a numerical prediction model will yield the best forecast? Our experiments are conducted with the WRF model, and a variety of objective and subjective measures are used to assess forecast quality and utility. By spanning a clearly defined parameter space, we are able to evaluate a cost function that can be used to automatically configure models in a dynamically adaptive framework such as LEAD.
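The configuration-selection idea above can be sketched in code: enumerate a small parameter space, discard configurations whose computational expense exceeds the fixed budget, and pick the one that minimizes a cost function. The parameter values, the compute-cost proxy, and the penalty function below are all illustrative assumptions, not the actual metrics or WRF settings used in the paper.

```python
from itertools import product

# Hypothetical illustration (not the LEAD/WRF implementation): exhaustively
# span a small configuration space and select the configuration that
# minimizes a forecast-quality penalty, subject to a fixed compute budget.

# Candidate model configurations (placeholder values).
GRID_SPACINGS_KM = [1, 3, 9]   # horizontal grid spacing
NUM_NESTED_GRIDS = [1, 2]      # number of nested domains
DOMAIN_SIZES_KM = [300, 600]   # outer-domain width

COMPUTE_BUDGET = 1.0e6         # fixed resource budget (arbitrary units)

def compute_cost(dx_km, nests, domain_km):
    """Rough proxy for expense: grid points per level times nest count."""
    points = (domain_km / dx_km) ** 2
    return points * nests

def forecast_penalty(dx_km, nests, domain_km):
    """Hypothetical quality penalty (lower is better): coarser grids,
    fewer nests, and smaller domains are assumed to degrade the forecast."""
    return dx_km * 10.0 + (3 - nests) * 5.0 + (900 - domain_km) * 0.01

def best_configuration():
    """Return (penalty, config) for the cheapest-penalty feasible setup."""
    best = None
    for dx, nests, dom in product(GRID_SPACINGS_KM,
                                  NUM_NESTED_GRIDS,
                                  DOMAIN_SIZES_KM):
        if compute_cost(dx, nests, dom) > COMPUTE_BUDGET:
            continue  # exceeds the fixed resource budget
        penalty = forecast_penalty(dx, nests, dom)
        if best is None or penalty < best[0]:
            best = (penalty, {"dx_km": dx, "nests": nests, "domain_km": dom})
    return best

if __name__ == "__main__":
    penalty, config = best_configuration()
    print(config, penalty)
```

In a dynamically adaptive framework such as LEAD, such a search would run automatically when a weather scenario triggers a forecast, with the penalty terms replaced by the objective and subjective skill measures evaluated in the experiments.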
Supplementary URL: http://portal.leadproject.org