To begin to address these questions under the auspices of the Jefferson Project at Lake George, we developed a coupled system to physically model the atmospheric, hydrological, and hydrodynamic aspects of the lake and the surrounding region, which is now running operationally. It is linked to a real-time, multi-modal observing system composed of in-situ sensors for atmospheric, stream, and lake measurements, which supports adaptive sampling driven by forecasted conditions from the models. These data, as appropriate, are used for model verification and to improve initial conditions via data assimilation. Another goal for the coupled system is to drive ecological models that explain the interaction of flora and fauna in the lake as a food web. In addition, the coupled physical models drive geometric modelling to enable fixed and interactive visualizations of the watershed. To support both the modelling and the observing systems, there is an underlying cyberinfrastructure that includes a high-performance computing cluster and storage system, along with data acquisition hardware and software. As such, this capability is a testbed for addressing related issues in other lake watersheds, estuaries, and similar ecosystems.
Atmospheric forcing is considered for both the lake circulation and runoff models, the latter of which hydrologically forces the lake. To address the former, we build upon ongoing work with IBM Deep Thunder, which is based, in part, on the ARW core of the Weather Research and Forecasting (WRF) community model. It is run operationally each day (initialized at 00 UTC), nested to 333 m horizontal resolution over the watershed for a 36-hour forecast and to 1 km for 72 hours of regional coverage, with high vertical resolution in the lower boundary layer. The innermost nest operates in the very-large-eddy simulation (VLES) regime with no boundary-layer parameterization.
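As a schematic illustration of this nesting strategy, consider the following sketch of a telescoping hierarchy stepping down by the conventional 3:1 parent-grid ratio. Only the 1 km and 333 m grids and their forecast lengths come from the configuration described above; the outer domains are hypothetical, and this is not the operational WRF setup.

```python
# Illustrative sketch of a telescoping ARW nest hierarchy (hypothetical
# outer domains; only the 1 km and 333 m grids and the 72 h / 36 h
# forecast lengths are taken from the text).

PARENT_RATIO = 3  # conventional odd WRF nesting ratio

domains = [
    {"id": 1, "dx_m": 9000, "forecast_h": 72},  # hypothetical outermost domain
    {"id": 2, "dx_m": 3000, "forecast_h": 72},  # hypothetical intermediate nest
    {"id": 3, "dx_m": 1000, "forecast_h": 72},  # regional coverage (1 km)
    {"id": 4, "dx_m": 333,  "forecast_h": 36},  # Lake George watershed (VLES)
]

# Verify that each nest refines its parent by the expected ratio.
for parent, child in zip(domains, domains[1:]):
    assert round(parent["dx_m"] / child["dx_m"]) == PARENT_RATIO

for d in domains:
    print(f"domain {d['id']}: dx = {d['dx_m']} m, forecast = {d['forecast_h']} h")
```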
For the runoff, a two-dimensional hydrological model has been implemented. For its daily update, it is driven operationally by the 333 m-resolution weather model and has been scaled to utilize 10 m-resolution topographic data for the watershed. It supports fully dynamic routing, with flow driven by both precipitation and snow melt, across more than 400 stream networks with a total length of over 1000 km. The model has been extended to transport dissolved salt originating from road surfaces. One of the applications is to assess potential nutrient loading in the lake driven by stormwater runoff.
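The routing concept can be conveyed with a deliberately simplified sketch: steepest-descent (D8) flow directions on a toy elevation grid, carrying both water and a dissolved-salt load downslope to an outlet. The grid, rainfall, and salt source below are hypothetical, and the operational model's fully dynamic routing on 10 m terrain is considerably more sophisticated than this single-pass accumulation.

```python
import numpy as np

# Toy 5x5 DEM in metres; the operational system uses 10 m topographic data.
dem = np.array([
    [9, 8, 7, 8, 9],
    [8, 6, 5, 6, 8],
    [7, 5, 3, 5, 7],
    [8, 5, 2, 5, 8],
    [9, 6, 1, 6, 9],
], dtype=float)

flow = np.full(dem.shape, 1.0)   # uniform rainfall depth per cell (arbitrary units)
salt = np.zeros(dem.shape)
salt[0, 2] = 10.0                # hypothetical road-salt source cell (kg)

def d8_downhill(dem, r, c):
    """Return the steepest-descent neighbour of (r, c), or None at a pit."""
    best, drop = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            rr, cc = r + dr, c + dc
            if (dr, dc) != (0, 0) and 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                d = dem[r, c] - dem[rr, cc]
                if d > drop:
                    best, drop = (rr, cc), d
    return best

# Visit cells from highest to lowest, passing water and salt to the
# steepest downhill neighbour; pits (here, the outlet) accumulate both.
for r, c in sorted(np.ndindex(dem.shape), key=lambda rc: -dem[rc]):
    nxt = d8_downhill(dem, r, c)
    if nxt is not None:
        flow[nxt] += flow[r, c]
        salt[nxt] += salt[r, c]
        flow[r, c] = salt[r, c] = 0.0

print("water accumulated at the outlet:", flow[4, 2])
print("salt mass reaching the outlet:", salt[4, 2])
```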
For the lake circulation, we build upon ongoing work with IBM DeepCurrent, a three-dimensional hydrodynamic model employing the hydrostatic approximation in the vertical, operationally implemented at high resolution for Lake George (approximately 50 m horizontal and 2.8 m vertical). It is based, in part, on the Environmental Fluid Dynamics Code (EFDC) community model. The model utilizes data derived from a high-resolution bathymetric survey of the lake, and has been extended to address chloride ion transport as an indicator of water quality, given the tendency of sodium to bind with soil en route to the lake. Operationally, it is driven by the aforementioned meteorological and hydrological models to produce a 36-hour forecast once per day.
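The tracer idea can be illustrated with a one-dimensional sketch, not DeepCurrent/EFDC itself: chloride advected by an along-lake current and mixed by horizontal diffusion, with the inflow boundary standing in for runoff-driven loading. The velocity, diffusivity, and chloride load below are illustrative assumptions; the operational model solves the full three-dimensional hydrostatic equations.

```python
import numpy as np

nx, dx = 200, 50.0        # ~50 m horizontal cells, as in the lake model
dt = 60.0                 # time step (s)
u = 0.05                  # m/s, illustrative along-lake current
kappa = 0.5               # m^2/s, illustrative horizontal diffusivity
c = np.zeros(nx)          # chloride concentration (mg/L)
inflow = 20.0             # mg/L, hypothetical stream chloride load

# Stability checks for the explicit upwind/diffusion scheme.
assert u * dt / dx <= 1.0 and kappa * dt / dx**2 <= 0.5

for _ in range(int(36 * 3600 / dt)):           # one 36-hour forecast window
    c[0] = inflow                               # runoff-driven boundary condition
    adv = -u * (c[1:-1] - c[:-2]) / dx          # first-order upwind (u > 0)
    dif = kappa * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    c[1:-1] += dt * (adv + dif)
    c[-1] = c[-2]                               # open outflow boundary

front_km = np.argmax(c < 0.1 * inflow) * dx / 1000
print(f"chloride plume front after 36 h: ~{front_km:.1f} km down-lake")
```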
There is an inherent scale gap between each of these models and the scales at which a food-web model would be driven, which needs to be addressed. The first step was the operational implementation of Deep Thunder at the aforementioned VLES scale. To account for sub-surface flow and thereby properly inform the hydrological model, an off-line land-surface model (NOAH-MP) has been deployed. To better understand the movement of biota in the lake, a particle-tracking model has been implemented by adapting the Stanford Unstructured Nonhydrostatic Terrain-following Adaptive Navier-Stokes Simulator (SUNTANS) community model. For this application, the lake is modelled at approximately 45 m horizontal resolution at its finest.
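The particle-tracking approach can likewise be sketched in simplified form, not as the SUNTANS adaptation itself: passive particles advected through a velocity field with a midpoint (second-order Runge-Kutta) step, plus a random-walk term representing unresolved mixing. The analytic eddy field and diffusivity below are stand-ins for stored hydrodynamic-model output.

```python
import numpy as np

rng = np.random.default_rng(0)

def velocity(x, y):
    """Hypothetical steady eddy field standing in for model currents (m/s)."""
    return 0.1 * np.sin(y / 500.0), 0.1 * np.cos(x / 500.0)

dt, kh = 300.0, 0.1                      # 5-minute step; illustrative diffusivity (m^2/s)
x = rng.uniform(0.0, 1000.0, size=100)   # initial particle cloud positions (m)
y = rng.uniform(0.0, 1000.0, size=100)

for _ in range(288):                     # one simulated day
    u1, v1 = velocity(x, y)
    # Midpoint (RK2) advection step through the velocity field.
    u2, v2 = velocity(x + 0.5 * dt * u1, y + 0.5 * dt * v1)
    # Random-walk term representing unresolved turbulent mixing.
    x = x + dt * u2 + np.sqrt(2.0 * kh * dt) * rng.standard_normal(x.size)
    y = y + dt * v2 + np.sqrt(2.0 * kh * dt) * rng.standard_normal(y.size)

print(f"particle-cloud centroid after one day: ({x.mean():.0f} m, {y.mean():.0f} m)")
```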
We will present an overview of each of these models along with results to date, including the model coupling and computing infrastructure required for operations. We will also discuss the automation of the coupled execution, including monitoring, visualization, and validation. In addition, we will outline recommendations for future work.
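As a simplified indication of what such automation entails, a daily coupled execution can be sketched as a dependency chain in which each model runs only after its upstream forcing is available. The stage names and commands below are hypothetical placeholders, not the project's actual workflow.

```python
import subprocess

# Hypothetical daily pipeline mirroring the weather -> runoff -> lake chain.
STAGES = [
    ("atmosphere", ["echo", "run WRF nests down to 333 m"]),
    ("hydrology",  ["echo", "route runoff on 10 m terrain"]),
    ("lake",       ["echo", "run 36-hour hydrodynamic forecast"]),
    ("postproc",   ["echo", "verify, visualize, and archive output"]),
]

def run_pipeline():
    """Run each stage in order, halting downstream models on failure."""
    for name, cmd in STAGES:
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode != 0:
            raise RuntimeError(f"stage '{name}' failed; halting downstream models")
        print(f"[{name}] {result.stdout.strip()}")

if __name__ == "__main__":
    run_pipeline()
```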