Thursday, 27 January 2011: 12:00 PM
606 (Washington State Convention Center)
A major problem that data managers, scientists, and users encounter when trying to serve, locate, or use environmental data from NOAA is that NOAA's data management systems are insufficiently integrated. This situation reflects past technology choices and management and decision-making strategies that tended to fragment data management rather than unify it. Lines of funding have traditionally been matched to observing systems (satellites, ships, profilers, etc.) and to data life cycle phases (collection/measurement, real-time applications, climate analysis, archive, etc.). Data management has been treated as "owned" by the observing system element or the function, so each observing system element has developed its own individualized approach to data management, often involving unique (and non-interoperable) data formats and protocols. Predictably, these traditions have hindered the development of integrated data management.
In this presentation, we discuss how the UAF project is attempting to overcome these hindrances through the use of several current de facto standards: netCDF, which provides the abstract data model, software libraries, and a persistent binary format; the Climate and Forecast (CF) metadata conventions; the OPeNDAP protocol for web transport of data subsets; THREDDS XML catalogs, which provide a distributed topology connecting data suppliers; and an OGC compatibility layer that provides access to the grids through WMS and WCS. We describe the effort to create a single-entry catalog showcasing a large collection of data resources from NOAA sources as well as NOAA partners, and the array of clients that can tap into this catalog and deliver data and data products seamlessly to the user.
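To illustrate how the pieces fit together, the following sketch shows how a client might resolve OPeNDAP access URLs from a THREDDS XML catalog: each dataset's urlPath is joined to the base path of the catalog's OPeNDAP service. The catalog snippet, server address, and dataset paths below are hypothetical placeholders, not actual UAF catalog entries; real UAF catalogs are far larger and typically nested.

```python
# Sketch: resolving OPeNDAP endpoints from a (hypothetical) THREDDS catalog.
# Only the XML namespace is real; server and dataset paths are invented examples.
import xml.etree.ElementTree as ET

THREDDS_NS = "http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"

CATALOG_XML = """<?xml version="1.0" encoding="UTF-8"?>
<catalog xmlns="http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"
         name="Example catalog">
  <service name="odap" serviceType="OPeNDAP" base="/thredds/dodsC/"/>
  <dataset name="SST analysis (example)" urlPath="example/sst.nc"/>
  <dataset name="Wind fields (example)" urlPath="example/wind.nc"/>
</catalog>"""

def opendap_urls(catalog_xml, server="http://example.noaa.gov"):
    """Join each dataset's urlPath to the OPeNDAP service base path."""
    ns = {"t": THREDDS_NS}
    root = ET.fromstring(catalog_xml)
    base = root.find("t:service", ns).get("base")
    return [server + base + ds.get("urlPath")
            for ds in root.iter("{%s}dataset" % THREDDS_NS)
            if ds.get("urlPath")]

for url in opendap_urls(CATALOG_XML):
    print(url)
```

A client given these URLs could then request metadata or subsets over OPeNDAP without downloading whole files, which is the access pattern the UAF stack is built around.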