Tuesday, 12 January 2016: 4:15 PM
Room 225 (New Orleans Ernest N. Morial Convention Center)
Jared Rennie, Cooperative Institute for Climate and Satellites/North Carolina State University, Asheville, NC; and S. P. Lillo
The demand for weather, water, and climate information is high, with an expectation of long, serially complete observational records for assessing historical and current events in the Earth system. While such assessments have been championed through the monthly and annual State of the Climate reports produced at the National Centers for Environmental Information (NCEI, formerly NCDC), there is also demand for near-real-time information that addresses the needs of the atmospheric science community. The Global Historical Climatology Network-Daily data set (GHCN-D) provides a strong foundation for characterizing the Earth's climate at the daily scale and is the official archive of daily data in the United States. The data set is updated nightly, with new observations ingested at a lag of approximately one day. It adheres to a strict set of quality assurance checks and lays the foundation for other products, including the 1981-2010 U.S. Normals.
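For readers unfamiliar with the archive format, GHCN-Daily distributes each station's record as a fixed-width ASCII ".dly" file, one line per station, month, and element. The minimal sketch below parses one such record line; the column layout is taken from the publicly documented format as we recall it, and the function is illustrative only, not part of GHCNPy.

```python
# Minimal sketch of parsing one GHCN-Daily ".dly" record line (illustrative,
# not part of GHCNPy). Layout assumed from the published format description:
# station ID, year, month, element, then 31 repeating value/flag groups,
# with -9999 marking missing days.

def parse_dly_line(line):
    """Return (station_id, year, month, element, daily_values) for one record."""
    station_id = line[0:11]
    year = int(line[11:15])
    month = int(line[15:17])
    element = line[17:21]               # e.g. TMAX, TMIN, PRCP
    values = []
    for day in range(31):
        start = 21 + day * 8            # each day: 5-char value + 3 flag characters
        raw = int(line[start:start + 5])
        # Units are element-specific (e.g. tenths of degrees C for TMAX/TMIN).
        values.append(None if raw == -9999 else raw)
    return station_id, year, month, element, values
```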
Although GHCN-Daily is a very popular data set, it is only distributed as ASCII text or comma-separated files, and very little visualization is provided to the end user. It therefore makes sense to build a suite of algorithms that not only takes advantage of the data set's spatial and temporal completeness but also helps end users analyze the data in a simple, efficient manner. To that end, a Python package called GHCNPy has been developed to address these needs. Open sourced, GHCNPy uses core packages such as NumPy, SciPy, and matplotlib to perform a variety of tasks. Routines include converting the data to CF-compliant netCDF files, performing time series analysis, and visualizing the data from the station to the global scale; a sketch of this kind of workflow follows below. Here we will present the advancements and challenges of utilizing this data set for public dissemination.
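To make the two routines named above concrete, the following is a hedged sketch of a station time-series plot and a minimal CF-style netCDF export. The function names, variable names, and file layout here are hypothetical illustrations and do not reflect GHCNPy's actual API.

```python
# Illustrative sketch only: plot one station's daily TMAX series and write it
# to a minimal CF-compliant netCDF file. Names are hypothetical, not GHCNPy's.
import numpy as np
import matplotlib.pyplot as plt
from netCDF4 import Dataset, date2num

def plot_station_series(dates, tmax_c, station_id):
    """Plot a daily maximum-temperature series for one station."""
    fig, ax = plt.subplots(figsize=(10, 4))
    ax.plot(dates, tmax_c, lw=0.5)
    ax.set_title(f"Daily TMAX, station {station_id}")
    ax.set_ylabel("Temperature (degC)")
    fig.savefig(f"{station_id}_tmax.png", dpi=150)

def write_cf_netcdf(path, dates, tmax_c, lat, lon, station_id):
    """Write one station's TMAX series to a minimal CF-style netCDF file."""
    with Dataset(path, "w") as nc:
        nc.Conventions = "CF-1.6"
        nc.title = f"GHCN-Daily TMAX for {station_id} (illustrative export)"
        nc.createDimension("time", len(dates))

        time_var = nc.createVariable("time", "f8", ("time",))
        time_var.units = "days since 1900-01-01 00:00:00"
        time_var.calendar = "standard"
        time_var[:] = date2num(dates, time_var.units, time_var.calendar)

        tmax = nc.createVariable("tmax", "f4", ("time",))
        tmax.standard_name = "air_temperature"
        tmax.cell_methods = "time: maximum"
        tmax.units = "degC"
        tmax[:] = np.asarray(tmax_c, dtype="f4")

        lat_var = nc.createVariable("lat", "f4")
        lat_var.units = "degrees_north"
        lat_var[...] = lat
        lon_var = nc.createVariable("lon", "f4")
        lon_var.units = "degrees_east"
        lon_var[...] = lon
```

Encoding time as "days since 1900-01-01" with standard CF attributes keeps such a file readable by common analysis tools, which is one motivation the abstract gives for converting the ASCII archive to netCDF.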