723 GOES-16 Calibration and Validation for ABI Level 2 Algorithms

Tuesday, 9 January 2018
Exhibit Hall 3 (ACC) (Austin, Texas)
Paul A. Van Rompay, Atmospheric and Environmental Research, Inc., Greenbelt, MD; and E. J. Kennelly

With the GOES-16 launch in November 2016, we enter a new era of advanced geostationary imagery and space weather instruments for the hemisphere covering the Americas. The initial high-resolution images, space weather data, and lightning maps have begun to show the promising potential of GOES-16. After the first data were received, the GOES-R program transitioned into a 9-month validation effort addressing the accuracy of the ABI L1b and L2 products generated by the baselined science algorithms running in the Ground System. The evaluation of L2 products began in January 2017 and covers a wide range of products: imagery, cloud properties, air quality, volcanic ash, fire detection, snow cover, winds, rainfall rate, radiation budget, atmospheric profiles, land surface temperature, and sea surface temperature. To cover all of these products, the L2 validation includes an additional 9-month validation period following the assignment of GOES-16 to the operational GOES-East position in November 2017.

For the science products, the Post-Launch Product Testing (PLPT) effort is geared towards three maturity-level milestones: Beta, Provisional, and Fully Validated. The validation efforts are led and performed by the Algorithm Working Group (AWG) product science teams as part of the Calibration and Validation Coordination Team (CVCT), which is managed by the Product Readiness and Operations (PRO) team. As issues are uncovered through this validation effort, Algorithm Defect Reports (ADRs) are written to track each issue and to develop and implement appropriate resolutions. Depending on the nature of the issue, resolution may require algorithm or ground-system parameter updates, software updates, or enhancements. Analysis using offline tools (e.g., Algorithm Test Tools, GFTS PPZ executions) can help diagnose problems and verify results prior to ground system implementation. In addition, prior to making any change to the Operational Environment (OE), algorithm updates are tested in the Development Environment (DE), with side-by-side comparison of DE-vs-OE datasets used to evaluate the changes. In this paper, we will highlight specific examples of the ADR process, from detection of a problem to resolution of the issue in the operational GOES-16 products. We will focus on the use of offline tools, specifically Algorithm Test Tools, to develop algorithm corrections and transition them to the DE. One example will highlight an algorithm parameter update, another a code update, and another a configuration change involving input data. Calibration updates and corrections continue to improve the GOES-16 L2 products, preparing them for operational use in the next era of weather forecasting and research.
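The side-by-side DE-vs-OE evaluation described above can be sketched in miniature as computing difference statistics between two runs of the same product. This is a hypothetical illustration, not the GOES-R Ground System tooling: the function name `compare_products`, the tolerance value, and the simulated arrays are all assumptions; a real comparison would read the L2 NetCDF products and account for quality flags.

```python
# Hypothetical sketch of a DE-vs-OE comparison: compute difference
# statistics between two versions of a product array (e.g., land
# surface temperature in K) after an algorithm parameter update.
import numpy as np

def compare_products(de, oe, tolerance=0.01):
    """Return difference statistics between DE and OE product arrays.

    NaN pixels (e.g., space pixels or QC-flagged retrievals) are
    excluded from the statistics.
    """
    diff = np.asarray(de, dtype=float) - np.asarray(oe, dtype=float)
    valid = diff[~np.isnan(diff)]
    return {
        "mean_diff": float(np.mean(valid)),
        "max_abs_diff": float(np.max(np.abs(valid))),
        # Percentage of valid pixels whose change exceeds the tolerance
        "pct_changed": float(np.mean(np.abs(valid) > tolerance) * 100.0),
    }

# Simulated example: the DE run shifts one pixel by 0.5 K relative to OE
oe = np.array([[280.0, 281.0], [np.nan, 283.0]])
de = oe + np.array([[0.0, 0.5], [0.0, 0.0]])
print(compare_products(de, oe))
```

In practice, statistics like these help decide whether a DE change behaves as intended before it is promoted to the OE.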
