Thursday, 16 January 2020: 2:00 PM
255 (Boston Convention and Exhibition Center)
Handout (1.9 MB)
Radiometric performance monitoring is critical to the quality and fidelity of GOES-R Advanced Baseline Imager (ABI) L1b images and L2 products. It presents a considerable challenge for engineers, because a manual approach is no longer feasible for an instrument with 7856 independent, active detectors across 16 channels. Machine learning is used to capture the diurnally repeating, time-dependent trends in the ABI calibration data through data training, where each trend consists of a time-dependent function and a noise level. The data quality of the instrument calibration datasets is characterized by a set of dimensionless metrics derived from the training outputs. Detector quality assessment and anomaly detection in the instrument data are accomplished with a clustering technique during post-training analysis. Results of this approach are presented with GOES-16 and GOES-17 ABI calibration data. The GOES-17 data in the infrared (IR) channels show very different patterns from the GOES-16 data because of the satellites' different diurnal temperature profiles. A stitching algorithm has been developed to manage the data training for GOES-17 IR channels, which are operated with single and double gain settings in different seasons; it allows a single machine-learning model to serve both data patterns. The software implementation, the Advanced Intelligent Monitoring System (AIMS) tool, has been operational since 2016, and experience shows that the machine-learning approach is highly efficient and effective for anomaly detection, enables much quicker turnaround in troubleshooting, and significantly improves system resiliency.
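
The abstract does not specify the model used by AIMS; the following is a minimal illustrative sketch of the general approach it describes, assuming a harmonic-regression fit for the diurnal trend, a residual-based noise estimate, a dimensionless noise-to-amplitude metric, and DBSCAN as the clustering step. All function names and parameters here are hypothetical.

```python
# Hypothetical sketch: fit a diurnally periodic trend to per-detector
# calibration telemetry, estimate the residual noise level, derive a
# dimensionless quality metric, and cluster detectors to flag anomalies.
# The harmonic model, metric, and DBSCAN choice are assumptions; the
# abstract does not describe the actual AIMS formulation.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)

def fit_diurnal_trend(t_hours, y, n_harmonics=3):
    """Least-squares fit of a mean plus diurnal Fourier harmonics.

    Returns the fitted trend and the residual noise level (std),
    i.e., the 'time dependent function and noise level' of the abstract.
    """
    omega = 2.0 * np.pi / 24.0  # 24-hour fundamental period
    cols = [np.ones_like(t_hours)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.sin(k * omega * t_hours))
        cols.append(np.cos(k * omega * t_hours))
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    trend = A @ coef
    noise = np.std(y - trend)
    return trend, noise

# Simulate calibration series for a handful of detectors; detector 3
# is given extra noise to play the role of an anomalous detector.
t = np.arange(0.0, 72.0, 0.5)             # 3 days, 30-minute cadence
metrics = []
for det in range(8):
    signal = 300.0 + 2.0 * np.sin(2 * np.pi * t / 24.0 + 0.1 * det)
    sigma = 0.5 if det != 3 else 3.0
    y = signal + rng.normal(0.0, sigma, t.size)
    trend, noise = fit_diurnal_trend(t, y)
    # Dimensionless metric: residual noise relative to diurnal amplitude.
    amplitude = 0.5 * (trend.max() - trend.min())
    metrics.append([noise / amplitude])

# Cluster the per-detector metrics; DBSCAN labels sparse outliers as -1,
# which serves here as the post-training anomaly flag.
labels = DBSCAN(eps=0.2, min_samples=2).fit_predict(np.array(metrics))
for det, (m, lab) in enumerate(zip(metrics, labels)):
    flag = "ANOMALOUS" if lab == -1 else "nominal"
    print(f"detector {det}: metric={m[0]:.3f} -> {flag}")
```

Because the metric is dimensionless, detectors from different channels and brightness regimes can be assessed on a common scale, which is what makes a single clustering pass over thousands of detectors practical.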
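The stitching algorithm itself is not detailed in the abstract. One plausible illustration, assuming the single- and double-gain operating periods are reconciled by placing each segment on a common dimensionless scale before training (a hypothetical approach, not the documented AIMS method):

```python
# Hypothetical sketch: "stitch" calibration segments recorded under
# different gain settings onto a common scale so one model can train on
# the combined series. Robust per-segment normalization is an assumption.
import numpy as np

def stitch_gain_modes(segments):
    """Concatenate calibration segments after removing each segment's
    median and scaling by its robust spread (MAD), so a single model
    sees one continuous, dimensionless data pattern."""
    stitched = []
    for seg in segments:
        seg = np.asarray(seg, dtype=float)
        med = np.median(seg)
        mad = np.median(np.abs(seg - med)) or 1.0  # guard zero spread
        stitched.append((seg - med) / (1.4826 * mad))
    return np.concatenate(stitched)

# Usage: single-gain winter data and double-gain summer data become one
# training series for a single trend model.
winter = 250.0 + np.random.default_rng(1).normal(0.0, 0.5, 200)
summer = 510.0 + np.random.default_rng(2).normal(0.0, 1.0, 200)
series = stitch_gain_modes([winter, summer])
print(series.shape)  # (400,)
```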