TJ11.2 Geostationary Lightning Mapper Performance

Tuesday, 8 January 2019: 1:45 PM
North 231AB (Phoenix Convention Center - West and North Buildings)
Scott D. Rudlosky, NOAA/NESDIS/STAR, College Park, MD; and W. J. Koshak and P. Meyers

The Geostationary Lightning Mapper (GLM) is a new instrument undergoing extensive calibration and validation. The GLM performance requirements include full disk coverage, detection efficiency greater than 70%, a flash false alarm rate less than 5%, and location accuracy within half a pixel. The GLM appears to meet these performance requirements despite the artifacts illustrated herein. The instrument relies on spacecraft position and pointing information, along with a coastline identification and navigation procedure, to convert focal plane x, y coordinates to latitude and longitude. Comparisons with ground-based networks allow quantification of the collocation of these datasets.

The GLM seeks to maximize detection efficiency while minimizing the false alarm rate (FAR). The FAR is defined as the rate of false flash detections divided by the average true flash rate. Each subarray is independently tuned to optimize the dynamic range and sensitivity, which vary based on the background scene. Fifty-six Real Time Event Processors (RTEPs) are used to tune the GLM.

The most common sources of false events are described along with efforts to mitigate these effects. Solar intrusion during the eclipse seasons provides an example of false events that can only be mitigated in the longer term. During certain days and times, direct solar illumination nearly reaches the GLM focal plane, producing false detection artifacts that quickly bloom into massive numbers of false events. A blooming filter that quenches the rapid growth of artifacts associated with both sun glint and eclipse effects has been developed and is awaiting implementation.
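The FAR definition and the blooming-filter concept above can be sketched in a few lines of Python. This is a minimal illustration only: the event counts, the 60-second interval, and the growth threshold are hypothetical values chosen for the example, not actual GLM parameters or algorithms.

```python
def false_alarm_rate(false_flash_count, true_flash_count, interval_s):
    """FAR as defined above: rate of false flash detections divided by
    the average true flash rate over the same interval."""
    false_rate = false_flash_count / interval_s
    true_rate = true_flash_count / interval_s
    return false_rate / true_rate


def bloom_filter(event_counts_per_frame, growth_threshold=4.0):
    """Toy blooming filter: flag frames whose event count grows faster
    than growth_threshold relative to the previous frame, quenching the
    rapid growth characteristic of sun glint and eclipse artifacts.
    Returns a keep (True) / reject (False) flag per frame."""
    flags = []
    prev = None
    for count in event_counts_per_frame:
        if prev is not None and prev > 0 and count / prev >= growth_threshold:
            flags.append(False)  # reject the blooming frame
        else:
            flags.append(True)
        prev = count
    return flags


# 2 false flashes against 50 true flashes in one minute -> FAR = 0.04,
# which would meet the <5% requirement.
print(false_alarm_rate(2, 50, 60.0))            # 0.04
# Frames 3 and 4 bloom (>= 4x growth) and are quenched.
print(bloom_filter([3, 4, 30, 120, 5]))         # [True, True, False, False, True]
```

Note that the FAR is a ratio of rates, so the interval length cancels; it is included here only to keep the two rates explicit.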