369730 Applying Deep Learning to Top-of-Atmosphere Radiance Simulation for VIIRS by Community Radiative Transfer Model

Wednesday, 15 January 2020
X. Liang, ESSIC/UMD, College Park, MD; and Q. Liu

The Community Radiative Transfer Model (CRTM), developed at the Joint Center for Satellite Data Assimilation (JCSDA), is a fast radiative transfer model for computing radiances observed by satellite infrared and microwave radiometers. Radiance calculations with polarization for the visible and ultraviolet regions are currently under development and testing. CRTM supports most operational sensors and many research sensors, and its simulations of satellite data have been employed in global data assimilation, validation of sensor data records (SDRs), development of environmental data records (EDRs), and climate research at NOAA, the U.S. Navy, and numerous national and international institutes and universities. With the advent of sensors of high spatial and temporal resolution, such as the Visible Infrared Imaging Radiometer Suite (VIIRS) onboard the Joint Polar Satellite System (JPSS) satellites and the Advanced Baseline Imager (ABI) onboard the Geostationary Operational Environmental Satellite-R (GOES-R), the efficiency of CRTM simulation has become a key issue for global data assimilation and sensor validation. In particular, the computational cost of atmospheric scattering in the visible bands is a well-known issue in the radiative transfer modeling and remote sensing communities.

With the development of state-of-the-art artificial intelligence, deep learning (DL) methods have gradually become popular and have been applied across most scientific and technical fields, including atmospheric and oceanic remote sensing and climate research. The advantage of the DL approach is that, instead of solving the complicated radiative transfer (RT) equations, a statistical nonlinear approximation is trained to generate TOA radiances, making the radiance calculation far more efficient than an RT model. To explore the efficiency and accuracy of DL applied to CRTM, in this study we designed and developed a DL algorithm that emulates CRTM simulations for the five VIIRS thermal emissive M-bands (TEB/M). GFS and ECMWF atmospheric profiles and surface data were used as inputs, and the CRTM brightness temperatures (BTs) for the five VIIRS bands were used as labels. In a preliminary test, we compared two cases: DL training for a single band and for multiple bands. We found that the accuracy of the DL-generated BTs for single-band training was better than the corresponding result in the multi-band case. However, when batch normalization was introduced into the DL algorithm, the accuracies for all bands improved significantly, and all generated BTs became comparable to the single-band training results. The final experiment showed that the mean biases of the DL-generated BTs were less than 0.02 K and the standard deviations were between 0.05 and 0.1 K for all five TEB/M bands globally. The DL calculation is 70 times faster than the CRTM simulation on a non-GPU machine. In the future, we will explore the application of DL to multi-band simulation for a hyperspectral sensor, the Cross-track Infrared Sounder (CrIS).
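To make the training setup concrete, the sketch below shows a minimal multi-band emulator of the kind the abstract describes: a fully connected network mapping atmospheric profile and surface inputs to the five band BTs, with batch normalization after each hidden layer. This is an illustrative assumption only; the abstract does not specify the framework, architecture, layer sizes, or feature count, and the PyTorch code, the feature dimension, and the synthetic placeholder data here are all hypothetical stand-ins for the GFS/ECMWF inputs and CRTM-simulated labels.

```python
# Minimal sketch of a DL emulator of CRTM brightness temperatures.
# All sizes and the training data below are hypothetical placeholders.
import torch
import torch.nn as nn

N_FEATURES = 230  # hypothetical: stacked profile (T, q, ...) + surface variables
N_BANDS = 5       # the five VIIRS thermal emissive M-bands (TEB/M)

class CRTMEmulator(nn.Module):
    """Fully connected network with batch normalization after each hidden
    layer, which the abstract reports improved multi-band accuracy."""
    def __init__(self, n_in=N_FEATURES, n_hidden=256, n_out=N_BANDS):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_in, n_hidden), nn.BatchNorm1d(n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.BatchNorm1d(n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_out),  # one BT output per band (multi-band case)
        )

    def forward(self, x):
        return self.net(x)

model = CRTMEmulator()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Synthetic stand-ins: x would hold GFS/ECMWF profile + surface features,
# y the CRTM-simulated BTs (the label data), here faked around 280 K.
x = torch.randn(1024, N_FEATURES)
y = torch.randn(1024, N_BANDS) * 5.0 + 280.0

for epoch in range(10):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()
```

For the single-band case reported in the preliminary test, the same sketch applies with n_out=1 and one network trained per band; once trained, a forward pass through such a network replaces the full RT calculation, which is the source of the speedup the abstract reports.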
