12C.5 PrecipSRGAN: A Machine Learning Model for Satellite Precipitation Downscaling

Wednesday, 31 January 2024: 5:30 PM
339 (The Baltimore Convention Center)
Yongxin Liu, Colorado State University, Fort Collins, CO; and H. Chen and P. Xie

Precipitation products at high spatial and temporal resolutions are critical for applications such as flood monitoring and water resources management. However, the spatial resolutions of commonly used satellite precipitation products, such as the Integrated Multi-satellitE Retrievals for Global Precipitation Measurement (IMERG, 10-km resolution) and the NOAA/Climate Prediction Center morphing technique (CMORPH, 8-km resolution), are not sufficient for such applications. In this paper, we developed a generative adversarial network (GAN)-based model, PrecipSRGAN, to create a super-resolution version of CMORPH at a spatial resolution of 4 km by 4 km. The super-resolution model uses Stage IV quantitative precipitation estimates (QPEs) as references and incorporates a digital elevation model (DEM) to improve satellite-based precipitation feature extraction. During the training phase, we used a complex-terrain region in Northern California as the demonstration study domain and collected seven years of precipitation data (2016-2022) to train the model. Three independent precipitation events were then selected to evaluate model performance, and the deep learning-based downscaling was compared with a linear interpolation method. Based on the testing events and ground references, the linearly interpolated satellite precipitation product yields a correlation coefficient (CC), normalized mean error (NME), normalized mean absolute error (NMAE), and root mean square error (RMSE) of 0.19, -35%, 63%, and 92.8 mm, respectively, whereas PrecipSRGAN achieves 0.84, -8%, 33%, and 41.7 mm, respectively.
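For readers unfamiliar with the evaluation metrics quoted above, the following is a minimal Python sketch of how CC, NME, NMAE, and RMSE could be computed against collocated reference values such as Stage IV QPEs. The normalization of NME and NMAE by the reference total is an assumption (the abstract does not define it), and the function name and inputs are illustrative only, not the authors' evaluation code.

import numpy as np

def downscaling_metrics(estimate, reference):
    # Compare a downscaled precipitation field against a ground reference.
    # Both inputs are arrays of collocated precipitation values (e.g., mm).
    est = np.asarray(estimate, dtype=float).ravel()
    ref = np.asarray(reference, dtype=float).ravel()

    # Pearson correlation coefficient (CC)
    cc = np.corrcoef(est, ref)[0, 1]

    # Normalized mean error (NME), in percent; normalization by the
    # reference total is an assumption
    nme = 100.0 * (est - ref).sum() / ref.sum()

    # Normalized mean absolute error (NMAE), same assumed normalization
    nmae = 100.0 * np.abs(est - ref).sum() / ref.sum()

    # Root mean square error (RMSE), in the same units as the inputs (mm)
    rmse = np.sqrt(np.mean((est - ref) ** 2))

    return {"CC": cc, "NME_%": nme, "NMAE_%": nmae, "RMSE": rmse}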

Keywords: Super-resolution, GANs, satellite precipitation retrieval product
