J4.2 Exploring the Use of Artificial Intelligence (AI) to Optimize the Exploitation of Big Satellite Data in NWP and Nowcasting

Thursday, 10 January 2019: 8:45 AM
North 221C (Phoenix Convention Center - West and North Buildings)
Sid Ahmed Boukabara, NOAA/NESDIS, College Park, MD; and E. Maddy, N. Shahroudi, R. N. Hoffman, T. Connor, S. Upton, A. Karpovich, C. Sprague, and K. Kumar

Handout (6.4 MB)

One of the most daunting challenges in exploiting data, and satellite data in particular, is the limit that computational resources impose on the volume of data that can be processed. It is estimated that current NWP models, for instance, process/assimilate only between 1% and 5% of the available satellite data, because computational limitations force aggressive thinning; this percentage is likely to decrease further given the expected growth in the number of satellites, the associated data volumes, and emerging observing systems (the Internet of Things). Several developments motivate this work: recent advances in techniques involving machine learning, sometimes called cognitive learning, deep learning, or artificial intelligence; the explosive volume of data currently being received from NOAA and non-NOAA satellites, a trend only expected to grow; and the new NESDIS Strategic Plan and its implementation plan, which call for new numerical approaches to exploit data more efficiently. In this light, this study represents the Center for Satellite Applications and Research (STAR)'s effort to explore these artificial intelligence-based techniques for exploiting all of the data efficiently.

This study describes various machine learning approaches (e.g., deep feed-forward, convolutional, and recurrent networks, Gaussian process regression and other kernel methods) as applied to microwave and infrared (both polar and geostationary) remote sounding, data assimilation and fusion between NWP forecasts and satellite/in situ observations, and predictive tropical cyclone track/intensity estimation. Google's TensorFlow™, Keras, Scikit-Learn, and other open-source Python machine learning frameworks are used to build and train the models and to estimate target parameters from real and simulated input and output datasets.
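As a minimal illustration of the kernel-method family mentioned above, the sketch below fits a scikit-learn Gaussian process to map a handful of simulated predictors (standing in for channel radiances) to a scalar target such as a surface temperature anomaly. The data, dimensions, and kernel choice here are illustrative assumptions, not the configuration used in the study.

```python
# Hypothetical sketch: Gaussian process regression from simulated "radiance"
# predictors to a scalar target. All data here are synthetic stand-ins.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(42)
X = rng.uniform(-1.0, 1.0, (200, 5))          # simulated predictors (5 "channels")
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0.0, 0.05, 200)  # toy target

# RBF kernel for the smooth signal plus a WhiteKernel to absorb observation noise.
gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(0.05), random_state=0)
gp.fit(X[:150], y[:150])

# Predictions come with a per-sample uncertainty estimate, a useful property
# when fusing retrievals with NWP backgrounds.
pred, std = gp.predict(X[150:], return_std=True)
rmse = float(np.sqrt(np.mean((pred - y[150:]) ** 2)))
```

The posterior standard deviation returned alongside the mean is one reason kernel methods are attractive for assimilation-adjacent applications: each estimate carries its own error bar.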

In this presentation we will show results of proof-of-concept algorithms for the above application areas. Our research shows that deep neural network (DNN) inversions of temperature and water vapor profiles, cloud parameters, surface temperature, and spectral emissivity from polar and geostationary MW and IR satellite observations match the performance of traditional 1DVar algorithms using a fraction of the computational resources. We also show how feed-forward networks, as well as convolutional and long short-term memory (LSTM) networks, can, in theory and in practice, be used to accurately correct systematic forecast errors and to predict the time evolution of geophysical parameters such as hurricane track/intensity and 3D winds from satellite and other geophysical inputs. Finally, we will show results of data fusion between physics-based forecast models, simulated and real satellite data, and in situ observations using machine learning techniques.
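To make the forecast-error-correction idea concrete, the sketch below trains a small feed-forward network to learn the state-dependent error of a toy scalar forecast and then subtracts the predicted error. The "truth", the forecast, and the bias structure are all synthetic assumptions for illustration; the study's actual networks, inputs, and error models are not reproduced here.

```python
# Hypothetical sketch: a small feed-forward network learns a state-dependent
# systematic forecast error and removes it. All quantities are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
truth = rng.normal(280.0, 10.0, (3000, 1))            # verifying "analysis" (K)
systematic = 2.0 + 0.1 * (truth - 280.0)              # bias grows with the state
forecast = truth + systematic + rng.normal(0.0, 0.5, (3000, 1))

# Predict forecast-minus-truth from the forecast itself; scaling the input
# keeps the network well-conditioned.
net = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
net.fit(forecast[:2000], (forecast - truth)[:2000].ravel())

# Correct the held-out forecasts by removing the predicted error.
corrected = forecast[2000:, 0] - net.predict(forecast[2000:])
raw_rmse = float(np.sqrt(np.mean((forecast[2000:, 0] - truth[2000:, 0]) ** 2)))
cor_rmse = float(np.sqrt(np.mean((corrected - truth[2000:, 0]) ** 2)))
```

In this toy setting the correction recovers most of the systematic component; in the NWP context the same idea applies with forecast fields and verifying analyses in place of these scalars.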
