J2.3 A Deep Neural Network Deriving Cloud Properties from Satellite Remote Sensing

Monday, 13 January 2020: 11:15 AM
157AB (Boston Convention and Exhibition Center)
Thomas Rink, University of Wisconsin - Madison, Madison, WI; and A. Wimmers

Machine learning represents a new programming paradigm wherein the rules defining the mapping from data to results are learned automatically rather than explicitly programmed via procedural or object-oriented techniques. Deep learning is a specific subset of machine learning comprising hierarchical layers of increasingly relevant representations learned from a very large number of examples of input and expected output. Advances in mathematics, e.g. the backpropagation algorithm, together with rapid advances in GPU computing technology, make it feasible to train these models on the required large datasets. This paper outlines a specific methodology that uses deep learning to create a predictor of cloud properties from satellite remotely sensed brightness temperatures, reflectances, and other ancillary parameters. We employ TensorFlow, a powerful, open-source software library designed for machine learning. Our model is a deep neural network built from blocks of residual layers that has been parallelized to run on multiple GPUs. Timing and accuracy comparisons will be shown, as well as comparisons to TensorFlow 2.0, which handles the parallelization under the hood. We employ TensorFlow's Dataset pipelining to minimize GPU/CPU idle time by prefetching and processing mini-batches on the CPU while the previous batch occupies the GPU for training. We show how this is beneficial for training datasets that are simply too large to fully localize, while keeping the satellite data in a familiar format. Finally, an example of a deployed model will be shown.
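
The sketch below is not the authors' code; it is a minimal TensorFlow 2.x illustration of the two ideas described above: a fully connected network assembled from residual blocks, and a tf.data pipeline that prefetches mini-batches on the CPU while the GPU trains, built inside a MirroredStrategy scope for multi-GPU data parallelism. All feature names, layer sizes, and target dimensions are illustrative assumptions, not the retrieval's actual configuration.

```python
# Minimal sketch (assumed configuration, not the authors' model) of a
# residual-block DNN and a prefetching tf.data input pipeline.
import numpy as np
import tensorflow as tf


def residual_block(x, units):
    """Two dense layers with a skip connection."""
    shortcut = x
    y = tf.keras.layers.Dense(units, activation="relu")(x)
    y = tf.keras.layers.Dense(units)(y)
    y = tf.keras.layers.Add()([shortcut, y])
    return tf.keras.layers.Activation("relu")(y)


def build_model(n_inputs, n_outputs, units=64, n_blocks=4):
    """Deep fully connected network assembled from residual blocks."""
    inputs = tf.keras.Input(shape=(n_inputs,))
    x = tf.keras.layers.Dense(units, activation="relu")(inputs)
    for _ in range(n_blocks):
        x = residual_block(x, units)
    outputs = tf.keras.layers.Dense(n_outputs)(x)  # e.g. retrieved cloud properties
    return tf.keras.Model(inputs, outputs)


# Illustrative stand-ins for brightness temperatures, reflectances, and
# ancillary predictors (features) and the cloud properties to predict (targets).
features = np.random.rand(10000, 16).astype("float32")
targets = np.random.rand(10000, 2).astype("float32")

# tf.data pipeline: shuffle, batch, and prefetch so the CPU prepares the next
# mini-batch while the GPU is busy training on the current one.
dataset = (
    tf.data.Dataset.from_tensor_slices((features, targets))
    .shuffle(buffer_size=4096)
    .batch(256)
    .prefetch(tf.data.AUTOTUNE)
)

# In TensorFlow 2.x, MirroredStrategy handles multi-GPU data parallelism
# "under the hood"; the model only needs to be built inside its scope.
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    model = build_model(n_inputs=16, n_outputs=2)
    model.compile(optimizer="adam", loss="mse")

model.fit(dataset, epochs=5)
```

For datasets too large to hold in memory, the same pipeline would typically be fed from a generator or TFRecord reader in place of `from_tensor_slices`, with `prefetch` still overlapping the CPU-side reading and decoding with GPU training.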