Applications of Deep Learning to Enhance Environmental Sensing Capabilities of Mobile Devices and Other Image Sensors

Wednesday, 15 January 2020
David R. Callender, Creare LLC, Hanover, NH; and J. Bieszczad, M. Shapiro, and J. Milloy

More than 80% of adults in the U.S. own a smartphone. These devices are always on, near their owners, connected to the internet, and equipped with a camera. Moreover, smartphones are just one source of image sensors, which have become ubiquitous. As such, images represent a potentially rich source of real-time weather and environmental data. Indeed, weather reporting and news agencies already use social media as a source of text- and image-based weather observations. However, searching for and extracting this information is labor intensive, and the relevant observations represent a tiny fraction of what is possible given so many mobile devices and publicly available image and video feeds.

To address this gap, we have been developing image-based deep learning algorithms that automatically extract environmental observations from images, including identification of cloud cover and weather-specific labels (e.g., snow, rain, fog), and have integrated them into a data collection platform for crowd-sourced weather observations (https://weathercitizen.org).
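
As a rough illustration only (the abstract does not specify the authors' architecture or label set), a weather-label classifier of this kind could be sketched by fine-tuning a pretrained CNN with a multi-label head; the label list, backbone choice, and training details below are assumptions for illustration:

# Minimal sketch, assuming a pretrained ResNet backbone and an illustrative label set.
import torch
import torch.nn as nn
from torchvision import models

WEATHER_LABELS = ["clear", "cloudy", "rain", "snow", "fog"]  # assumed labels, not from the abstract

# Replace the classifier head of a pretrained ResNet with a multi-label output layer.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(WEATHER_LABELS))

criterion = nn.BCEWithLogitsLoss()  # independent sigmoid per label, so e.g. rain and fog can co-occur
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images, targets):
    """One optimization step on a batch of (N, 3, H, W) images and (N, num_labels) 0/1 targets."""
    model.train()
    optimizer.zero_grad()
    logits = model(images)
    loss = criterion(logits, targets)
    loss.backward()
    optimizer.step()
    return loss.item()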

In addition to identifying weather labels and cloud cover, we have been developing sea state prediction algorithms for image-based wave characterization. A major challenge with deep learning is acquiring sufficient quantities of labeled images. To this end, we have compiled a collection of more than 1 million geospatially located images spanning more than two years sourced from ocean buoys. This curated collection, which we plan to release, includes correlated environmental sensor data including spectral wave data, wind measurements, temperature, pressure, and dew point. In addition to presenting our deep learning sea state characterization algorithms, we will present this dataset and discuss potential future directions.
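
To make the pairing of buoy imagery with correlated sensor data concrete, the sketch below shows one way such a dataset might be used to regress a sea state quantity (e.g., significant wave height) from a buoy camera image; the index file name, column names, and model choice are hypothetical assumptions, not the released dataset's actual schema or the authors' method:

# Minimal sketch, assuming a CSV index pairing each image with co-located sensor readings.
import pandas as pd
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Hypothetical index file; columns assumed: image_path, wave_height_m, wind_speed_mps, ...
index = pd.read_csv("buoy_index.csv")

# Single-output regression head on a pretrained backbone.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 1)
criterion = nn.MSELoss()

def predict_wave_height(image_path: str) -> float:
    """Estimate significant wave height (meters) for a single buoy image."""
    model.eval()
    with torch.no_grad():
        x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
        return model(x).item()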
