16B.5 Automated Treefall Detection using Zero-Shot Deep Learning

Thursday, 1 February 2024: 5:30 PM
338 (The Baltimore Convention Center)
Elizabeth Tirone, CIMMS, Norman, OK; NSSL, Norman, OK; and M. A. Wagner, Z. Chen, D. Candela, E. Rasmussen, and M. C. Coniglio

To improve wind speed estimates from tornadoes in areas lacking structures, new approaches investigate estimating wind speeds from damage to trees and crops. Given the high volume of aerial imagery and the large number of fallen trees it can contain, automated methods are necessary to detect treefall (and other tree-damage information). Most machine learning methods used for image segmentation are trained and tested on hand-labeled datasets that differentiate pixels belonging to a given class from those that do not, a labeling process that is tedious and time consuming. Motivated by advances in deep learning for image processing, we present our efforts toward zero-shot deep learning, which requires no training dataset to segment features of interest. Specifically, the recently released Segment Anything Model (SAM) from Meta has shown promising zero-shot performance. We explore applying SAM to treefall detection using orthomosaics generated via Structure from Motion from UAS imagery collected during the Propagation, Evolution, and Rotation in Linear Storms (PERiLS) field campaign. We demonstrate that SAM provides an avenue to create tornado damage datasets with minimal time and effort, facilitating deep learning applications for tornado damage detection. Automated damage detection will increase the efficiency of tornado damage assessments, provide reliable data needed to better estimate near-surface wind speeds, and improve severe storm climatology, especially in rural areas.
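As a rough illustration of the kind of pipeline described above, the sketch below shows two supporting steps around a zero-shot mask generator such as SAM: tiling a large orthomosaic into overlapping patches, and heuristically filtering candidate masks for elongated, trunk-like shapes. The function names, tile sizes, and elongation threshold here are illustrative assumptions, not the authors' implementation; in practice the masks would come from SAM (e.g., Meta's `segment_anything` package) rather than from synthetic data.

```python
import numpy as np

def tile_orthomosaic(image, tile=1024, overlap=128):
    """Split a large orthomosaic into overlapping tiles so each
    patch fits in memory for a segmentation model such as SAM.
    (tile/overlap values are illustrative assumptions.)"""
    step = tile - overlap
    h, w = image.shape[:2]
    tiles = []
    for y in range(0, max(h - overlap, 1), step):
        for x in range(0, max(w - overlap, 1), step):
            tiles.append(((y, x), image[y:y + tile, x:x + tile]))
    return tiles

def is_treefall_candidate(mask, min_area=200, min_elongation=3.0):
    """Heuristic post-filter for zero-shot masks: fallen trunks tend to
    appear as elongated segments in nadir imagery. Elongation is the
    square-root ratio of the eigenvalues of the mask pixels' covariance
    (a PCA of pixel coordinates). Thresholds are illustrative."""
    ys, xs = np.nonzero(mask)
    if ys.size < min_area:
        return False
    coords = np.stack([ys, xs], axis=1).astype(float)
    cov = np.cov(coords, rowvar=False)
    evals = np.sort(np.linalg.eigvalsh(cov))
    if evals[0] <= 0:
        return True  # degenerate, line-like mask: maximally elongated
    return bool(np.sqrt(evals[1] / evals[0]) >= min_elongation)
```

In a real workflow, each tile would be passed to SAM's automatic mask generator and the resulting binary masks screened with a filter like `is_treefall_candidate` before any human review, which is where the labeling-effort savings come from.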