Wednesday, 15 January 2020: 9:15 AM
156A (Boston Convention and Exhibition Center)
Deep learning, a branch of machine learning, can be used to detect, classify, and segment objects based on user-specified criteria. Prior storm damage assessments have applied deep neural networks (DNNs) to extract damage information at the macroscale. Applying deep neural networks to Unpiloted Aerial System (UAS)-based imagery could provide more accurate information for damage detection and recovery by extracting microscale information such as damage to roofs, trees, and debris. This research examines how deep neural networks can be used to improve storm damage assessments at the microscale. We focus on tornado damage from the June 26, 2018 Eureka, KS tornado. We use UAS-based visible imagery to construct orthomosaics and digital elevation models with Structure from Motion (SfM). We train and apply multiple deep neural networks to detect tornado damage features at the microscale (roof and structural damage, fallen trees, and debris) and classify this damage based on Enhanced Fujita (EF) scale ratings. Our method (UAS-SfM-DNN) demonstrates the ability to detect storm damage at the microscale and classify tornado damage according to the EF scale. This approach would automate and standardize storm damage assessments and could improve disaster loss estimates for the Federal Emergency Management Agency (FEMA), insurance agencies, and the affected public.
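To illustrate the kind of per-tile classification step the UAS-SfM-DNN workflow describes, the following is a minimal sketch in Python/PyTorch. The network architecture, EF-scale label set, tile size, and file name are assumptions for illustration only, not the authors' implementation; in practice a trained detection or segmentation model would replace the toy CNN shown here.

```python
# Illustrative sketch: tile a UAS orthomosaic and classify each tile
# into EF-scale damage categories with a small CNN. Model architecture,
# class labels, and file names are assumed, not the authors' code.
import numpy as np
import torch
import torch.nn as nn
from PIL import Image

EF_CLASSES = ["no damage", "EF0", "EF1", "EF2", "EF3+"]  # assumed label set
TILE = 256  # tile size in pixels (assumed)

class TileClassifier(nn.Module):
    """Minimal CNN producing one EF-scale label per image tile."""
    def __init__(self, n_classes: int = len(EF_CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def classify_orthomosaic(path: str, model: nn.Module):
    """Slide a non-overlapping TILE x TILE window over the orthomosaic
    and return a (rows, cols) grid of predicted EF-scale labels."""
    Image.MAX_IMAGE_PIXELS = None  # SfM orthomosaics can be very large
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0
    rows, cols = img.shape[0] // TILE, img.shape[1] // TILE
    grid = np.empty((rows, cols), dtype=object)
    model.eval()
    with torch.no_grad():
        for r in range(rows):
            for c in range(cols):
                tile = img[r * TILE:(r + 1) * TILE, c * TILE:(c + 1) * TILE]
                x = torch.from_numpy(tile).permute(2, 0, 1).unsqueeze(0)
                grid[r, c] = EF_CLASSES[model(x).argmax(1).item()]
    return grid

if __name__ == "__main__":
    model = TileClassifier()  # in practice, load trained weights here
    print(classify_orthomosaic("eureka_orthomosaic.png", model))  # hypothetical file
```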