16B.6 Flood Detection and Mapping Through Sensor Fusion: A Comparative Study of Multi-Sensor Data Integration with UAV Optical Imagery

Thursday, 1 February 2024: 5:45 PM
338 (The Baltimore Convention Center)
Mulham Fawakherji, North Carolina Agricultural and Technical State University, Greensboro, NC; and L. Hashemi Beni

Accurate and timely mapping of floodwater extent is crucial for supporting emergency managers and first responders in ensuring public safety. To address this challenge, sensor fusion emerges as a vital approach to enhance flood detection and mapping capabilities. The significance of data fusion in flood mapping lies in its ability to capitalize on the unique strengths of each sensor. Synthetic Aperture Radar (SAR) data, acquired from Sentinel-1, provides all-weather imaging capabilities, allowing the detection of inundated vegetation even in challenging conditions. On the other hand, the RGB imagery from UAVs offers high-resolution information about the flood area, providing valuable insights for flood assessment.
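The fusion of complementary sensors described above can be illustrated as a pixel-wise feature stack. The grid sizes, band values, and the assumption that the SAR and RGB layers are already co-registered and resampled to a common grid are illustrative only, not the authors' actual data:

```python
import numpy as np

# Hypothetical co-registered pixel grids on a common 4 x 4 raster:
# Sentinel-1 SAR backscatter (VV, VH, in dB) and UAV RGB (8-bit).
sar = np.random.default_rng(0).normal(-12.0, 3.0, size=(2, 4, 4))
rgb = np.random.default_rng(1).integers(0, 256, size=(3, 4, 4)).astype(float)

# Feature-level fusion: stack bands so every pixel carries a
# 5-element feature vector (VV, VH, R, G, B) for the classifier.
fused = np.concatenate([sar, rgb], axis=0)   # shape (5, 4, 4)
features = fused.reshape(5, -1).T            # shape (16, 5): one row per pixel
```

Stacking the bands this way lets a single per-pixel classifier see both the radar and optical evidence at once, which is the essence of the fusion approach the abstract evaluates.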

In this research, machine learning-based data fusion techniques are compared for precise land cover classification aimed at facilitating flood mapping. The proposed methodology begins with preprocessing steps, specifically filtering and resampling, which are vital for enhancing the quality of the input data. Subsequently, a machine learning-based classifier assigns the input to one of four classes: Open Water, Inundated Vegetation, Dry Vegetation, and Others (any elements not falling into the previous classes). Following the classification phase, a post-processing stage is executed to eliminate noise from the output, ensuring the reliability of the results.
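The abstract does not specify which noise-removal method the post-processing stage uses, so the following is only one plausible sketch: a 3x3 majority (mode) filter that removes isolated misclassified pixels from the output map.

```python
import numpy as np

CLASSES = ["Open Water", "Inundated Vegetation", "Dry Vegetation", "Others"]

def majority_filter(labels: np.ndarray) -> np.ndarray:
    """Replace each pixel's class with the most common class in its
    3x3 neighborhood, a simple post-processing step that suppresses
    isolated misclassified pixels (speckle-like noise)."""
    padded = np.pad(labels, 1, mode="edge")
    out = np.empty_like(labels)
    h, w = labels.shape
    for i in range(h):
        for j in range(w):
            window = padded[i:i + 3, j:j + 3].ravel()
            out[i, j] = np.bincount(window).argmax()
    return out

# A toy classification map: Open Water (class 0) everywhere,
# with one stray Dry Vegetation (class 2) pixel as noise.
noisy = np.zeros((5, 5), dtype=int)
noisy[2, 2] = 2
cleaned = majority_filter(noisy)  # the isolated pixel is smoothed away
```

In practice such filtering trades a small loss of boundary detail for a much cleaner flood map, which matters when the map feeds directly into emergency response decisions.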

Our research strategy utilized Google Earth Engine (GEE) as a cloud computing platform to access and employ a range of supervised machine learning algorithms, including Support Vector Machine (SVM), Random Forest (RF), Gradient Tree Boosting (GTB), and Naive Bayes (NB). GEE allowed us to efficiently process geospatial datasets, including the Sentinel-1 SAR data and UAV RGB imagery. The aim was to assess the performance of the fusion techniques along with GEE's available classifiers in flood mapping and to determine the most effective algorithm for accurately detecting inundated areas.
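A minimal sketch of this classifier comparison, using scikit-learn as a stand-in for GEE's built-in classifiers (in GEE itself these roughly correspond to `ee.Classifier.libsvm`, `ee.Classifier.smileRandomForest`, `ee.Classifier.smileGradientTreeBoost`, and `ee.Classifier.smileNaiveBayes`). The features and labels below are synthetic placeholders, not the study's data:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# Synthetic fused features: 5 bands (VV, VH, R, G, B) per pixel,
# with labels standing in for the four land-cover classes.
rng = np.random.default_rng(42)
X = rng.normal(size=(400, 5))
y = (X[:, 0] + X[:, 2] > 0).astype(int) + 2 * (X[:, 1] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# The four classifier families compared in the study.
classifiers = {
    "SVM": SVC(),
    "RF": RandomForestClassifier(random_state=0),
    "GTB": GradientBoostingClassifier(random_state=0),
    "NB": GaussianNB(),
}
scores = {name: clf.fit(X_tr, y_tr).score(X_te, y_te)
          for name, clf in classifiers.items()}
```

The same train/test split is reused for every classifier so the accuracy scores are directly comparable, mirroring how the study evaluates the algorithms on a common reference dataset.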

More than 50,000 data points were collected from two study areas in North Carolina, USA, for algorithm development and validation. The selected regions experienced varying degrees of flooding caused by Hurricane Florence, which made landfall on September 14, 2018.

In summary, our evaluation encompassed both the performance of the various classifiers and the impact of the different data fusion techniques. These assessments revealed consistent trends, highlighting the efficacy of the SAR & RGB fusion approach combined with the Random Forest (RF) classifier for flood mapping. Across the land cover classes, SAR & RGB fusion consistently achieved higher accuracy than the alternative methods, underscoring the fusion approach's effectiveness in classifying land cover categories.

Notably, for the Inundated Vegetation class, the UAV-based fusion outperformed both standalone SAR and the combination of SAR with Sentinel-2 RGB, improving accuracy by 11% and 1.4%, respectively. Similarly, for the Open Water class, the UAV-based fusion surpassed the other methods, with accuracy advantages of 14.6% and 14.3%, respectively.

These outcomes emphasize the vital role of data fusion in improving classification accuracy, particularly the effective combination of SAR and RGB data with the RF classifier, and underscore the value of fusing multiresolution, multimodal SAR and UAV RGB data for rapid flood mapping and disaster response.
