In a new paper published in Natural Hazards, we applied multiple algorithms to model flood susceptibility in New Orleans. The abstract is below:
"Machine learning (ML) models, particularly decision tree (DT)-based algorithms, are increasingly utilized for flood susceptibility mapping. To evaluate the advantages of DT-based ML models over traditional statistical models in flood susceptibility assessment, a comparative study is needed that systematically compares the performance of DT-based ML models with that of traditional statistical models. New Orleans, which has a long history of flooding and remains highly susceptible to it, is selected as the test bed. The primary purpose of this study is to compare the performance of multiple DT-based ML models, namely DT, Adaptive Boosting (AdaBoost), Gradient Boosting (GdBoost), Extreme Gradient Boosting (XGBoost), and Random Forest (RF), with a traditional statistical model, the Frequency Ratio (FR) model, in New Orleans. This study also aims to identify the main drivers of flooding in New Orleans using the best-performing model. Based on the most recent Hurricane Ida-induced flood inventory map and nine crucial flood conditioning factors, the models' accuracies are tested and compared using multiple evaluation metrics. The findings indicate that all DT-based ML models perform better than FR. The RF model emerges as the best model (AUC = 0.85) among all DT-based ML models in every evaluation metric. This study then adopts the RF model to generate the flood susceptibility map (FSM) of New Orleans and compares it with the prediction of the FR model. The RF model also indicates that low elevation and higher precipitation are the main factors responsible for flooding in New Orleans. This comparative approach therefore offers significant insight into the advantages of advanced ML models over traditional statistical models in local flood susceptibility assessment."
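The workflow the abstract describes — training several DT-based classifiers on conditioning factors, scoring them with ROC AUC, and reading the main flood drivers from the best model's feature importances — can be sketched in a few lines of scikit-learn. This is a minimal illustration, not the paper's pipeline: the data are synthetic, the factor names are illustrative stand-ins for the nine conditioning factors, and the hyperparameters are arbitrary.

```python
# Hedged sketch: compare a single Decision Tree with a Random Forest on
# synthetic "flood conditioning factor" data, score each with ROC AUC, and
# rank factors by the forest's impurity-based importances.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Illustrative stand-ins for the nine conditioning factors (names assumed).
factor_names = ["elevation", "precipitation", "slope", "distance_to_river",
                "land_cover", "soil_type", "drainage_density", "curvature",
                "twi"]

# Synthetic flooded (1) / non-flooded (0) samples.
X, y = make_classification(n_samples=2000, n_features=9, n_informative=5,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

models = {
    "DT": DecisionTreeClassifier(max_depth=8, random_state=0),
    "RF": RandomForestClassifier(n_estimators=300, random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")

# Rank the top three factors by the Random Forest's feature importances.
importances = models["RF"].feature_importances_
ranked = sorted(zip(factor_names, importances), key=lambda t: -t[1])
for fname, imp in ranked[:3]:
    print(f"{fname}: importance = {imp:.3f}")
```

The same pattern extends directly to the boosted variants (AdaBoost, GradientBoosting, XGBoost) by adding them to the `models` dictionary, since they share the fit/predict_proba interface.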