Pielke et al. (2007) have recently produced an updated hurricane damage data set, extending the record both forward and backward in time to cover 1900-2005. The compound Poisson process is refitted to these data. While the basic structure of the stochastic model remains unchanged, a number of features are reexamined. Given the recent debate about the impact of global warming on hurricanes, the question of possible trends in the damage data is revisited. An increasing trend in the frequency of damaging hurricanes is now identified with borderline statistical significance (it was not statistically significant in Katz, 2002). Because a small number of intense hurricanes cause a disproportionate amount of damage, the upper tail of the distribution of damage from individual storms is remodeled using the generalized Pareto distribution from extreme value theory instead of the lognormal. There is strong evidence that this damage distribution has a heavier tail than the lognormal, implying that the detection of any trends in the damage from individual storms will be difficult. The nature of the relationship between ENSO events and hurricane damage is also reanalyzed, with the results suggesting that the connection to the frequency of hurricanes is robust but the connection to the damage from individual storms is not necessarily so.
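As a rough illustration of the model structure described above (not the authors' implementation), the sketch below simulates annual hurricane damage as a compound Poisson process in which storm counts are Poisson and per-storm damages follow a lognormal body spliced with a generalized Pareto upper tail, then fits a GPD to the excesses over a threshold. All parameter values, the threshold choice, and the splicing rule are illustrative assumptions, not estimates from the Pielke et al. (2007) data set.

```python
# Minimal sketch of a compound Poisson damage model with a GPD upper tail.
# All numerical values below are hypothetical placeholders.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)

# Illustrative (hypothetical) parameters
LAMBDA = 1.7            # mean number of damaging storms per year
MU, SIGMA = 19.0, 1.4   # lognormal parameters for per-storm log-damage (body)
THRESHOLD = np.exp(MU + 1.5 * SIGMA)  # splice point for the GPD tail
XI, BETA = 0.5, 0.8 * THRESHOLD       # GPD shape and scale (heavy tail: xi > 0)

def simulate_storm_damages(n_storms):
    """Draw per-storm damages: lognormal body, GPD excesses above the threshold."""
    damages = rng.lognormal(MU, SIGMA, size=n_storms)
    tail = damages > THRESHOLD
    damages[tail] = THRESHOLD + genpareto.rvs(
        XI, scale=BETA, size=tail.sum(), random_state=rng)
    return damages

def simulate_annual_damage(n_years):
    """Simulate total damage per year under the compound Poisson model."""
    totals = np.zeros(n_years)
    for year in range(n_years):
        n_storms = rng.poisson(LAMBDA)
        totals[year] = simulate_storm_damages(n_storms).sum()
    return totals

# Fit a GPD to simulated excesses over the threshold, as one would to the
# upper tail of observed per-storm damage data
storm_damages = simulate_storm_damages(5000)
excesses = storm_damages[storm_damages > THRESHOLD] - THRESHOLD
shape_hat, _, scale_hat = genpareto.fit(excesses, floc=0.0)
print(f"Fitted GPD shape (xi): {shape_hat:.2f}, scale: {scale_hat:.3g}")
```

A positive fitted shape parameter corresponds to the heavy (heavier-than-lognormal) tail discussed above; with such a tail, a handful of extreme storms dominate total damage, which is why trends in per-storm damage are hard to detect.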
REFERENCES
Katz, R. W., 2002: Stochastic modeling of hurricane damage. Journal of Applied Meteorology, 41, 754-762.
Pielke, R. A., Jr., et al., 2007: Normalized hurricane damages in the United States: 1900-2005. Natural Hazards Review (in press).