Normalized error scores provide comparisons of the accuracy of forecast quantities, such as tropical system track and intensity errors. The normalized absolute error score represents the percentage of the time that a value falls within a critical value, Rc. As Rc increases, the skill score asymptotically approaches 1.
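The score described above can be sketched as follows. This is a minimal illustration under an assumed formulation (the function name and data are illustrative, not from the source): the score for a set of absolute errors is simply the fraction of those errors that fall within the critical value Rc, so it tends toward 1 as Rc grows.

```python
# Hedged sketch of a normalized absolute error score: the fraction of
# absolute errors that fall within a critical value Rc.  As Rc increases,
# the score asymptotically approaches 1.

def normalized_error_score(abs_errors, rc):
    """Return the fraction of absolute errors that are within rc."""
    if not abs_errors:
        raise ValueError("abs_errors must be non-empty")
    return sum(1 for e in abs_errors if e <= rc) / len(abs_errors)

# Illustrative example: track errors in km against a 150 km critical value.
track_errors_km = [90.0, 120.0, 200.0, 145.0, 310.0]
score = normalized_error_score(track_errors_km, rc=150.0)
# 3 of the 5 errors are within 150 km, so score == 0.6
```

With a very large Rc, every error falls inside the critical value and the score reaches 1, matching the asymptotic behavior noted above.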
When discussing the relative skill of tropical system forecast models, forecasters and scientists have often published results comparing absolute track and intensity errors. These comparisons relate the various models and their averages over the past several years. Official forecasts are often compared to these model results and to climatology and persistence, either by error or by percentage improvement. While this is useful for comparing ongoing results and relative performance to previous cases, it does not provide information directly related to skill. For example, how often does forecast model X accurately predict a hurricane within 150 km of its actual track? In addition, the errors are calculated for a specific time and generally increase as a function of forecast time. It is therefore useful to combine this information with the absolute error of a quantity to determine a percentage that can be assigned statistically to a model's performance.
To accomplish the goals listed above, we propose a normalized error score that represents the percentage of time that a forecast quantity is within a prescribed critical range. The score can be computed for an interval of time, subdivided into equal increments, and summed over the period. Since most tropical system "best track" results are published at 6-hour intervals, up to four intervals per day can be computed from an absolute error quantity. For a daily error score, the score is calculated over the four 6-hour intervals, and the average of those scores represents the score for that 24-hour period.
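The daily aggregation step above can be sketched as follows. This is an illustrative sketch, not the authors' implementation: it assumes each 6-hour interval already has a score, and averages the four interval scores into one daily score.

```python
# Hedged sketch: a daily score formed as the mean of the scores for the
# four 6-hour best-track intervals in a 24-hour period.

def daily_score(interval_scores):
    """Average four 6-hour interval scores into one daily score."""
    if len(interval_scores) != 4:
        raise ValueError("expected four 6-hour interval scores per day")
    return sum(interval_scores) / 4.0

# Illustrative example: four interval scores for one 24-hour period.
day = daily_score([1.0, 0.75, 0.5, 0.75])
# (1.0 + 0.75 + 0.5 + 0.75) / 4 == 0.75
```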
Results for our 1997 and 1998 FSU Atlantic forecasts will be presented and compared, over the same forecast periods, with the NCEP and NHC model forecast products.