The system was developed for the small (227,000 sq km) southeast Australian State of Victoria. It was described as being capable of generating forecasts for public, aviation, marine and media interests, in languages other than English, and for more than 200 localities in Victoria - a breadth of output far greater than one could ever hope to produce utilising the current labour-intensive systems.
A major benefit of a knowledge-based system is that it incorporates an extensive "bank" of forecaster experience. Ramage (1993) proposed an "iterative" approach to "locking in" improvements in forecasting methodology. The system's skill increases as new knowledge is incorporated into its operation, and progress is thereby gradually made towards the realisation of Ramage's dream. The system is therefore not seen as "yet another" instrument of forecast guidance. Rather, its development is seen as a logical step along the path of having the computer replicate (and ultimately replace) various aspects of the manual side of the forecast process, by systematically "locking in" new knowledge.
The PILOT VERSION of the system was evaluated during November 2001 for the city of Melbourne using a skill score that combines all features of a forecast. The evaluation showed that, although superiority over climatology was achieved, the forecasts (on most measures) proved to be inferior to the official forecasts.
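The text does not specify which combined skill score was used in the evaluation; as a hedged illustration only, a standard way to quantify "superiority over climatology" is the mean-squared-error skill score, where a score above zero beats the climatological reference and a score of one is a perfect forecast. All function names and data below are hypothetical:

```python
# Hypothetical sketch of a climatology-relative skill score.
# SS = 1 - MSE_forecast / MSE_climatology. The actual combined
# score used in the Melbourne trial is not specified in the text.

def mse(pred, obs):
    """Mean squared error of a sequence of predictions."""
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)

def skill_score(forecasts, climatology, observations):
    """Skill relative to a climatological reference forecast."""
    return 1.0 - mse(forecasts, observations) / mse(climatology, observations)

# Toy data: maximum temperatures (deg C); climatology is the long-term mean.
obs  = [24.0, 31.0, 19.5, 27.0, 22.0]
clim = [25.0] * 5
fcst = [24.5, 29.5, 20.5, 26.5, 23.0]

print(round(skill_score(fcst, clim, obs), 3))
```

A score computed this way can be aggregated across elements (temperature, PoP, QPF) to give a single combined measure, which is the spirit of the evaluation described above.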
The system was then "scaled back", new knowledge added, and what has been termed VERSION 1 of the system was then evaluated over a 100-day trial, the results of which were presented to the 19th IIPS Conference (Stern, 2003) (http://www.weather-climate.com/internetforecasts.html). The deficiency evident with the PILOT VERSION appears to have been largely eliminated (especially for precipitation at day 1 and for temperature at days 1, 2, 3, 4 and 5).
On the basis of the results of the 100-day trial, further knowledge was added to the system and VERSION 2, which extends the outlook from 6 to 7 days, underwent a trial (http://www.weather-climate.com/internetforecasts2.html). VERSION 2 utilises:
· An ensemble-forecasting proxy that takes into account the extent of uncertainty associated with the Numerical Weather Prediction (NWP) model output,
· Cyclonicity in deriving the Probability of Precipitation (PoP) and the Quantitative Precipitation Forecasts (QPFs), and
· The sharp maximum temperature gradients associated with moderate ENE flow during summer.
The extent of uncertainty taken into account is "truer" than would be achieved utilising conventional ensemble-forecasting techniques, because the measure is derived directly from an array of actual forecasts. For example, to minimise the Root Mean Square error, regression analysis determines that each Day 4 forecast's departure from normal should be reduced to the following fraction of that departure: 69.6% for maximum temperature, 57.8% for minimum temperature, 25.8% for the QPFs, and 48.4% for the PoPs.
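The regression-based damping described above can be sketched as follows. The damping factor is the least-squares slope through the origin, which is the multiplier of the forecast departure that minimises the RMS error against the observed departure. Only the 69.6% Day-4 maximum-temperature factor is taken from the text; the climatological normal, the raw forecast value, and all names are hypothetical:

```python
# Illustrative sketch (not the operational code): damping a forecast's
# departure from the climatological normal by the regression coefficient
# that minimises root-mean-square error.

def damping_factor(fcst_departures, obs_departures):
    """Least-squares slope through the origin: the k that minimises the
    RMS error of k * forecast_departure against the observed departure."""
    num = sum(f * o for f, o in zip(fcst_departures, obs_departures))
    den = sum(f * f for f in fcst_departures)
    return num / den

def damp(normal, raw_forecast, k):
    """Pull the raw forecast back towards the climatological normal."""
    return normal + k * (raw_forecast - normal)

# Example with the quoted Day-4 maximum-temperature factor (69.6%):
normal = 25.0            # hypothetical climatological normal (deg C)
raw    = 33.0            # hypothetical raw Day-4 maximum forecast
print(damp(normal, raw, 0.696))   # 25 + 0.696 * 8 = 30.568
```

In practice the factor would be fitted once per element and lead time from an archive of past forecast/observation pairs, then applied to each new forecast.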
In conventional ensemble forecasting, the measure is derived from an array of model output generated by imposing a random set of perturbations on the initial analysis. Conventional ensemble forecasting therefore suffers from the disadvantage that the level of uncertainty in the initial analysis is unknown, whereas the uncertainty associated with a database of actual forecasts is known precisely.
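As a toy illustration of the conventional approach (not the actual NWP system), the sketch below imposes random perturbations on the initial state of a simple chaotic map and measures how the ensemble spread grows with lead time; every name, parameter and model choice here is an assumption made purely for illustration:

```python
# Toy illustration of conventional ensemble forecasting: a random set of
# perturbations is imposed on the initial state, and the spread of the
# resulting trajectories estimates forecast uncertainty. The chaotic
# logistic map stands in for an NWP model.
import random
import statistics

def logistic_step(x, r=3.9):
    """One step of the logistic map in its chaotic regime."""
    return r * x * (1.0 - x)

def ensemble_spread(x0, n_members=20, n_steps=15, eps=1e-4, seed=0):
    """Perturb the initial state, run all members forward, return spread."""
    rng = random.Random(seed)
    members = [x0 + rng.uniform(-eps, eps) for _ in range(n_members)]
    for _ in range(n_steps):
        members = [logistic_step(x) for x in members]
    return statistics.stdev(members)

# The spread after 15 steps is far larger than the initial perturbation
# size: the unknown error in the "analysis" is amplified by the model.
print(ensemble_spread(0.4))
```

The point of contrast with the proxy described above is that here the perturbation size `eps` must be guessed, whereas an archive of actual forecasts and observations yields the error statistics directly.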
The results of the trial show that using the ensemble-forecasting proxy is the strategy to follow in order to achieve knowledge-based forecasts that are superior to those presently issued officially.
It may be appropriate to ask, from a philosophical point of view, whether it is premature, at this stage, to move towards computer replication of the manual forecast process. After all, the new National Digital Forecast Data Base (NDFD) of the U.S.A. National Weather Service (NWS) (Glahn & Ruth, 2003) allows for considerable manual involvement in its operation. A move to computer replication would result in a paradigm shift in the nature of the forecasting meteorologist's role, which would increasingly become one of utilising sophisticated methodologies to analyse the output of the automated system, and of implementing changes to it consequent upon the analyses.
However, having the computer replicate various aspects of the manual forecast process, in order to make possible the production of a greatly increased number and variety of forecast products, is already happening. For example, the highly competitive environment in which the New Zealand weather service finds itself has resulted in it moving down this pathway (Linton & Peters, 2003).
Furthermore, there is pressure in the U.S.A. to allow commercial operators to take over the government's traditional role in the provision of weather services (excepting the delivery of urgent warnings to protect life and property). To illustrate, AccuWeather's 31 January 2003 Media Release "expressed regret that the (National Research Council) report did not recommend that the National Weather Service end its practice of issuing routine weather forecasts."
Regardless of philosophy, and outside of the legislative framework, competitive pressures may determine the future as private operators (and also the general public) realise that the new technologies allow for the development and implementation of forecasting systems capable of providing a breadth of output far greater than one could ever hope to produce utilising the current approaches. As Brooks (1995) wrote: "technology, which initially allowed humans to make routine weather forecasts, will soon close that avenue of human endeavour ... (and thereby permit) concentration on severe events."
Supplementary URL: http://www.weather-climate.com/internetforecasts2.html