
AMERICAN METEOROLOGICAL SOCIETY 2007 ANNUAL MEETING

Harvey Stern

23rd Conference on Interactive Information and Processing Systems, San Antonio, Texas, USA, 14-18 January 2007

Increasing weather forecast accuracy by mechanically combining human and automated predictions using a knowledge based system.


Knowledge Based System

Over recent years, the author has been involved in the development of a knowledge based weather forecasting system.

Various components of the system may be used to generate:
- worded weather forecasts for the general public,
- terminal aerodrome forecasts (TAFs) for aviation interests,
- marine forecasts for the boating fraternity, and
- weather graphics for the media.

The knowledge-based system generates these products by applying a set of forecasting tools (‘smart tools’) that interpret NWP model output in terms of a range of weather parameters, as sketched below.
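As an illustration only, here is a minimal sketch of the kind of mapping such a ‘smart tool’ might perform for a single site; the field names, the fixed bias correction and the PoP thresholds are invented assumptions for this sketch, not the system's actual rules.

```python
# Illustrative sketch of a "smart tool": interpret raw NWP output for one site
# in terms of forecast weather parameters. Thresholds and the bias correction
# are assumptions, not the operational system's rules.

def smart_tool_max_temp(nwp_temp_2m_c: float, site_bias_c: float = -0.8) -> float:
    """Interpret an NWP 2 m temperature (deg C) as a site maximum temperature."""
    return round(nwp_temp_2m_c + site_bias_c, 1)

def smart_tool_pop(nwp_precip_mm: float) -> int:
    """Interpret NWP accumulated precipitation (mm) as a probability of
    precipitation (%), using illustrative thresholds."""
    if nwp_precip_mm < 0.2:
        return 10
    if nwp_precip_mm < 1.0:
        return 40
    if nwp_precip_mm < 5.0:
        return 70
    return 90

print(smart_tool_max_temp(31.4))  # 30.6
print(smart_tool_pop(2.3))        # 70
```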


February to May 2005 Trial

The author conducted a 100-day trial of the performance of the knowledge based system, with twice-daily graphical forecasts being generated out to seven days in advance.

Percentage Variance Explained
Human Forecasts: 42.3%
Automated Forecasts: 43.2%


Graphical Forecasts Generated


Poor Correlation

The human and automated forecasts are poorly correlated: the overall percentage variance of the human forecasts explained by the automated forecasts is only 45.9%.

However, there is a well-established statistical principle that two or more inaccurate but (at least partially) independent predictions of the same future events may be combined to yield predictions that are, on average, more accurate than any of them taken individually.
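A small synthetic demonstration of that principle, using equal-weight averaging of two partially correlated, noisy forecasts; all numbers are invented, and the operational scheme uses a statistical combination rather than a plain average.

```python
# Synthetic demonstration: two imperfect, partially independent forecasts of
# the same quantity, combined by simple averaging. All numbers are invented.
import math
import random

random.seed(1)
n = 10000
truth = [20 + 8 * random.gauss(0, 1) for _ in range(n)]

# Each forecast = truth + its own error; the errors are only partly shared.
shared = [random.gauss(0, 1) for _ in range(n)]
human = [t + 0.7 * s + 1.5 * random.gauss(0, 1) for t, s in zip(truth, shared)]
auto = [t + 0.7 * s + 1.5 * random.gauss(0, 1) for t, s in zip(truth, shared)]
combined = [(h + a) / 2 for h, a in zip(human, auto)]

def rmse(forecast, obs):
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecast, obs)) / len(obs))

print("human    RMSE:", round(rmse(human, truth), 2))
print("auto     RMSE:", round(rmse(auto, truth), 2))
print("combined RMSE:", round(rmse(combined, truth), 2))  # lowest of the three
```

On average the combined series verifies better than either input, because the independent parts of the two error terms partly cancel.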


February to May 2005 Trial

The process of combining the human (official) and automated forecasts was shown to have the potential to yield a set of predictions that is substantially more accurate than the current official forecasts, explaining 7.9% more of the variance.

Percentage Variance Explained
Human Forecasts: 42.3%
Automated Forecasts: 43.2%
Combined Forecasts: 50.2%


The Strategy

The strategy is:

To take judgmental (human) forecasts (derived with the benefit of knowledge of all available computer-generated forecast guidance); and,

To input these forecasts into a system that incorporates a statistical process to mechanically combine the judgmental (human) forecasts and the computer-generated forecast guidance (a sketch of this combination step follows);

Thereby immediately yielding a new set of forecasts.
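A hedged sketch of what such a statistical combination step might look like for a single element (here, maximum temperature), with least-squares weights fitted to an invented verification sample; this is an illustration under stated assumptions, not the system's actual formulation.

```python
# Hedged sketch of a statistical combination step for one element (max temp).
# The tiny training sample and the ordinary least-squares form are assumptions.
import numpy as np

# Past (human forecast, NWP guidance, observed) maximum temperatures, deg C.
history = np.array([
    [24.0, 25.5, 24.8],
    [31.0, 29.0, 30.2],
    [18.5, 19.5, 19.4],
    [27.0, 28.5, 28.9],
    [22.0, 21.0, 21.3],
    [35.0, 33.0, 33.6],
])

X = np.column_stack([np.ones(len(history)), history[:, 0], history[:, 1]])
y = history[:, 2]
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)  # intercept, w_human, w_nwp

def combined_forecast(human_c: float, nwp_c: float) -> float:
    """New forecast = fitted blend of the human forecast and the NWP guidance."""
    return float(coeffs @ np.array([1.0, human_c, nwp_c]))

print(round(combined_forecast(26.0, 27.5), 1))  # blended Day-1 max temp, deg C
```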


Mechanically Integrating Forecasts

The remainder of this talk reports on the evaluation of the knowledge-based system, now modified to mechanically combine human and computer-generated predictions.

The system's output was first evaluated over a new “real-time” trial (commencing August 2005) of 100 days' duration.

The new trial revealed that forecasts generated by mechanically combining the predictions explained 7.7% additional variance of weather (rainfall amount, sensible weather, minimum temperature, and maximum temperature) over that explained by the human (official) forecasts.


Sensible Weather

The system’s sensible weather predictions arise from an algorithm that interprets its generated probability of precipitation (PoP) forecasts and synoptic type in terms of expected sensible weather.

Conversely, the implied human (official) PoP arises from an algorithm that interprets the human (official) sensible weather predictions in terms of PoP.

This approach is similar to what Scott and Proton (2004) refer to as the creation of “anchor grids” from which to generate additional grids within GFE/IFPS.


Sensible Weather

Specifically:

(1) The (official) worded précis is interpreted in terms of PoP and Amount of Precipitation; and,

(2) The generated PoP and Amount of Precipitation are interpreted in terms of a worded précis (a sketch of this two-way mapping follows).
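A minimal sketch of such a two-way interpretation, with invented précis categories and invented PoP/amount thresholds; the operational wording and thresholds are not reproduced here.

```python
# Illustrative two-way mapping between a worded precis and (PoP, rain amount).
# The wording categories and thresholds are assumptions made for this sketch.

def precis_to_pop_and_amount(precis: str):
    """Interpret an official worded precis as implied PoP (%) and amount (mm)."""
    table = {
        "Fine": (10, 0.0),
        "Isolated showers": (40, 1.0),
        "Showers": (70, 4.0),
        "Rain": (90, 10.0),
    }
    return table.get(precis, (50, 2.0))  # fallback for unrecognised wording

def pop_and_amount_to_precis(pop: int, amount_mm: float) -> str:
    """Interpret generated PoP (%) and rainfall amount (mm) as a worded precis."""
    if pop < 30:
        return "Fine"
    if pop < 60 or amount_mm < 2.0:
        return "Isolated showers"
    if amount_mm < 8.0:
        return "Showers"
    return "Rain"

print(precis_to_pop_and_amount("Showers"))  # (70, 4.0)
print(pop_and_amount_to_precis(75, 4.5))    # Showers
```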


Sensible Weather


Minor Modifications

In the light of the results of the 100-day trial, a number of minor modifications were made to the system and the trial was then continued.

After 365 sets of Day-1 to Day-7 forecasts, that is, 2,555 individual predictions, the average lift in percentage variance of weather explained is 7.9% over that explained by the current official forecasts.


Coverage

[Image: Mt St Leonard.]

The system generates forecasts for 56 localities in Central Victoria, and therefore potentially caters for some 4 million people (or 20% of the Australian population).


Verification (after 365 Days)

Weather Element | Verification Parameter | Official Forecasts | Combined Forecasts
All Elements | % Variance Explained | 33.4 | 41.3
Rain or No Rain | % Correct | 70.1 | 76.8
SQRT(Rainfall Amount) | % Variance Explained | 18.4 | 23.5
... | RMS Error (mm^0.5) | 1.05 | 0.97
... | Forecast Variability (mm^0.5) | 0.65 | 0.44
Sensible Weather | % Variance Explained | 23.7 | 34.2
... | Forecast Variability (%) | 18.9 | 11.6
Min Temp | % Variance Explained | 41.5 | 47.7
... | RMS Error (°C) | 2.39 | 2.27
... | Forecast Variability (°C) | 1.36 | 1.17
Max Temp | % Variance Explained | 50.0 | 59.7
... | RMS Error (°C) | 2.82 | 2.49
... | Forecast Variability (°C) | 1.86 | 1.36
Thunder | Critical Success Index (%) | 17.9 | 21.6
... | Probability of Detection (%) | 20.6 | 34.1
... | False Alarm Ratio (%) | 42.9 | 62.9
Fog | Critical Success Index (%) | 15.5 | 17.8
... | Probability of Detection (%) | 19.9 | 27.3
... | False Alarm Ratio (%) | 58.9 | 66.1
Wind Speed | % Variance Explained | 47.5 | 54.3
Wind Direction | % Correct Within Half-Octant | 68.3 | 71.2
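For reference, a short sketch of how the table's main verification measures could be computed from matched forecast/observation pairs; the data below are synthetic, and "percentage variance explained" is taken here as 100 x r^2, which may differ in detail from the Bureau's verification code.

```python
# Sketch of the main verification measures in the table, on synthetic data.
import math

def variance_explained_pct(forecasts, obs):
    """Percentage of observed variance explained, taken here as 100 * r^2."""
    n = len(obs)
    mf, mo = sum(forecasts) / n, sum(obs) / n
    cov = sum((f - mf) * (o - mo) for f, o in zip(forecasts, obs))
    vf = sum((f - mf) ** 2 for f in forecasts)
    vo = sum((o - mo) ** 2 for o in obs)
    return 100.0 * cov * cov / (vf * vo)

def rms_error(forecasts, obs):
    return math.sqrt(sum((f - o) ** 2 for f, o in zip(forecasts, obs)) / len(obs))

def csi_pod_far(hits, misses, false_alarms):
    """Critical Success Index, Probability of Detection, False Alarm Ratio (%)."""
    csi = 100.0 * hits / (hits + misses + false_alarms)
    pod = 100.0 * hits / (hits + misses)
    far = 100.0 * false_alarms / (hits + false_alarms)
    return round(csi, 1), round(pod, 1), round(far, 1)

fc = [24.1, 30.0, 19.2, 28.0, 21.5]   # invented forecasts, deg C
ob = [24.8, 30.2, 19.4, 28.9, 21.3]   # invented observations, deg C
print(round(variance_explained_pct(fc, ob), 1), round(rms_error(fc, ob), 2))
print(csi_pod_far(hits=14, misses=27, false_alarms=8))
```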


Verification

The Day-1 maximum temperature component of the forecasts for the ten localities for which official forecasts are issued was verified.

Expressed as an expected departure from Melbourne’s maximum temperature:

(a) The mean absolute error of the system’s forecasts was 0.971ºC;

(b) The mean absolute error of the official forecasts was 1.099ºC.
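One possible reading of that departure-based verification, sketched with invented numbers; the exact convention (for example, departure from Melbourne's forecast versus observed maximum) is an assumption of this sketch.

```python
# One possible reading of the departure-based verification: score each site's
# forecast departure from Melbourne against its observed departure.
# All values are invented.

def mae_of_departures(site_fc, site_ob, melb_fc, melb_ob):
    """Mean absolute error of forecast departures from Melbourne's maximum."""
    errors = [abs((f - melb_fc) - (o - melb_ob))
              for f, o in zip(site_fc, site_ob)]
    return sum(errors) / len(errors)

# One day's forecasts and observations for a few of the ten localities (deg C).
site_fc = [27.0, 24.5, 29.0]
site_ob = [27.8, 23.9, 30.1]
print(round(mae_of_departures(site_fc, site_ob, melb_fc=26.0, melb_ob=26.5), 2))
```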


Percentage Overall Variance Explained


Why the success?

What these data suggest is that adopting a strategy of combining predictions has the potential to deliver a set of forecasts explaining as much as 7.9% more variance than that explained by the forecasts currently issued officially.

In most circumstances, the combining strategy leaves the system's forecasts almost identical to the official forecasts; whilst,

In those few circumstances where the combining strategy substantially changes the official forecasts, the system's forecasts usually represent an improvement on them.


Our “raison d’être”


A Potential Leap in Accuracy {Max Temp Forecasts}

[Chart: long-term trend in the accuracy of maximum temperature forecasts, comparing combined forecasts with official forecasts.]


Competitive Advantage

The American Marketing Association (2006) notes that:

“A ‘competitive advantage’ exists when there is a match between the distinctive competences of a firm and the factors critical for success within the industry that permits the firm to outperform its competitors.

Advantages can be gained by having the lowest delivered costs and/or differentiation in terms of providing superior or unique performance on attributes that are important to customers.”


Conclusion

Adopting a strategy of combining predictions has the potential to deliver a set of forecasts that explain as much as 7.9% more variance than that explained by forecasts currently issued officially.


PostScript

Extension to Day-10 (from August 2007):

Day-8 forecasts explain 9.9% of the variance;
Day-9 forecasts explain 5.5% of the variance; and
Day-10 forecasts explain 2.1% of the variance
- most of the skill is in temperature forecasting.


Acknowledgements

To Stuart Coombs, for drawing attention to the skill displayed by the NOAA Global Forecast System (GFS) (used in the trial),

To Marco Berghege for his suggestion regarding the programming utilised herein,

To Neville Nicholls and Frank Woodcock, for their comments on combining forecasts,

To Noel Davidson, for his comments on the NWP output,

To Mark Williams, for discussions on preserving forecast consistency,

To Tom Haydon, for assistance with data extraction,

To Blair Trewin, Graham Mills, and Terry Hart, and to my other Bureau of Meteorology colleagues, for their helpful discussions and feedback, and

To VRO TO(obs) staff for their feedback on operational implementation of the system.


The End

Thank You