TRANSCRIPT
Training Course 2009 – NWP-PR: The Seasonal Forecast System at ECMWF 1
The Seasonal Forecast System
at ECMWF
Tim Stockdale, European Centre for Medium-Range Weather Forecasts
Contents
• Seasonal forecasting with coupled GCMs
  - Why use GCMs?
  - How we make a forecast
  - Basic calibration - how, why, and what are the problems?
  - Model error
• Operational forecasts: ECMWF System 3
  - System design
  - How good are the El Nino forecasts?
  - How good are the atmospheric forecasts?
• Multi-model systems and forecast interpretation
  - EUROSIP multi-model system
  - Meaning of probabilistic forecasts
Sources of seasonal predictability
KNOWN TO BE IMPORTANT:
- El Nino variability - biggest single signal
- Other tropical ocean SST - important, but multifarious
- Climate change - especially important in mid-latitudes
- Local land surface conditions - e.g. soil moisture in spring

OTHER FACTORS:
- Volcanic eruptions - definitely important for large events
- Mid-latitude ocean temperatures - still somewhat controversial
- Remote soil moisture / snow cover - not well established
- Sea ice anomalies - local effects, but remote??
- Dynamic memory of atmosphere - most likely on 1-2 months
- Stratospheric influences - various possibilities

Unknown or unexpected - ???
Methods of seasonal forecasting
• Empirical forecasting
  - Use past observational record and statistical methods
  - Works with reality instead of error-prone numerical models
  - Limited number of past cases means it works best when observed variability is dominated by a single source of predictability
  - A non-stationary climate is problematic
• Two-tier forecast systems
  - First predict SST anomalies (ENSO or global; dynamical or statistical)
  - Use ensemble of atmosphere GCMs to predict global response
  - Many people still use regression of a predicted El Nino index on a local variable of interest
• Single-tier GCM forecasts
  - Include comprehensive range of sources of predictability
  - Predict joint evolution of SST and atmospheric flow
  - Include indeterminacy of future SST, important for probabilistic forecasts
  - Model errors are an issue!
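The regression step in a two-tier system can be sketched in a few lines. This is a toy illustration with synthetic data; the record length, the 0.6 "response" coefficient and the +1.2 deg C forecast are invented, not from the lecture:

```python
import numpy as np

# Toy two-tier step: regress a local variable on a predicted NINO3.4 index.
# All numbers are synthetic, for illustration only.
rng = np.random.default_rng(0)
nino34 = rng.normal(0.0, 1.0, 30)                 # 30 past seasons of index
local = 0.6 * nino34 + rng.normal(0.0, 0.5, 30)   # local response + noise

# Fit local = a * nino34 + b on the historical record.
a, b = np.polyfit(nino34, local, deg=1)

# Apply the regression to a new, dynamically predicted index value.
predicted_nino = 1.2                              # hypothetical +1.2 deg C
local_forecast = a * predicted_nino + b
print(f"regressed local anomaly: {local_forecast:.2f}")
```

The shortcomings listed above apply directly: with only 30 "seasons" the fitted slope is noisy, and a non-stationary climate would invalidate the fit.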
Step 1: Build a coupled model
• IFS (atmosphere)
  - TL159L62 Cy31r1, 1.125 deg grid for physics (operational in Sep 2006)
  - Full set of singular vectors from EPS system to perturb atmosphere initial conditions (more sophisticated than needed …)
  - Ocean currents coupled to atmosphere boundary-layer calculations
• HOPE (ocean)
  - Global ocean model, 1x1 deg mid-latitude resolution, 0.3 deg near equator
  - A lot of work in developing the OI ocean analyses, including analysis of salinity, multivariate bias corrections and use of altimetry
• Coupling
  - Fully coupled, no flux adjustments, except no physical model of sea ice
Step 2: Make some forecasts
• Initialize coupled system (cf. Magdalena's lecture)
  - Aim is to start the system close to reality. Accurate SST is particularly important, plus the ocean sub-surface. Don't worry too much about "imbalances".
• Run an ensemble forecast
  - Explicitly generate an ensemble on the 1st of each month, with perturbations to represent the uncertainty in the initial conditions; run forecasts for 7 months.
• Worry about model error later …
Creating the ensemble
• Wind perturbations
  - Perfect wind would give a good ocean analysis, but uncertainties are significant. We represent these by adding perturbations to the wind used in the ocean analysis system.
  - BUT we only have a 5-member ensemble, and no representation of other sources of uncertainty in the ocean analysis (obs error, …)
• SST perturbations
  - SST uncertainty is not negligible
  - SST perturbations are added to each ensemble member at the start of the forecast
  - BUT the perturbations are based on analyses that use the same input data
• Atmospheric unpredictability
  - Atmospheric 'noise' soon becomes the dominant source of spread in an ensemble forecast. This sets a fundamental limit to forecast quality.
  - To ensure that noise grows rapidly enough in the first few days, we activate 'stochastic physics' and use EPS singular vectors.
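The SST-perturbation step can be sketched as follows. This is a minimal illustration with invented field sizes, patterns and amplitudes; the slide does not specify how the operational perturbation patterns are constructed, so the "pattern library" here is only a plausible stand-in for samples of SST analysis uncertainty:

```python
import numpy as np

# Sketch of perturbing initial SSTs for an ensemble. Field sizes, patterns
# and amplitudes are invented for illustration.
rng = np.random.default_rng(1)
nlat, nlon, nmembers = 10, 20, 5

sst_analysis = 300.0 + rng.normal(0.0, 0.5, (nlat, nlon))  # toy field (K)

# A small library of SST difference patterns standing in for samples of
# SST analysis uncertainty.
patterns = rng.normal(0.0, 0.3, (8, nlat, nlon))

members = []
for _ in range(nmembers):
    k = rng.integers(0, patterns.shape[0])   # random pattern per member
    sign = rng.choice([-1.0, 1.0])           # random sign per member
    members.append(sst_analysis + sign * patterns[k])
members = np.stack(members)

print(members.shape)   # (5, 10, 20): one perturbed SST field per member
```

Because the perturbations are symmetric about zero, the ensemble-mean initial SST stays close to the unperturbed analysis.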
RMSE and spread in different systems

RMS error of forecasts has been systematically reduced (solid lines) … but ensemble spread (dashed lines) is still substantially less than actual forecast error. Substantial amounts of forecast error are not from the initial conditions.

[Figures: NINO3.4 SST anomaly correlation and RMS errors (deg C) vs forecast time (months 0-6), wrt NCEP adjusted OIv2 1971-2000 climatology; Fcast S1, S2, S3, Persistence and Ensemble sd; ensemble sizes 5; 192 start dates from 19870101 to 20021201.]
Step 3: Remove systematic errors
• Model drift is typically comparable to the signal
  - Both SST and atmosphere fields
• Forecasts are made relative to past model integrations
  - Model climate is estimated from 25 years of forecasts (1981-2005), all of which use an 11-member ensemble. Thus the climate has 275 members.
  - Model climate has both a mean and a distribution, allowing us to estimate e.g. tercile boundaries.
  - Model climate is a function of start date and forecast lead time.
  - EXCEPTION: Nino SST indices are bias-corrected to absolute values, and anomalies are displayed wrt a 1971-2000 climate.
• Implicit assumption of linearity
  - We implicitly assume that a shift in the model forecast relative to the model climate corresponds to the expected shift in a true forecast relative to the true climate, despite differences between model and true climate.
  - Most of the time the assumption seems to work pretty well. But not always.
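The calibration just described can be sketched numerically. This is a minimal illustration with synthetic numbers (the drift profile and variances are invented); it shows how expressing a forecast relative to the model's own lead-dependent climate removes the drift, and how tercile boundaries come from the pooled hindcast distribution:

```python
import numpy as np

# Sketch of calibration against the model's own climate. All numbers are
# synthetic; the real System 3 climate uses 25 years x 11 members = 275
# integrations per start date.
rng = np.random.default_rng(2)
nyears, nmembers, nleads = 25, 11, 7

# Hindcasts for one start date, shape (year, member, lead), with a
# lead-dependent drift so the model climate differs from the real one.
drift = np.linspace(0.0, -1.5, nleads)      # model cools with lead time
hindcast = 27.0 + drift + rng.normal(0.0, 0.8, (nyears, nmembers, nleads))

# Model climate per lead: pool all years and members (275 values per lead).
clim = hindcast.reshape(-1, nleads)
clim_mean = clim.mean(axis=0)
lower, upper = np.percentile(clim, [100 / 3, 200 / 3], axis=0)  # terciles

# A new forecast that is genuinely 0.9 deg C warm: expressing it as an
# anomaly wrt the model climate removes the drift at every lead.
forecast = 27.0 + drift + 0.9
anomaly = forecast - clim_mean
print(np.round(anomaly, 1))
```

The linearity assumption on the slide is exactly the step `forecast - clim_mean`: the model's shift relative to its own climate is taken as the predicted shift relative to the true climate.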
Biases (e.g. JJA 2m T, as shown here) are often comparable in magnitude to the anomalies which we seek to predict.

[Figures: JJA 2m temperature biases in four systems - ECMWF, Met Office, Météo-France and CFS. Plots courtesy Paco Doblas-Reyes.]
SST bias is a function of lead time and season.
More recent systems have less bias, but it is still large enough to require correcting for.
[Figures: NINO3.4 mean absolute SST and mean SST drift (deg C) vs calendar month, for Fcast S1, S2 and S3.]
Despite SST bias and other errors, anomalies in the coupled system can be remarkably similar to those obtained using observed (unbiased) SSTs …..
… and can also verify well against observations
Model errors are still serious …
• Models have errors other than mean bias
  - E.g. weak wind and SST variability in System 2
  - Strong underestimation of MJO activity
  - Suspected too-weak teleconnections to mid-latitudes
• Mean-state errors interact with model variability
  - Nino 4 region is very sensitive (cold tongue / warm pool boundary)
  - Atlantic variability suppressed if mean state is badly wrong
• Our forecast errors are larger than they should be
  - With respect to internal variability estimates and (occasionally) other prediction systems
  - Reliability of probabilistic forecasts is not particularly high
ENSO forecast quality – Feb. starts only
[Figure: NINO3.4 SST absolute error scores, February starts, 1962-2008; ensemble size 11; SST obs HadISST1/OIv2; forecasts are means over 3 months, plotted at the centre of the verification period; BAE for S3: 0.056, MAE for S3: 0.176. The completion of the TAO array is marked.]

ECMWF S3, 1 Feb starts, first 3 months of forecast.

[Figure: NINO3.4 SST anomaly correlation and RMS errors (deg C) vs forecast time; ensemble size 11; 15 start dates from 19930201 to 20070201; Fcast S3, Persistence, Ensemble sd.]
ENSO forecast quality – all start months
[Figure: NINO3.4 SST absolute error scores, all start months, 1961-2005; ensemble size 11; SST obs HadISST1/OIv2; forecasts are means over 3 months, plotted at the centre of the verification period; BAE for S3: 0.101, MAE for S3: 0.246.]
Model error partially masks the benefit of improved observations
System 3 vs Cycle 33r1
The most recent ECMWF cycles (such as 33r1, shown here) develop a too-strong cold tongue in the West Pacific. This results in a large drop in skill in SST forecasts in Nino 4. This relationship between mean-state error and forecast performance in the Nino-4 region has been seen time and again in different model versions.
The model version used for System 3 has a pretty good balance, but even it is not perfect.
[Figures: NINO4 mean absolute SST and mean SST drift (deg C) vs calendar month; NINO4 SST anomaly amplitude ratio and mean square skill scores vs forecast time; Fcast f05p, Fcast S3 and Persistence; 40 start dates from 19870501 to 20061101.]
Statistical post-processing (not used operationally)

[Figures: NINO4 SST anomaly correlation and RMS errors (deg C); ensemble size 11; 300 start dates from 19810101 to 20051201; Fcast S3, statistically corrected S3, Persistence, Ensemble sd.]

Red: statistically corrected System 3 forecast.

Improvement is mainly seen in Nino 4, where there are clear indications of anomaly / mean-state interactions, and where state-dependent errors are visible.

(Statistical model is a cross-validated linear regression, based on the initial-condition SST anomaly and the model-predicted SST anomaly.)
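The cross-validated regression can be sketched as follows. Predictors, coefficients and data are synthetic stand-ins for the initial-condition and model-predicted SST anomalies described on the slide; the leave-one-out structure is the essential point:

```python
import numpy as np

# Sketch of leave-one-out cross-validated correction of SST forecasts.
# All data are synthetic stand-ins for the initial-condition anomaly and
# the model-predicted anomaly used as predictors.
rng = np.random.default_rng(3)
nyears = 25
init_anom = rng.normal(0.0, 1.0, nyears)                 # initial SST anomaly
truth = 0.8 * init_anom + rng.normal(0.0, 0.4, nyears)   # verifying anomaly
model = 0.5 * truth + rng.normal(0.0, 0.5, nyears)       # damped, noisy model

corrected = np.empty(nyears)
for i in range(nyears):
    train = np.delete(np.arange(nyears), i)              # hold out year i
    X = np.column_stack([init_anom[train], model[train], np.ones(train.size)])
    coef, *_ = np.linalg.lstsq(X, truth[train], rcond=None)
    corrected[i] = coef @ np.array([init_anom[i], model[i], 1.0])

rmse_raw = np.sqrt(np.mean((model - truth) ** 2))
rmse_cor = np.sqrt(np.mean((corrected - truth) ** 2))
print(f"raw rmse {rmse_raw:.2f}, cross-validated corrected rmse {rmse_cor:.2f}")
```

Cross-validation matters here: with only ~25 hindcast years, a regression evaluated on its own training data would overstate the benefit of the correction.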
[Figures: NATL SST forecast anomalies (deg C), 1982-2004, Obs. anom. vs Fcast S3; ensemble size 11; SST obs HadISST1/OIv2; means for months 1-3 and months 4-6, plotted at the centre of the verification period.]
Capturing trends is important. Some are handled well ….
[Figure: GLOBAL T50 forecast anomalies (K), 1981-2004, Obs. anom. vs Fcast S3; ensemble size 11; T50 obs ERA40/ops; mean for months 5-7, plotted at the centre of the verification period.]
… other trends are poorly handled
Volcanic influence on Eurasian winters?
[Figures: ECMWF System 3 hindcast mean 2m temperature anomalies for DJF 1982/83 and DJF 1991/92 (ensemble size 11, climate size 275; shaded areas significant at the 10% level, solid contour at the 1% level; produced from hindcast data), alongside the analysed 2m temperature anomalies for the same seasons.]
Operational seasonal forecasts
• Real-time forecasts since 1997
  - "System 1" initially made public as "experimental" in Dec 1997
  - System 2 started running in August 2001, released in early 2002
  - System 3 started running in Sept 2006, operational in March 2007
• Burst-mode ensemble forecast
  - Initial conditions are valid for 0Z on the 1st of a month
  - Forecast is typically created on the 11th/12th (SST data is delayed by up to 11 days)
  - Forecast and product release date is 12Z on the 15th
• Range of operational products
  - Moderately extensive set of graphical products on the web
  - Raw data in MARS
  - Formal dissemination of real-time forecast data
System 3 configuration
• Real-time forecasts:
  - 41-member ensemble forecast to 7 months; SST and atmospheric perturbations added to each member
  - 11-member ensemble forecast to 13 months: designed to give an 'outlook' for ENSO; run only once per quarter (Feb, May, Aug and Nov starts); November starts actually run 14 months (to year end)
• Back integrations from 1981-2005 (25 years):
  - 11-member ensemble every month
  - 5 members to 13 months once per quarter
How many back integrations?
• Back integrations dominate the total cost of the system
  - System 3: 3300 back integrations (must be done in the first year)
  - 492 real-time integrations (per year)
• Back integrations define the model climate
  - Need both the climate mean and the pdf; the latter needs a large sample
  - May prefer to use a "recent" period (30 years? Or less??)
  - System 2 had a 75-member "climate"; System 3 has 275. Sampling is basically OK
• Back integrations provide information on skill
  - A forecast cannot be used unless we know (or assume) its level of skill
  - Observations have only 1 member, so large ensembles are much less helpful than large numbers of cases
  - Care is needed, e.g. to estimate the skill of a 41-member ensemble based on past performance of an 11-member ensemble
  - For regions of high signal/noise, System 3 gives adequate skill estimates
  - For regions of low signal/noise (e.g. <= 0.5), hundreds of years are needed
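The sampling problem can be illustrated directly: bootstrapping a correlation-based skill score over ~25 hindcast years gives a wide confidence interval in a moderate signal/noise regime. A sketch with synthetic data (the variances are invented so that the true correlation is about 0.5):

```python
import numpy as np

# Bootstrap sketch of skill-estimate uncertainty from ~25 hindcast years,
# in a moderate signal/noise regime (true correlation about 0.5). Synthetic.
rng = np.random.default_rng(4)
nyears = 25
signal = rng.normal(0.0, 1.0, nyears)
fcst = signal + rng.normal(0.0, 1.0, nyears)   # forecast = signal + noise
obs = signal + rng.normal(0.0, 1.0, nyears)    # obs = signal + noise

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

boot = []
for _ in range(2000):
    idx = rng.integers(0, nyears, nyears)      # resample years, with replacement
    boot.append(corr(fcst[idx], obs[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"r = {corr(fcst, obs):.2f}, 95% bootstrap interval [{lo:.2f}, {hi:.2f}]")
```

With a sample this small the interval spans several tenths of correlation, which is why low signal/noise regions need far longer records.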
Example forecast products
• A few examples only - see web pages for full details and assessment of skill
• Note: significance values on plots
  - A lot of variability in seasonal mean values is due to chaos
  - Ensembles are large enough to test whether any apparent signals are real shifts in the model pdf
  - We use the w-test: non-parametric, based on the rank distribution
  - NOT related to past levels of skill
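A rank-based shift test of this kind can be sketched as follows. The slide does not give the exact operational "w-test", so this is a generic rank-sum test with a normal approximation, comparing a 41-member forecast ensemble against a 275-member model climate:

```python
import numpy as np
from math import erf, sqrt

# Generic rank-sum test (normal approximation, no tie handling): is the
# forecast ensemble shifted relative to the model climate distribution?
def ranksum_pvalue(ensemble, climate):
    pooled = np.concatenate([ensemble, climate])
    ranks = pooled.argsort().argsort() + 1.0     # ranks 1..N
    n1, n2 = len(ensemble), len(climate)
    w = ranks[:n1].sum()                         # ensemble rank sum
    mean = n1 * (n1 + n2 + 1) / 2.0
    sd = sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (w - mean) / sd
    return 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))  # two-sided

rng = np.random.default_rng(5)
climate = rng.normal(0.0, 1.0, 275)    # 275-member model climate
shifted = rng.normal(1.0, 1.0, 41)     # ensemble with a genuine shift
unshifted = rng.normal(0.0, 1.0, 41)   # ensemble sampling the climate

print(f"shifted ensemble:   p = {ranksum_pvalue(shifted, climate):.4f}")
print(f"unshifted ensemble: p = {ranksum_pvalue(unshifted, climate):.4f}")
```

Being rank-based, the test depends only on the ordering of values, so it needs no assumption that the model pdf is Gaussian.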
[Figure: NINO3.4 SST anomaly plume, ECMWF System 3 forecast from 1 Mar 2009 (issue date 15 Mar 2009); monthly mean anomalies relative to NCEP adjusted OIv2 1971-2000 climatology; Sep 2008 to Nov 2009.]
[Figure: NINO3.4 SST anomaly plume (longer-range outlook), ECMWF System 3 forecast from 1 Feb 2009 (issue date 15 Feb 2009); monthly mean anomalies relative to NCEP adjusted OIv2 1971-2000 climatology; Sep 2008 to Apr 2010.]
[Figure: Mean 2m temperature anomaly, JJA 2009; ECMWF System 3 seasonal forecast from 01/03/09, issued 15/03/2009; ensemble size 41, climate size 275; shaded areas significant at the 10% level, solid contour at the 1% level.]
[Figure: Prob(most likely category of 2m temperature), JJA 2009; ECMWF System 3 seasonal forecast from 01/03/09, issued 15/03/2009; ensemble size 41, climate size 275; no significance test applied; categories are below the lower / above the upper tercile.]
Other operational plots for JJA 2009
[Figures: ECMWF System 3 forecasts from 01/03/09 for JJA 2009, issued 15/03/2009 (ensemble size 41, climate size 275): Prob(most likely category of precipitation) over Europe (no significance test applied); mean Z500 anomaly (solid contour at 1% significance level); Prob(MSLP > median) (solid contour at 1% significance level).]
Many other fields are available from the forecast system …
( … although these are not routinely plotted)
SST forecast performance

[Figure: NINO3.4 SST anomaly correlation and RMS errors (deg C) vs forecast time; ensemble size 11; 300 start dates from 19810101 to 20051201; Fcast S3, Persistence, Ensemble sd.]

Actual RMS errors are larger than the model's own estimate of "perfect model" errors.
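The comparison of actual RMS error against the ensemble's own spread can be checked against the standard consistency relation: for a statistically consistent m-member ensemble, the RMS error of the ensemble mean should match the average ensemble spread inflated by sqrt((m+1)/m). A synthetic sketch (not ECMWF data; variances invented):

```python
import numpy as np

# Spread-error consistency check for a synthetic, perfectly reliable
# ensemble: truth and members are drawn from the same distribution.
rng = np.random.default_rng(6)
ncases, m = 500, 11
sigma = 0.6                                 # unpredictable-noise std

signal = rng.normal(0.0, 1.0, ncases)       # predictable part
truth = signal + rng.normal(0.0, sigma, ncases)
members = signal[:, None] + rng.normal(0.0, sigma, (ncases, m))

rmse = np.sqrt(np.mean((members.mean(axis=1) - truth) ** 2))
spread = np.sqrt(np.mean(members.var(axis=1, ddof=1)))

# For a consistent ensemble: rmse ~= spread * sqrt((m + 1) / m).
print(f"rmse = {rmse:.3f}, inflated spread = {spread * np.sqrt((m + 1) / m):.3f}")
```

On the slide the real system fails this check: the actual RMS error sits well above the ensemble standard deviation, implying error sources (notably model error) not represented in the ensemble.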
More recent SST forecasts are better ....
[Figures: NINO3.4 SST anomaly correlation and RMS errors (deg C) for 1981-1993 (156 start dates) and 1994-2007 (168 start dates); ensemble size 11; Fcast S3, Persistence, Ensemble sd.]
[Figures: NINO3.4 mean absolute SST and mean SST drift (deg C) vs calendar month (Fcast S3); NINO3.4 SST anomaly correlation and RMS errors at 5 months, by verification month; ensemble size 11; 300 start dates from 19810101 to 20051201; Fcast S3, Persistence, Ensemble sd.]
[Figure: NINO3.4 SST anomaly correlation and RMS errors (deg C) out to 13 months; ensemble size 5; 100 start dates from 19810201 to 20051101; Fc S3 /m3, Persistence, Ensemble sd.]

At longer leads, model spread starts to catch up.
[Figures: NINO1+2 SST anomaly correlation and RMS errors (deg C), vs forecast time and at 5 months by verification month (Fcast S3, Persistence, Ensemble sd; ensemble size 11; 300 start dates from 19810101 to 20051201); NINO1+2 mean absolute SST and mean SST drift vs calendar month (Fcast S3).]
[Figures: ATL3 SST anomaly correlation and RMS errors (deg C), vs forecast time and at 5 months by verification month (Fcast S3, Persistence, Ensemble sd; ensemble size 11; 300 start dates from 19810101 to 20051201); ATL3 mean absolute SST and mean SST drift vs calendar month (Fcast S3).]
[Figures: IND2 SST anomaly correlation and RMS errors (deg C), vs forecast time and at 5 months by verification month (Fcast S3, Persistence, Ensemble sd; ensemble size 11; 300 start dates from 19810101 to 20051201); IND2 mean absolute SST and mean SST drift vs calendar month (Fcast S3).]
How good are the forecasts?
Deterministic skill: DJF ACC

[Figures: Anomaly Correlation Coefficient for near-surface temperature (hindcast period 1981-2003, November starts, averaging period months 2-4, 11 ensemble members): actual forecasts vs perfect model.]
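The ACC plotted in these maps is, at each grid point, the correlation over hindcast years between forecast and observed anomalies (each taken relative to its own climatology). A minimal sketch for a single grid point, with synthetic data:

```python
import numpy as np

# ACC at one grid point: correlation over hindcast years between forecast
# and observed anomalies, each relative to its own climatology. Synthetic.
def acc(fcst, obs):
    fa = fcst - fcst.mean()                  # forecast anomalies
    oa = obs - obs.mean()                    # observed anomalies
    return (fa * oa).sum() / np.sqrt((fa ** 2).sum() * (oa ** 2).sum())

rng = np.random.default_rng(7)
years = 23                                   # e.g. 1981-2003
signal = rng.normal(0.0, 1.0, years)
fcst = 2.0 + signal + rng.normal(0.0, 1.0, years)   # biased but correlated
obs = signal + rng.normal(0.0, 1.0, years)

print(f"ACC = {acc(fcst, obs):.2f}")
```

Because each series is centred on its own mean, a constant bias (the +2.0 here) does not hurt the ACC; this is why calibrated anomaly forecasts can score well despite large mean biases.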
How good are the forecasts?
Deterministic skill: DJF ACC

[Figures: Anomaly Correlation Coefficient for precipitation (hindcast period 1981-2003, November starts, averaging period months 2-4, 11 ensemble members): actual forecasts vs perfect model.]
How good are the forecasts?
Probabilistic skill: Reliability diagrams

[Figures: Reliability diagrams (hindcast period 1981-2003, 11 ensemble members, tercile thresholds estimated with a kernel method; skill scores with 95% confidence intervals from 1000 samples).
Tropical precipitation below the lower tercile, JJA (May starts): ROC skill score 0.306; sharpness 0.104; resolution skill score 0.073; reliability skill score 0.916; Brier skill score -0.012.
Northern-extratropics near-surface temperature above the upper tercile, DJF (November starts): ROC skill score 0.271; sharpness 0.077; resolution skill score 0.061; reliability skill score 0.964; Brier skill score 0.024.]
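The quantities behind these diagrams can be computed with a short routine. This is a generic sketch of the Brier score and its reliability/resolution decomposition (Murphy's decomposition) on synthetic forecast-observation pairs, not the operational verification code:

```python
import numpy as np

# Brier score and its Murphy decomposition over probability bins:
# BS ~= reliability - resolution + uncertainty (exact for binned forecasts).
def brier_decomposition(p, o, nbins=10):
    p, o = np.asarray(p, float), np.asarray(o, float)
    bs = np.mean((p - o) ** 2)
    obar = o.mean()
    edges = np.linspace(0.0, 1.0, nbins + 1)
    rel = res = 0.0
    for k in range(nbins):
        hi = edges[k + 1] if k < nbins - 1 else 1.0 + 1e-9  # include p = 1
        sel = (p >= edges[k]) & (p < hi)
        if sel.any():
            nk, pk, ok = sel.sum(), p[sel].mean(), o[sel].mean()
            rel += nk * (pk - ok) ** 2       # calibration term
            res += nk * (ok - obar) ** 2     # discrimination term
    n = len(p)
    return bs, rel / n, res / n, obar * (1.0 - obar)

rng = np.random.default_rng(8)
n = 5000
p = rng.uniform(0.0, 1.0, n)                        # forecast probabilities
o = (rng.uniform(0.0, 1.0, n) < p).astype(float)    # reliable by construction
bs, rel, res, unc = brier_decomposition(p, o)
print(f"BS={bs:.3f} REL={rel:.4f} RES={res:.4f} UNC={unc:.3f}")
```

For these synthetically reliable forecasts the reliability term is near zero, matching the near-diagonal curves that a well-calibrated system would show in the diagrams above.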
How good are the forecasts?
Europe: Temp > upper tercile, DJF
Probabilistic skill: Reliability diagrams
[Reliability diagram: near-surface temperature anomalies above the upper tercile over Europe (land and sea points). Hindcast period 1981-2003, start in November, averaging period months 2 to 4; tercile threshold estimated with a kernel method for the PDF; 11 ensemble members (CodOecmfE0001S003M001). Axes: forecast probability vs. observed frequency, with sharpness/reliability/resolution and Brier score insets.]
Skill scores and 95% confidence intervals (1000 samples):
o ROC skill score: 0.077 (-0.076, 0.232)
o Sharpness: 0.073 (0.065, 0.080)
o Resolution skill score: 0.013 (0.003, 0.058)
o Reliability skill score: 0.895 (0.771, 0.953)
o Brier skill score: -0.092 (-0.217, 0.005)
Training Course 2009 – NWP-PR: The Seasonal Forecast System at ECMWF 44
EUROSIP multi-model ensemble
• Three models running at ECMWF:
o ECMWF – as described
o Met Office – HadCM3 model, Met Office ocean analyses
o Météo-France – Météo-France model, Mercator ocean analyses
• Unified system
o Real-time since mid-2005
o All data in ECMWF operational archive
o Common operational schedule (products released at 12Z on the 15th)
o Common products
o Coordinated development strategy (sort of …)
See “EUROSIP User Guide” on web for details, and also the recent ECMWF Newsletter article (Issue No. 118, Winter 2008/09)
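The multi-model charts use an "unweighted mean" combination, in which each model contributes equally regardless of its ensemble size. A minimal sketch of that averaging step for one grid point; the model names and probability values are purely illustrative:

```python
import numpy as np

# Hypothetical per-model tercile probabilities [below, near normal, above]
# for a single grid point, each derived from one model's own ensemble.
probs = {
    "ecmwf":        np.array([0.20, 0.30, 0.50]),
    "met_office":   np.array([0.10, 0.35, 0.55]),
    "meteo_france": np.array([0.30, 0.30, 0.40]),
}

# Unweighted multi-model probability: average the models' probability
# vectors with equal weight, rather than pooling raw members.
multi = np.mean(list(probs.values()), axis=0)

categories = ["below lower tercile", "near normal", "above upper tercile"]
most_likely = categories[int(np.argmax(multi))]
```

Averaging probabilities (rather than pooling members) prevents the model with the largest ensemble from dominating, which matters because the combination is meant to span model error, not just sampling error.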
Training Course 2009 – NWP-PR: The Seasonal Forecast System at ECMWF 45
ECMWF EUROSIP
Summer 2009 temperature forecast
(See Antje’s lecture for more on multi-model ensembles ...)
[Map: EUROSIP multi-model seasonal forecast (ECMWF / Met Office / Météo-France), prob(most likely category of 2m temperature), JJA 2009. Unweighted mean; forecast start reference 01/03/09; forecast issue date 15/03/2009; no significance test applied. Shading: most likely tercile (below lower / above upper) at 40..50%, 50..60%, 60..70%, 70..100%; "other" elsewhere.]
[Map: ECMWF Seasonal Forecast System 3, prob(most likely category of 2m temperature), JJA 2009. Ensemble size 41, climate size 275; forecast start reference 01/03/09; forecast issue date 15/03/2009; no significance test applied. Shading: most likely tercile (below lower / above upper) at 40..50%, 50..60%, 60..70%, 70..100%; "other" elsewhere.]
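Charts like these are built point by point from ensemble category probabilities. A sketch of how tercile probabilities might be derived at one grid point, with thresholds taken from the model's own climate sample so that model bias largely cancels; the data are synthetic, with sizes borrowed from the caption (41 members, 275-member climate):

```python
import numpy as np

rng = np.random.default_rng(1)
climate = rng.normal(0.0, 1.0, 275)   # model climate sample (e.g. from hindcasts)
members = rng.normal(0.8, 1.0, 41)    # real-time ensemble, shifted warm for illustration

# Tercile thresholds from the model's own climate distribution, not from
# observations: comparing like with like removes most of the mean bias.
lower, upper = np.quantile(climate, [1.0 / 3.0, 2.0 / 3.0])

# Category probabilities = fraction of members falling in each category
p_below = np.mean(members < lower)
p_above = np.mean(members > upper)
p_normal = 1.0 - p_below - p_above
```

The operational captions mention a kernel method for the PDF; the empirical quantiles above are a simpler stand-in for that smoothing step.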
Training Course 2009 – NWP-PR: The Seasonal Forecast System at ECMWF 46
Model error and forecast interpretation
• Model error is large
o It dominates El Nino forecast errors
o Mean state and variability errors are very significant
o Errors cannot be easily fixed
• Products typically account for sampling error only
o Don't take model probabilities as true probabilities
• Estimating forecast skill can be difficult
o In many cases, the data are insufficient to produce sensible estimates
o This makes interpretation of forecasts tough
• In the end we need trustworthy models
o (Multi-model ensembles are small, and only partially span the space of model errors)
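The confidence intervals quoted on the reliability diagrams ("1000 samples") illustrate this difficulty: with a short hindcast, skill estimates carry wide sampling uncertainty. A sketch of a percentile bootstrap for a Brier skill score, assuming the intervals come from resampling forecast/observation pairs (the data here are toy):

```python
import numpy as np

def bootstrap_ci(score_fn, p, o, n_boot=1000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for a skill score, resampling cases with replacement."""
    rng = np.random.default_rng(seed)
    n = len(p)
    scores = np.empty(n_boot)
    for b in range(n_boot):
        i = rng.integers(0, n, n)          # resample forecast/obs pairs
        scores[b] = score_fn(p[i], o[i])
    return np.quantile(scores, [alpha / 2.0, 1.0 - alpha / 2.0])

def bss(p, o):
    """Brier skill score relative to the climatological forecast."""
    clim = o.mean()
    return 1.0 - np.mean((p - o) ** 2) / np.mean((clim - o) ** 2)

rng = np.random.default_rng(3)
p = rng.uniform(0, 1, 200)
o = (rng.uniform(0, 1, 200) < p).astype(float)   # reliable toy forecasts
lo, hi = bootstrap_ci(bss, p, o)
```

With only ~23 hindcast years per grid point, the resulting intervals are often wide enough to include zero, which is exactly why negative point-estimate skill scores need careful interpretation.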
Training Course 2009 – NWP-PR: The Seasonal Forecast System at ECMWF 47
Calibrated forecasts
• Frequentist or Bayesian
o Sampling uncertainties give rise to frequentist probabilities
o Model errors mean our forecast pdfs are wrong. We can try to represent our lack of knowledge as "probabilities" in a Bayesian framework.
• Calibration of probabilistic forecasts
o Given the model forecast pdf, the past history of forecast pdfs, and the past history of observed outcomes, what is the best estimate of the true pdf?
o A Bayesian analysis depends on the specification of the prior. Mathematically, there is no single "right" answer.
o We hope one day to implement a hierarchical Bayesian analysis, accounting for the major uncertainties and producing general-purpose calibrated probabilistic forecasts.
o Handling trends and non-stationarities is an additional challenge
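One simple calibration in this spirit regresses observed anomalies on hindcast ensemble means: the fitted slope shrinks overconfident forecasts toward climatology, and the residual scatter widens the forecast pdf. This is a far simpler stand-in for the hierarchical Bayesian analysis discussed above; a sketch on synthetic hindcasts, with all numbers (skill, noise levels) assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical hindcast set: ensemble-mean anomalies and verifying observations
n_years = 23                            # e.g. a 1981-2003 hindcast period
signal = rng.normal(0, 1, n_years)      # predictable component (assumed)
ens_mean = signal + rng.normal(0, 0.5, n_years)   # model sees signal plus noise
obs = 0.6 * signal + rng.normal(0, 0.8, n_years)  # reality: weaker signal, more noise

# Regress obs on ensemble mean: slope < 1 damps the forecast anomaly,
# and the residual spread sets the calibrated pdf width.
slope, intercept = np.polyfit(ens_mean, obs, 1)
resid = obs - (slope * ens_mean + intercept)
spread = resid.std(ddof=2)

def calibrated_pdf(raw_ens_mean):
    """Mean and standard deviation of a Gaussian calibrated forecast pdf."""
    return slope * raw_ens_mean + intercept, spread
```

With only ~23 hindcast years the slope itself is poorly estimated, which is where a prior (e.g. shrinkage toward zero skill) and a fully Bayesian treatment would enter.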
Training Course 2009 – NWP-PR: The Seasonal Forecast System at ECMWF 48
Some final comments
• Plenty of scope for improving model forecasts
o Nino SST forecasts are still much worse than predictability limits
o Model errors are still obvious in many cases
o Initial conditions are probably OK in the Pacific for some time, and recently improved elsewhere by Argo
• Model output -> user forecast
o Calibration and presentation of forecast information
o Potential for multi-model ensembles
• Timescale for improvements
o Optimist: in 10 years we'll have much better models, pretty reliable forecasts, and confidence in our ability to handle climate variations
o Pessimist: in 10 years modelling will still be a hard problem, and progress will largely be down to improved calibration