
Page 1: Visualizing Uncertainty in Mesoscale Meteorology

Visualizing Uncertainty in Mesoscale Meteorology

APL Verification Methodology, 21 May 2002

Scott Sandgathe

Page 2: Visualizing Uncertainty in Mesoscale Meteorology

A New Paradigm for Weather Forecasting (MURI/UW, 2/13/02)

• Human Forecaster Decision Interface
• Decision-Driven Auto-Product Generation
• Automated Meteorological Information Evaluation System

The Decision Interface comprises:

• Verification of global and mesoscale models' past forecasts
• Evaluation of current meteorological analyses
• Rule-based evaluation of model predictions

Page 3: Visualizing Uncertainty in Mesoscale Meteorology

[Diagram (not reproduced): NOGAPS (centroid) and ETA output evaluation drive three MM5 runs; each MM5 output gets its own evaluation and weight (Wt.) before the members are combined into Ensemble Products]

• Global and regional RMSE evaluation – recent, historical and synoptic
• Evaluation of ensemble members and combinations – recent, historical and synoptic
• Evaluate current analyses based on observations, satellite and radar
• Re-form the ensemble or choose the "most-representative" member based on user evaluation (see the weighting sketch below)
• Products are automatically generated based on the user-selected ensemble or member.

Verification and Evaluation
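One plausible reading of the per-member evaluation and weighting (Wt.) step is inverse-error weighting. The sketch below is an illustrative assumption rather than the method described in the presentation: it combines member fields with weights proportional to 1/RMSE and also shows the trivial "most-representative member" selection.

```python
import numpy as np

def weighted_ensemble_mean(members, rmses):
    """Combine ensemble member fields with weights proportional to the
    inverse of each member's recent verification RMSE (illustrative
    weighting rule only).

    members: list of 2-D arrays, one field per ensemble member
    rmses:   list of recent RMSE values, one per member
    """
    weights = 1.0 / np.asarray(rmses, dtype=float)
    weights /= weights.sum()                       # normalize to sum to 1
    stacked = np.stack(members, axis=0)            # (n_members, ny, nx)
    return np.tensordot(weights, stacked, axes=1)  # weighted-mean field

def most_representative_member(members, rmses):
    """Alternatively, pick the single member with the lowest recent RMSE."""
    return members[int(np.argmin(rmses))]
```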

Page 4: Visualizing Uncertainty in Mesoscale Meteorology

Global RMS Error

[Diagram (not reproduced): the 00Z / 06Z / 12Z / 18Z / 00Z ... analysis cycle, with observations (OBS) feeding each analysis and 6-hr and 72–120-hr forecasts verified against the subsequent analyses]
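For a forecast field f verified against the analysis a over N grid points (or stations), the RMS error summarized in this cycle is the standard definition (not restated on the slide):

RMSE = sqrt( (1/N) * Σ_i (f_i − a_i)² )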

Page 5: Visualizing Uncertainty in Mesoscale Meteorology

Regional RMS Error

Global models are often "tuned" to the region of national interest or to the predominant national weather pattern, so global skill may not reflect regional skill.

The region between 110°E and 110°W accounts for the majority of 0–48 h weather.
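A regional score can be computed by restricting the RMSE to the band of interest. Here is a minimal NumPy sketch, assuming fields on a regular latitude–longitude grid with longitudes in [-180, 180); the function and argument names are illustrative, not from the presentation.

```python
import numpy as np

def regional_rmse(forecast, analysis, lons, lon_west=110.0, lon_east=-110.0):
    """RMSE restricted to the longitude band from 110E eastward across the
    dateline to 110W. `lons` is a 1-D array of grid longitudes in degrees,
    in the range [-180, 180); fields are shaped (..., nlat, nlon)."""
    # The band crosses the dateline: keep lon >= 110E or lon <= -110 (110W).
    mask = (lons >= lon_west) | (lons <= lon_east)
    diff = forecast[..., mask] - analysis[..., mask]
    return np.sqrt(np.mean(diff ** 2))
```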

Page 6: Visualizing Uncertainty in Mesoscale Meteorology

Mesoscale Verification

[Schematic (not reproduced): four forecast features (F) displaced in different directions around the observed feature (O), none overlapping it]

All equivalent?! POD=0, FAR=1 (Brown, 2002)
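For reference, POD and FAR here are the standard contingency-table scores (these definitions are standard verification measures, not something defined on the slide):

POD = hits / (hits + misses)
FAR = false alarms / (hits + false alarms)

A forecast feature that nowhere overlaps the observed feature produces no hits at any grid point, so POD = 0 and FAR = 1 no matter how small the displacement, which is exactly the problem an object-based approach is meant to expose.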

Page 7: Visualizing Uncertainty in Mesoscale Meteorology

Mesoscale Verification

[Schematic (after Brown, 2002; not reproduced): two alternative forecasts (F) relative to the observed feature (O), one of them partially overlapping it]

POD >> 0, FAR < 1

Is this a better forecast? Or is this?

Page 8: Visualizing Uncertainty in Mesoscale Meteorology

Mesoscale Verification

Page 9: Visualizing Uncertainty in Mesoscale Meteorology

Mesoscale Verification

• Total Error = Displacement Error + Amplitude Error + Residual Error
– MSE and CCF give equivalent results
– Hoffman et al., 1995, for satellite data assimilation
• MSE(tot) = MSE(disp) + MSE(amp) + MSE(pattern)
– Ebert and McBride, 2000, for precipitation pattern verification
• Implementation (Du and Mullen, 2000); a minimal sketch follows below:
– Calculate MSE(tot) = mean[(Forecast – Analysis)²]
– Shift the forecast field to minimize the total MSE, then MSE(disp) = MSE(tot) – MSE(shift)
– Adjust the amplitude to minimize MSE(shift); MSE(amp) = MSE(shift) – MSE(shift-min)
– MSE(residual) = MSE(tot) – MSE(disp) – MSE(amp)
• MSE(res) =? MSE(pattern) =? MSE(rotation)
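To make the Du and Mullen-style procedure concrete, here is a minimal NumPy sketch. It assumes 2-D forecast and analysis arrays on the same grid, a brute-force search over integer grid shifts with wrap-around at the edges, and a simple multiplicative amplitude adjustment; the function names, search radius, and those simplifications are assumptions for illustration, not taken from the presentation.

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two equally shaped 2-D fields."""
    return np.mean((a - b) ** 2)

def decompose_error(forecast, analysis, max_shift=10):
    """Split MSE(tot) into displacement, amplitude, and residual parts
    (in the spirit of Du and Mullen, 2000)."""
    mse_tot = mse(forecast, analysis)

    # 1. Find the integer shift of the forecast field that minimizes MSE.
    best = (mse_tot, 0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(forecast, (dy, dx), axis=(0, 1))
            err = mse(shifted, analysis)
            if err < best[0]:
                best = (err, dy, dx)
    mse_shift, dy, dx = best
    mse_disp = mse_tot - mse_shift

    # 2. Scale the shifted forecast to minimize MSE (least-squares amplitude).
    shifted = np.roll(forecast, (dy, dx), axis=(0, 1))
    denom = np.sum(shifted ** 2)
    alpha = np.sum(shifted * analysis) / denom if denom > 0 else 1.0
    mse_shift_min = mse(alpha * shifted, analysis)
    mse_amp = mse_shift - mse_shift_min

    # 3. Whatever remains is the residual (pattern/rotation) error.
    mse_res = mse_tot - mse_disp - mse_amp
    return {"total": mse_tot, "displacement": mse_disp,
            "amplitude": mse_amp, "residual": mse_res}
```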

Page 10: Visualizing Uncertainty in Mesoscale Meteorology

Phase Shift

Page 11: Visualizing Uncertainty in Mesoscale Meteorology

Phase and Amplitude Error

Page 12: Visualizing Uncertainty in Mesoscale Meteorology

Rotational Error

Page 13: Visualizing Uncertainty in Mesoscale Meteorology

Total Error

Page 14: Visualizing Uncertainty in Mesoscale Meteorology

Future Research Issues

• Need to test on "real" data.
• Many computational solutions:
– Correlation coefficient, mean absolute difference, etc.
– Rapid "image motion" search techniques (a sketch of one such search follows below)
• Map verification or "feature" verification:
– Phase and amplitude are suitable for both
– Rotation requires a "feature" and a more complex search
• Need to examine usefulness.
• Evaluation of "goodness":
– Relative weight of phase vs. amplitude vs. rotational error
– Will test the "table" approach often seen in software or "service" evaluation.
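As one example of the rapid image-motion search techniques referenced above (block matching in the spirit of Chan, 1993, in the reference list), here is a hedged sketch of a three-step (logarithmic) search for the displacement minimizing the mean absolute difference. The step sizes, wrap-around shifting, and cost function are illustrative assumptions, not taken from the presentation.

```python
import numpy as np

def mad(forecast, analysis, dy, dx):
    """Mean absolute difference after shifting the forecast by (dy, dx)."""
    shifted = np.roll(forecast, (dy, dx), axis=(0, 1))
    return np.mean(np.abs(shifted - analysis))

def three_step_search(forecast, analysis, initial_step=8):
    """Classic three-step block-matching search: evaluate the neighbours of
    the current best displacement at a coarse step, keep the cheapest, then
    halve the step until it reaches 1. Much cheaper than an exhaustive
    search over all displacements."""
    best = (0, 0)
    best_cost = mad(forecast, analysis, 0, 0)
    step = initial_step
    while step >= 1:
        center = best
        for dy in (-step, 0, step):
            for dx in (-step, 0, step):
                cand = (center[0] + dy, center[1] + dx)
                cost = mad(forecast, analysis, *cand)
                if cost < best_cost:
                    best, best_cost = cand, cost
        step //= 2
    return best, best_cost
```

Called as `three_step_search(fcst_field, anal_field)`, it returns an integer grid displacement and its matching cost; in practice such a search would be applied per feature or per verification tile rather than to a whole map at once.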

Page 15: Visualizing Uncertainty in Mesoscale Meteorology

Questions and Comments?

Page 16: Visualizing Uncertainty in Mesoscale Meteorology

References

• Hoffman, R. N., Z. Liu, J.-F. Louis, and C. Grassotti, 1995: Distortion representation of forecast errors. Mon. Wea. Rev., 123, 2758-2770.

• Brown, B., 2002: Development of an Object-based Diagnostic Approach for QPF Verification. USWRP Science Symposium, April 2002.

• Ebert, E. E., and J. L. McBride, 2000: Verification of precipitation in weather systems: determination of systematic errors. J. Hydrol., 239, 179-202.

• Du, J., and S. L. Mullen, 2000: Removal of Distortion Error from an Ensemble Forecast. Mon. Wea. Rev., 128, 3347-3351.

• Chan, E., 1993: Review of Block Matching Based Motion Estimation Algorithms for Video Compression. CCECE/CCGEI.

• Lim, D.-K., and Y.-S. Ho, 1998: A Fast Block Matching Motion Estimation Algorithm based on Statistical Properties of Object Displacement. IEEE.

Page 17: Visualizing Uncertainty in Mesoscale Meteorology

BACKUP SLIDES

• SLIDES FROM 31 Jan 2002 Meeting

Page 18: Visualizing Uncertainty in Mesoscale Meteorology
Page 19: Visualizing Uncertainty in Mesoscale Meteorology

36 km Ensemble Mean and Selected Members: SLP, 1000–500 mb Thickness, 2002 Jan 22 00Z

Page 20: Visualizing Uncertainty in Mesoscale Meteorology

12 km Ensemble Mean and Selected Members: SLP, Temperature, Wind, 2002 Jan 22 00Z

Page 21: Visualizing Uncertainty in Mesoscale Meteorology

Verification of Mesoscale Features in NWP Models

Baldwin, Lakshmivarahan, and Klein

9th Conf. on Mesoscale Processes, 2001

Page 22: Visualizing Uncertainty in Mesoscale Meteorology

Tracking of global ridge–trough patterns, from Tribbia, Gilmour, and Baumhefner

Page 23: Visualizing Uncertainty in Mesoscale Meteorology

Current global forecast and climate models do produce ridge–trough transitions; however, the frequency of predicted occurrence is much lower than the frequency of actual occurrence.

Page 24: Visualizing Uncertainty in Mesoscale Meteorology

Creating Consensus From Selected Ensemble Members

- Carr and Elsberry

Page 25: Visualizing Uncertainty in Mesoscale Meteorology

Necessary Actions for Improved Dynamical Track Prediction

[2×2 matrix of 48-h ensemble spread vs. track error:]

• Small spread (59 n mi), small error: No forecaster reasoning required – use the non-selective consensus (NCON).
• Small spread (229 n mi), large error: No forecaster reasoning possible; help is needed from modelers and data sources to improve prediction accuracy.
• Large spread (806 n mi), large error: Recognize the erroneous guidance group or outlier, and formulate a selective consensus (SCON) that improves on NCON.
• Large spread (406 n mi), small error: Recognize the situation as having inherently low predictability; error mechanisms must be detected in both outliers to avoid making SCON >> NCON.
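To pin down the NCON/SCON terminology, here is a minimal sketch; this is an illustration using simple latitude/longitude arrays, not Carr and Elsberry's actual method, which works with track guidance aids and great-circle distances.

```python
import numpy as np

def ncon(tracks):
    """Non-selective consensus: mean position of all guidance tracks.
    `tracks` has shape (n_members, n_times, 2) holding (lat, lon)."""
    return np.mean(tracks, axis=0)

def scon(tracks, keep):
    """Selective consensus: mean of only the retained members, e.g. after
    the forecaster excludes an erroneous guidance group or outlier.
    `keep` is a boolean mask or index list over members."""
    return np.mean(tracks[keep], axis=0)

def spread(tracks):
    """A simple spread measure: mean distance of member positions from the
    NCON track (in degrees here; operationally this would be in n mi)."""
    centre = ncon(tracks)
    return np.mean(np.linalg.norm(tracks - centre, axis=-1))
```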

Page 26: Visualizing Uncertainty in Mesoscale Meteorology

References

Cannon, A. J., P.H. Whitfield, and E.R. Lord, 2002: Automated, supervised synoptic map-pattern classification using recursive partitioning trees. AMS Symposium on Observations, Data Assimilation, and Probabilistic Prediction, pJ103-J109.

Carr, L.E. III, R.L. Elsberry, and M.A. Boothe, 1997: Condensed and updated version of the systematic approach meteorological knowledge base – Western North Pacific. NPS-MR-98-002, 169 pp.

Ebert, E.E., 2001: Ability of a poor man’s ensemble to predict the probability and distribution of precipitation. Mon. Wea. Rev., 129, 2461-2480.

Gilmour, I., L.A. Smith, and R. Buizza, 2001: Is 24 hours a long time in synoptic weather forecasting? J. Atmos. Sci., 58, -.

Grumm, R. and R. Hart, 2002: Effective use of regional ensemble data. AMS Symposium on Observations, Data Assimilation, and Probabilistic Prediction, pJ155-J159.

Marzban, C., 1998: Scalar measures of performance in rare-event situations. Wea. and Forecasting, 13, 753-763.

Page 27: Visualizing Uncertainty in Mesoscale Meteorology

Current Forecast Paradigm

[Diagram (not reproduced): multiple data streams flow through display tools (JMV, HTML satellite viewer, TAF) to the human forecaster, who turns a user's product need into a product]

Page 28: Visualizing Uncertainty in Mesoscale Meteorology

[Architecture diagram (not reproduced): the Page 3 ensemble flow – NOGAPS (centroid) and ETA output evaluation driving three MM5 runs, each with its own evaluation and weight (Wt.), combined into Ensemble Products – implemented as J2EE Control/Interface Beans with Java Server Pages for each Bean interface. Control and server-side components include statistical tools, meteorology tools, and IMS & visualization tools (XIS), communicating over server protocols (HTTP, RMI, CORBA, ...)]

Page 29: Visualizing Uncertainty in Mesoscale Meteorology

Forecaster-in-the-Loop Concept

[Diagram (not reproduced): Verification (past) → Evaluation (present) → Product Generation / Forecast (future) along the time axis, with observations and real-time satellite observations feeding in and the user in the loop]

Page 30: Visualizing Uncertainty in Mesoscale Meteorology

A New Paradigm (Bob's)

[Same diagram as Page 29: Verification (past) → Evaluation (present) → Product Generation / Forecast (future), with observations, real-time satellite observations, and the user in the loop]

Page 31: Visualizing Uncertainty in Mesoscale Meteorology

The Ensemble Paradigm of Forecasting (from Cliff Mass' MURI presentation, 1/31/02)

Global Input → Ensemble Generation → Construct Model Fcst/Weighting → Evaluation → Model Guidance → Product Generation

Page 32: Visualizing Uncertainty in Mesoscale Meteorology
Page 33: Visualizing Uncertainty in Mesoscale Meteorology

[System diagram (not reproduced): components include Global Model Inputs, an Ensemble Generation System, Guidance Product Generation (Visualization), a Verification System with feedback, a Data Share Archive (statistics, ensemble generation, guidance), and the Decision Interface]