Workshop on Quantitative Evaluation of Downscaled Climate Projections (August 12-16, 2013)
The National Climate Predictions and Projections Platform


TRANSCRIPT

Workshop on Quantitative Evaluation of Downscaled Climate Projections (August 12-16, 2013)

The National Climate Predictions and Projections Platform

Motivation: Practitioner’s Dilemma

• Practitioner’s dilemma - how to choose among many available sources of climate information for a given place and application?

Needs

• Objective evaluation of datasets prepared for use in planning for climate change.

• Provision of application-specific guidance to improve usability of climate-change projections in planning.

• Initiation of a community, based on standards, to build and sustain a practice of evaluation and informed use of climate-change projections.

When, Where, Who

• August 12-16, 2013
• Boulder, Colorado
• Participants
  – Use cases from sectoral working groups: agricultural impacts, ecological impacts, water resources impacts, human health impacts
  – Datasets from downscaling working groups
  – NCPP community, agency partners, program sponsors, international observers, interested parties

Week at a Glance

Monday 12 August | Tuesday 13 August | Wednesday 14 August | Thursday 15 August | Friday 16 August

Days 1 and 2 – Evaluation Focus

Day 3 – Transition

Days 4 and 5 – Guidance Focus

Expected Outcomes

– Database for access to high-resolution datasets with standardized metadata of downscaling methods

– Demonstration of flexible, transparent climate index calculation service (Climate Translator v.0)

– First version of a standardized evaluation capability and infrastructure for high-resolution climate datasets, incl. application-oriented evaluations

– Description of a sustainable infrastructure for evaluation services

– Sector and problem-specific case studies within the NCPP environment

– First version of a comparative evaluation environment to develop translational and guidance information

– Identification of the value added, and of remaining gaps and needs, for further development of the evaluation framework and metadata, incl. templates
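One of the outcomes above is a flexible, transparent climate index calculation service (Climate Translator v.0). As a rough illustration of the kind of index computation such a service standardizes – a hypothetical sketch, not the Climate Translator or OCGIS implementation – here are two common climate indices computed over a daily temperature series:

```python
def frost_days(tmin_celsius):
    """Count days with daily minimum temperature below 0 degC
    (the widely used 'FD' index)."""
    return sum(1 for t in tmin_celsius if t < 0.0)

def growing_degree_days(tmean_celsius, base=10.0):
    """Accumulate degree-days above a base temperature
    (base=10 degC is an illustrative choice)."""
    return sum(max(t - base, 0.0) for t in tmean_celsius)

# toy daily series, degC
tmin = [-3.0, -0.5, 1.2, 4.0, -1.1]
tmean = [5.0, 12.0, 15.5, 9.0]
print(frost_days(tmin))            # 3
print(growing_degree_days(tmean))  # 7.5
```

A standardized service would add the pieces the slides emphasize: metadata recording which index definition and which input dataset were used, so results stay reproducible and comparable.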

DATA · EVALUATION · COMMUNITIES OF PRACTICE

Evaluation: Downscaling working groups

• Statistical downscaling datasets: BCSD, BCCA, ARRM, MACA
• Dynamical downscaling datasets: Hostetler data (RegCM2), NARCCAP data
• Baseline: delta method
• Gridded observational data sets

Guidance: Applications

• Application Use Cases

– Identification of network of application specialists

– Define representative questions to focus the evaluations

– Representation of application needs: Scales, Indices, etc.

– Feedback on guidance and translational information needs

– Feedback on design / requirements of software environment for workshop

– Contribution to reports from workshop

Water resources

Ecological Impacts

Agriculture

Health impacts

About 75 participants

• Downscaling working groups – BCCA, BCSD, ARRM, NARCCAP, MACA, etc. teams – approx. 20 people

• Sectoral working groups – agricultural impacts, ecological impacts, water resources impacts, human health impacts – approx. 30 people

• NCPP Community – Executive Board, Climate Science Applications Team, Core & Tech Teams – approx. 18 people

• Program managers, reporters, international guests – about 5 people

Week in More Detail

Monday | Tuesday | Wednesday | Thursday | Friday

Days 1 and 2 – EVALUATION focus
• Intercomparison of downscaling methods
• Fine-tuning the evaluation framework – what worked and what did not work?
• Interpretation of results and development of guidance for user groups
• Identification of gaps and needs for downscaled data for the participating applications

Day 3 – TRANSITION: EVALUATION and GUIDANCE
• Morning – summary of the downscaling methods' attributes and evaluation results, by sector and protocol
• Afternoon – start of the sectoral applications groups' work

Days 4 and 5 – GUIDANCE focus
• Interpretation of results and guidance for user groups
• Presentation of metadata descriptions and their usage
• Presentation of indices provision – OCGIS
• Identification of gaps and needs for downscaled data for application needs
• Identification of future steps

Below are categories of supplemental information

Days in More Detail

Proposed structure: Monday

8:30-9:00: Breakfast and coffee/tea

9:00-9:30: Logistics, welcome and introductions, technicalities (catchy intro: RAL director? Head of NOAA over video?)
– Brief introductions of workshop members
– Technical logistics: internet access, ESG node and CoG environment?
– Overview of the workshop and key objectives

9:30-10:30: Keynote – Practitioner's Dilemma: a call from the desperate world for help

Break

11:00-12:30: Evaluation approach of NCPP
– Framework presentation of evaluation of downscaled projections data, protocols, standards …
– Introduction of version 0: how the evaluations were done, tools, images, metadata/CIM, potential plans forward (DIBBs structure), working groups and workshops, … community of practice

Lunch 12:30-2pm

2:00-3:30 pm: High-resolution data providers: observed and projected gridded information
– What distinguishes your method, and what were you trying to accomplish with it? (getting to the value-added question)
– Presentations from developers of downscaling methods and datasets

Break

4:00-5:00 pm: Key discussion: directions of downscaling

Proposed structure: Tuesday

8:30-9:00: Breakfast and coffee/tea

9:00-10:30: Results from evaluations: data perspective
– Evaluation and characteristics of the baseline data: observed gridded data comparisons to station data, and inter-comparisons – short presentations
– Evaluation of the characteristics of the downscaled projections data: downscaled projections evaluation – presentations and discussion

Break

11:00-12:30: continued

Lunch 12:30-2pm

2:00-3:30 pm: Results from evaluation: user perspective
– Short introduction of application needs
– Case studies: presentation and critique of evaluations

break

4:00-5:00 pm: Key discussion: issues related to the framework
– Next steps in fine-tuning the evaluation framework – what worked and what did not work? What else needs to be added? What needs to be changed? What needs to be done by the developers of downscaled data – what gaps remain in relation to applications?

Proposed structure: Wednesday

8:30-9:00: Breakfast and coffee/tea

9:00-10:00: Keynote – Downscaling for the World of Water (Maurer?)

10:00-10:30: Summary of the first two days and future evaluation potential using Protocols 2 and 3
– Summary of the first two days
– Perfect-model experiments and evaluations: presentation and discussions
– Process-based metrics and evaluations: presentation and discussions

break

11-12:30pm: User Communities

Lunch 12:30-2pm

break

4-5pm: Key discussion:

Days 4 and 5

More Detail on Participants and Partnerships

Partnership through downscaling working group

• GFDL – Perfect model experiments – Keith Dixon, V. Balaji, A. Radhakrishnan

• Texas Tech University, SC CSC – Katharine Hayhoe – ARRM

• DOI USGS, Bureau of Reclamation, Santa Clara University, Scripps Institute, Climate Central, NOAA/NWS – E. Maurer, H. Hidalgo, D. Cayan, A. Wood – BCSD, BCCA

• University of Idaho– J. Abatzoglou - MACA

• DOI USGS, Oregon State University– S. Hostetler – RegCM2 - dynamically downscaled data

• NCAR – Linda Mearns – NARCCAP - dynamically downscaled data

Partnerships through sectoral working groups

• Health impacts
  – NOAA/NWS – David Green
  – NYC Dept of Health – Dr. Shao Lin
  – NCAR – Olga Wilhelmi
  – Columbia University – Patrick Kinney
  – University of Florida – Chris Uejio

• Agricultural impacts
  – AGMIP – USDA
  – NIDIS – SE RISA

• Ecological impacts
  – DOI USGS NC CSC

• Water resources impacts
  – Bureau of Reclamation
  – California ……

Partnership through infrastructure, metadata and standards development

• ES-DOC – IS-ENES, METAFOR project (CIM and CVs)

• NESII – CoG, OCGIS

• EU CHARMe project (metadata archive and search)
• EU CORDEX (dynamical downscaling CV), NA CORDEX (archive and metadata standardization)
• ESGF (data and images archiving)
• DOI-USGS (data access)
• GLISA (translational information archiving)

More Details on Protocols and Metrics

Downscaling working groups

• Statistical downscaling datasets: BCSD, BCCA, ARRM, MACA
• Dynamical downscaling datasets: Hostetler data (RegCM2), NARCCAP data
• Baseline: delta method
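The delta method used as a baseline above can be sketched as a simple change-factor calculation. This is a hedged illustration with toy numbers, not any working group's production code; the additive form shown is typical for temperature, while precipitation is usually handled with a multiplicative ratio instead:

```python
def delta_method(gcm_future, gcm_historical, obs_baseline):
    """Sketch of the additive 'delta' (change-factor) approach:
    add the GCM-projected mean change (future minus historical
    climatology) onto an observed baseline series."""
    delta = (sum(gcm_future) / len(gcm_future)
             - sum(gcm_historical) / len(gcm_historical))
    return [obs + delta for obs in obs_baseline]

# toy monthly mean temperatures (degC)
hist = [10.0, 11.0, 12.0]
fut = [12.0, 13.0, 14.0]    # GCM projects +2 degC on average
obs = [9.5, 10.5, 11.5]
print(delta_method(fut, hist, obs))  # [11.5, 12.5, 13.5]
```

The simplicity is the point: the delta method preserves the observed spatial and temporal structure exactly, which is why it serves as a baseline against which the more elaborate statistical and dynamical methods are compared.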

Types of protocols

• Idealized scenarios – comparison to synthetic data with known properties
• Perfect model – comparison to a high-resolution GCM; allows evaluation of nonstationarity
• Observational – validation by comparison to observed data
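Under the observational protocol, validation reduces to comparing downscaled values against an observed baseline. A minimal sketch of two such comparison statistics (bias and RMSE), with made-up toy numbers rather than any of the workshop's actual datasets:

```python
import math

def bias(downscaled, observed):
    """Mean error of the downscaled series relative to observations."""
    return sum(d - o for d, o in zip(downscaled, observed)) / len(observed)

def rmse(downscaled, observed):
    """Root-mean-square error against the observational baseline."""
    return math.sqrt(
        sum((d - o) ** 2 for d, o in zip(downscaled, observed)) / len(observed)
    )

obs = [10.0, 12.0, 11.0, 13.0]
ds = [10.5, 11.5, 11.5, 13.5]
print(bias(ds, obs))  # 0.25
print(rmse(ds, obs))  # 0.5
```

The perfect-model protocol uses the same machinery but substitutes a coarsened high-resolution GCM for the observations, which is what allows nonstationarity to be tested.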

Evaluation Framework: Protocols and Metrics

Groups of metrics

• Group 1 – a standard set of metrics, calculated for all methods, describing the statistical distribution and temporal characteristics of the downscaled data: central tendency; tails of the distribution; variability; temporal characteristics

• Group 2 – sets of metrics useful for specific sectoral and impacts applications: water resources; ecological impacts; human health; agricultural impacts

• Group 3 – sets of metrics used to evaluate climate system processes and phenomena: Southwest monsoon; extreme precipitation processes; atmospheric rivers; other extreme-event-related processes
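The Group 1 metrics can be illustrated with a small sketch. The particular statistics chosen here (mean, 5th/95th percentiles, standard deviation, lag-1 autocorrelation) are assumptions standing in for each category, not NCPP's actual metric suite:

```python
import statistics

def group1_summary(series):
    """Group 1-style summary of a daily climate series: central
    tendency, tails of the distribution, variability, and a simple
    temporal characteristic (lag-1 autocorrelation)."""
    n = len(series)
    mean = statistics.fmean(series)
    var = statistics.pvariance(series)
    q = statistics.quantiles(series, n=20)  # cut points at 5% steps
    # lag-1 autocorrelation as a basic temporal descriptor
    num = sum((series[i] - mean) * (series[i + 1] - mean)
              for i in range(n - 1))
    lag1 = num / (var * n) if var > 0 else 0.0
    return {
        "mean": mean,           # central tendency
        "p05": q[0],            # lower tail
        "p95": q[-1],           # upper tail
        "std": var ** 0.5,      # variability
        "lag1_autocorr": lag1,  # temporal characteristic
    }

s = group1_summary([1.0, 2.0, 3.0, 4.0, 5.0, 4.0, 3.0, 2.0])
print(s["mean"], s["lag1_autocorr"])  # 3.0 0.5
```

Computing the same summary for every downscaled dataset and its observational baseline is what makes the Group 1 comparisons standardized and method-agnostic.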

More detailed architectural diagrams

• Original Vision of NCPP Architecture
• Commodity Governance (CoG) Earth System Grid Federation (ESGF) infrastructure to support the 2013 Workshop
• OpenClimateGIS systems figure

Original Vision of NCPP Architecture: Summer 2011 (not complete or final!)

• Resource layer – federated data archival and access: ESGF, THREDDS, data.gov platforms; data at USGS, PCMDI, NASA, NOAA, …
• Service layer – downscaling and data-formatting services, visualization, faceted data search, bookmarking (OpenClimateGIS, LAS, ESGF search, USGS tools, ENSEMBLES); support for inter-comparison projects and workflows representing solution patterns (Curator display, CoG); federated metadata collection and display (Curator tools, METAFOR, Kepler and other workflow solutions)
• Interface layer – composition and display of guidance documents and other text related to the use of climate data (climate.gov approaches); search and semantic services associated with web content and other sources (Consiliate, Drupal database tools)
• Information interpretation – NCPP website, project workspaces for communities of practice (CoG for community connections)

CoG ESGF Infrastructure to support 2013 Workshop

OpenClimateGIS Systems Figure

Design Considerations: Climate Translator V.0

• Indices – predefined; defined by users
• Geography – define locality; GIS; web mapping
• Evaluation – protocols, metrics, observations
• Analysis & synthesis of information – definitions, sources, metadata, fact sheets, narratives, guidance
• Multiple basic data archives – USGS GeoDataPortal, Earth System Grid, …

NCPP Architecture: What Data Goes Where

• Primary data – existing downscaled datasets; validation datasets (observations or high-res model output). Sources: ESGF (local); other OPeNDAP (e.g. Geodata Portal); local disk (may be at NOAA, NCAR, or at a scientist's institution)
• Quantitative evaluation – computation of indices, if not already available; computation of metrics according to NCPP protocols. Run the evaluation code: NCL; Python (?). Outputs: evaluation data bundles; image bundles (ESGF? local database?)

• Downscaling model components
• Downscaling simulations and ensembles
• Experiments
• Processor component – index/metric code; downscaled datasets
• Evaluation protocols experiment (e.g. NCPP Protocol 1) – groups of metrics; ? experiment ?
• Products of QED
• Code repository linked to the CoG environment (ideally)
• CIM documents
• New indices datasets (?)

Locations of objects are color-coded: orange = CoG or other NCPP database; gray = "don't know yet"

Expert analysis

• Evaluation data bundles; image bundles (ESGF? local database?)
• Text (structured case studies; other text) – products of the workshop/working groups
• Search and compare; further visualization
• Other images (unstructured)
• Translational/interpretive CIM document?
• CoG wiki and linked tools; CIM
• CoG wiki and linked tools, or a GLISA-like CMS/database ???
• Integrate with other NCPP translational info

Design Considerations

• These plots were meant to help define the computational environment to support Workshop 2013. (Read the note sections of the slides.)
• Focus on evaluation of existing data products
• Linking to protocols and metrics; development of the capability to compare and describe gridded data systems
• Separate the output interface into types, to facilitate development of services versus the internal NCPP environment

Two Classes of Evaluation

Evaluation of Methodology
• Important for data set developers
• Informs uncertainty description and translation
• "Perfect Model" strategically central

Evaluation of Data Products
• Important for end users
• Informs data set developers
• Definable problem with our resources
• Fundamental descriptions are of value and support NCPP's mission

2013 Workshop Focus – Evaluation of Data Products

• Quantified Description Environment (QDE)
• Focus on T and P; quantify differences in standard data sets
  – Data set choice criteria
  – Meaningful contribution
• Standard treatment across datasets
  – Gridded
• What is in the literature?


Quantified Description Environment (QDE): Input → Calculation → Output

QDE: Input

• Station data (observations)
• Gridded data: observations; models

QDE: Output

• Research environment
• Support of services
• End-user / us & not us
• Digital data: primary data; derived data
• Non-digital data: software; descriptions (structured, unstructured)
• Environments (us & not us): analysis; collaborative; end-user

2013 Workshop and NCPP

NCPP Strategy and Projects

• Workshop in 2013
  – a focal point and an integration of all NCPP projects
  – the start of a progression of workshops that focus the overall evaluation activity and strategy of NCPP

Workshop 2013 as integration point: Climate Indices · Downscaling Evaluation · NC CSC Downscaling Metadata · NCPP Software Environment · Interagency Community

Workshop Goals

• Quantitative evaluation

• Infrastructure support

• Description, guidance and interpretation

• Informed decision-making

Principles and values

• Standardization
• Reproducibility and transparency
• Comparability of methods
• Extensibility
• Co-development

Contributions to NCPP development goals

I. Evaluation standards
– Develop a suite of evaluation metrics for downscaled data
– Design a common suite of tests to evaluate downscaling methods

II. Guidance documents
– Produce guidance on the advantages and limitations of various downscaling techniques for specific user applications, based on the quantitative evaluations
– Inform development of standard, vetted downscaled climate prediction and projection products

III. Translation for users and development of metadata
– Educate users in the evaluation and application of downscaled climate prediction and projection products
– Develop searchable structured metadata to describe downscaling methods as well as their evaluations
– Develop an initial platform for data discovery, exploration, and analysis that serves and makes use of the translational information

IV. Cyber infrastructure
– Develop information technology infrastructure to support community analysis and provision of climate information.
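Goal III above calls for searchable structured metadata describing downscaling methods and their evaluations. A toy sketch of what such a record, and a faceted search over it, might look like – the field names and values here are hypothetical illustrations, not the CIM schema:

```python
# Hypothetical metadata record; fields are illustrative, not CIM.
record = {
    "method": "BCSD",
    "type": "statistical",
    "variables": ["tas", "pr"],
    "evaluations": [{"protocol": "observational", "metric_group": 1}],
}

def matches(rec, **criteria):
    """Toy faceted search: a record matches when every
    requested field equals the requested value."""
    return all(rec.get(k) == v for k, v in criteria.items())

print(matches(record, method="BCSD", type="statistical"))  # True
print(matches(record, type="dynamical"))                   # False
```

Structured records of this kind are what make the discovery platform in III possible: users can filter methods by type, variable, or available evaluations instead of reading prose descriptions.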