An update on obs4MIPs (www.earthsystemcog.org/projects/obs4mips)
• A very brief review and status report of CMIP (a WGCM activity)
• Progress with obs4MIPs since WDAC2
• Critical infrastructure (standards, conventions and technologies)
• obs4MIPs-CMIP6 meeting (last week at NASA HQ)
• The WDAC task team and next steps . . .
WDAC Observations for Model Evaluation Task Team
P. Gleckler and D. Waliser (WDAC task team co-chairs), with substantial input from all other team members
(especially K. Taylor, R. Ferraro and V. Eyring)
[Figure: CMIP6 Timeline. Nominal simulation period of CMIP6: 2014–2020. The CMIP DECK (Diagnostic, Evaluation and Characterization of Klima), with standardized metrics & assessment, runs continuously alongside successive model versions (1–4) and the CMIP6-endorsed MIPs. Scenario development proceeds in parallel: community input on CMIP6 design; formulation of scenarios to be run by AOGCMs and ESMs; forcing data (harmonization, emissions to concentrations); preliminary ESM/AOGCM runs with new scenarios; finalized experiment design (WGCM); ScenarioMIP studies (MIP matrix, pattern scaling, scenario pairs); running and analysis of scenario simulations from the matrix; future projection runs; and a possible IPCC AR6.]
Many are contributing to obs4MIPs
Jet Propulsion Laboratory / California Institute of Technology
• obs4MIPs was initiated by JPL (R. Ferraro, J. Teixeira and D. Waliser) and PCMDI (P. Gleckler, K. Taylor), with oversight and support provided by NASA (T. Lee) and U.S. DOE (R. Joseph)
• Several dozen scientists (thus far especially in NASA and CFMIP-OBS) are doing substantial work to contribute datasets
• Encouragement at WDAC2 to “internationalize” with a task team (now in place)
• Participation is broadening, and the hope is that sponsorship will broaden as well
obs4MIPs: The 4 Commandments
1. Use the CMIP5 simulation protocol (Taylor et al. 2009) as a guideline for selecting observations; a matching model variable is required.
2. Format/structure observations the same as CMIP model output (e.g., netCDF files, CF conventions).
3. Include a Technical Note for each variable describing the observation and its use for model evaluation (written at the graduate-student level).
4. Host the data side by side with CMIP model output on the ESGF.
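To make requirements 1–3 concrete, here is a hypothetical CDL (netCDF header) sketch of what an obs4MIPs file could look like. The variable, dimensions, and attribute values are illustrative only; the authoritative set of required attributes is defined by the obs4MIPs data specification, not this sketch.

```
netcdf ta_AIRS_L3_RetStd-v5_200209-201105 {
dimensions:
    time = UNLIMITED ;   // monthly means
    plev = 17 ;
    lat = 180 ;
    lon = 360 ;
variables:
    float ta(time, plev, lat, lon) ;
        ta:standard_name = "air_temperature" ;  // CF name matching the CMIP variable "ta"
        ta:units = "K" ;
        ta:cell_methods = "time: mean" ;

// global attributes:
    :Conventions = "CF-1.4" ;
    :institution = "NASA Jet Propulsion Laboratory" ;       // illustrative value
    :source = "AIRS Level-3 retrieval, version RetStd-v5" ; // illustrative value
}
```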
[Diagram: model output variables and satellite retrieval variables are linked through common target quantities; the initial target community comprises modelers, observation experts, and the analysis community.]
obs4MIPs “Technical Note”
Content (5–8 pages):
• Intent of the Document
• Data Field Description
• Data Origin
• Validation and Uncertainty Estimate
• Considerations for use in Model Evaluation
• Instrument Overview
• References
• Revision History
• Point of Contact
obs4MIPs: Current Set of Observations
• AIRS (≥ 300 hPa): temperature profile, specific humidity profile
• MLS (< 300 hPa): temperature profile, specific humidity profile
• QuikSCAT: ocean surface winds
• TES: ozone profile
• AMSR-E: SST
• ATSR (ARC/CMUG): SST
• TOPEX/JASON: SSH
• CERES: TOA & surface radiation fluxes
• TRMM: precipitation
• GPCP: precipitation
• MISR: aerosol optical depth
• MODIS: cloud fraction, aerosol optical depth
• NSIDC: sea ice (in progress)
Provided by CFMIP-OBS:
• CloudSat: clouds
• CALIPSO: clouds & aerosols
• ISCCP: clouds
• PARASOL: clouds & aerosols
Provided by ana4MIPs (reanalysis):
• ECMWF: zonal winds, meridional winds (expected to expand to other fields and sources of reanalysis)
Initial in-situ example:
• ARMBE/DOE: clouds, radiation, meteorology, land surface, etc. (discussions on protocols for in-situ holdings underway)
Critical Infrastructure for CMIP, obs4MIPs and related activities
obs4MIPs-CMIP6 Planning Meeting Apr 29 – May 1, 2014
Why data standards?
Standards facilitate discovery and use of data:
• MIP standardization has increased steadily over more than two decades
• The user community has expanded from hundreds to around 10,000
Standardization requires:
• Conventions and controlled vocabularies
• Tools enforcing or facilitating conformance
Standardization enables:
• The ESGF federated data archive
• Uniform methods of reading and interpreting data
• Automated methods and “smart” software to analyze data efficiently
What standards is obs4MIPs building on?
netCDF – an API for reading and writing certain types of HDF formatted data
(www.unidata.ucar.edu/software/netcdf/)
CF Conventions – providing a standardized description of the data contained in a file (cf-convention.github.io)
Data Reference Syntax (DRS) – defining the vocabulary used to uniquely identify MIP datasets and to specify file and directory names (cmip-pcmdi.llnl.gov/cmip5/output_req.html)
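As a concrete illustration of how a DRS works, the sketch below composes a filename from facet values. The facet names and ordering here are assumptions modeled on the CMIP5 pattern; the exact obs4MIPs DRS extensions were still under discussion at the time of this update.

```python
def obs4mips_filename(variable, source, processing_level, version, time_range):
    """Compose a DRS-style filename from facet values.

    Illustrative pattern (not the official obs4MIPs specification):
    <variable>_<source>_<level>_<version>_<timerange>.nc
    """
    return "_".join([variable, source, processing_level, version, time_range]) + ".nc"

# A hypothetical AIRS Level-3 monthly-mean air temperature product:
print(obs4mips_filename("ta", "AIRS", "L3", "RetStd-v5", "200209-201105"))
# → ta_AIRS_L3_RetStd-v5_200209-201105.nc
```

Because every facet has a controlled vocabulary, a filename built this way can be parsed unambiguously by “smart” analysis software and by the ESGF publication tools.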
CMIP output requirements – specifying the data structure and metadata requirements for CMIP, including the CMIP “standard output” list (cmip-pcmdi.llnl.gov/cmip5/output_req.html)
Climate Model Output Rewriter (CMOR) – used to produce CF-compliant netCDF files that fulfill the requirements of standard model experiments. Much of the metadata written to the output files is defined in MIP-specific tables, on which CMOR relies.
Earth System Grid Federation (ESGF)
The Earth System Grid Federation (ESGF) Peer-to-Peer (P2P) enterprise system is a collaboration that develops, deploys and maintains software infrastructure for the management, dissemination, and analysis of model output and observational data.
The US DOE has funded development of this system for over a decade.
esgf.org
The WCRP supports adoption and extension of the CMIP standards for all its projects/activities
Working Group on Coupled Modelling (WGCM) mandates use of established infrastructure for all MIPs
• Standards and conventions• Earth System Grid Federation (ESGF)
The WGCM has established the WGCM Infrastructure Panel (WIP) to govern evolution of standards, including:
• CF metadata standards
• Specifications beyond CF guaranteeing fully self-describing and easy-to-use datasets (e.g., CMIP requirements for output)
• Catalog and software interface standards ensuring remote access to data, independent of local format (e.g., OPeNDAP, THREDDS)
• Node management and data publication protocols
• Defined dataset description schemes and controlled vocabularies (e.g., the DRS)
• Standards governing model and experiment documentation (e.g., CIM)
Earth System Commodity Governance (CoG) – this is NEW! (earthsystemcog.org)
CoG enables users to create project workspaces, connect projects, share information, and seamlessly link to tools for data archival.
• CoG is integrated with the ESGF data holdings
• It is easy to tailor to project-specific needs
• The primary interface to CMIP and related MIPs is likely to migrate to CoG in the next few months
• All obs4MIPs information is now hosted on CoG, including access to data
Immediate infrastructural needs of obs4MIPs
CMOR needs to be generalized to better handle observational data:
• CMOR was developed to meet modeling needs
• Some of the attributes written by CMOR don’t apply to observations (e.g., model name, experiment name)
Modifications and extensions are needed for the DRS (to be proposed by the WDAC obs4MIPs Task Team to the WIP)
A streamlined “recipe” is needed for preparing/hosting obs4MIPs datasets
CoG should be further refined to meet obs4MIPs requirements
Obs4MIPs / CMIP6 Planning meeting
• Invitation-only, held at NASA HQ (Washington, DC), April 29 – May 1
• ~ 55 attendees: a very diverse mix of data experts (mostly satellite), modelers, and agency representatives from NASA, DOE, NOAA, ESA, EUMETSAT, and JAXA
• Focused on identifying opportunities to improve the use of existing satellite datasets in CMIP (model evaluation and research)
Obs4MIPs / CMIP6 Planning meeting
Day 1
• Background presentations
• Atmospheric composition & radiation
• Atmospheric physics
• Moderated discussion topics:
- CMIP6 forcing data sets
- High-frequency observations for CMIP6
- High spatial resolution for CMIP6
- Geostationary data?
- Going beyond satellite data?
Day 2
• Broadening involvement: agency perspectives, reanalysis
• Terrestrial water & energy, land cover/use
• Carbon cycle
• Oceanography & cryosphere
• Moderated discussion topics:
- Satellite simulators
- Reanalysis (relationship to ana4MIPs)
Day 3 (morning)
• Rapporteur summaries
• General discussion of future directions
• A brief post-meeting gathering of the WDAC task team
Example topics
• Some specific recommendations (what we were after!)
• More data sets – which ones, and at what priority?
• Higher frequency (strong interest in this)
• Process & model development focus – how to?
• Satellite simulator / observation proxy priorities?
• Relaxing the “model-equivalent” criteria – how far?
• Better characterization of observational uncertainty needed
• Optimizing connections to ana4MIPs / reanalysis
• Use of averaging kernels – how far?
• Geostationary priorities / guidance
• Gridded in-situ data sets
• In-situ – where to start, how far to go?
Great discussion, lots of thoughtful feedback & recommendations
WDAC & Task Team
Meeting open forum: mixed views on several key issues
• Should obs4MIPs continue its focus on “enabling” CMIP research, or should it strive to lead it?
• Should obs4MIPs filter datasets based on quality? What about “obsolete” datasets?
• How many different datasets of the same observable should be included? Do we want 10 different SST products?
• Should obs4MIPs expand beyond its focus on “near-globally gridded data sets”, and if so, how quickly and by how much?
The WDAC task team & WDAC need to take a position on these issues.
A range of possibilities for obs4MIPs guidelines/requirements:
Alignment with CMIP model output (& use for evaluation), from strictest to loosest:
• Strict alignment
• Alignment via simple steps (e.g., ratio, column average)
• Alignment via a simulator or observational operator
• Indirect alignment, or alignment via complex processing steps
• No requirement / no obvious near-term use for evaluation
In each case, the process/value should be demonstrated via the peer-reviewed literature.
Should we construct a “Maturity Matrix” or Model Evaluation Readiness Level (MERL)?
Evidence for data quality could be graded along axes such as:
• None <-> demonstrated via a robust cal/val activity
• Significant <-> no influence from models
• Subjective <-> objective error estimates
• Large <-> small errors/uncertainties
• Demonstrated via N(?) uses in the peer-reviewed literature
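If a MERL were adopted, scores along the quality axes above might be tallied into a single readiness level. The sketch below is purely illustrative, since no such scheme has been agreed: the axis keys, the 0–5 scale, and the equal weighting are all assumptions.

```python
# Hypothetical quality axes, paraphrasing the spectrum discussed above.
AXES = {
    "calval": "None <-> demonstrated via robust cal/val activity",
    "model_influence": "Significant <-> no influence from models",
    "error_estimates": "Subjective <-> objective error estimates",
    "uncertainty": "Large <-> small errors/uncertainties",
    "literature": "Unused <-> N uses in peer-reviewed literature",
}

def merl(scores):
    """Average per-axis scores (each 0-5) into one readiness level."""
    missing = set(AXES) - set(scores)
    if missing:
        raise ValueError(f"missing axes: {sorted(missing)}")
    return round(sum(scores.values()) / len(scores))

# A hypothetical, reasonably mature satellite product:
example = {"calval": 4, "model_influence": 3, "error_estimates": 4,
           "uncertainty": 3, "literature": 5}
print(merl(example))  # → 4
```

A real scheme would likely weight the axes unequally (e.g., cal/val evidence more heavily than literature count), which is exactly the kind of design question the task team would need to settle.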
WDAC task team discussions (90 min, post obs4MIPs meeting)
Team members: P. Gleckler (co-chair; PCMDI), D. Waliser (co-chair; JPL), S. Bony (IPSL), M. Bosilovich (GSFC), H. Chepfer (IPSL), V. Eyring (DLR), R. Ferraro (JPL), R. Saunders (MOHC), J. Schulz (EUMETSAT), K. Taylor (PCMDI), J.-N. Thépaut (ECMWF)
• The task team should oversee the connection between obs4MIPs and ana4MIPs, but not ana4MIPs itself
• Connecting with GEWEX assessments
• Review existing “Maturity Matrix” applications and consider building on them to create a Model Evaluation Readiness Level (MERL)
• Considering ex-officio members from agencies contributing or planning to contribute (ESA, NASA, etc.); liaisons with other MIPs may also be necessary
• Augmenting CMOR to better handle observations is urgently needed
• For now, WebEx sessions are being considered on a roughly bimonthly basis
Some closing thoughts on how all this works
Standards and infrastructure fundamentally rely on community consensus:
• They must serve the needs of multiple projects
• They must meet the diverse needs of a broad spectrum of users
• They require substantial coordination
• They demand community input and oversight drawing on the expertise of a diverse set of scientists and data specialists
The current system is fragile in that many of its components are funded by individual projects that could disappear, impairing the viability of the entire infrastructure.
Every funder is essential to supporting this collaborative effort, in which:
• Leadership and recognition are shared across multiple projects
• Integration of the “whole” imposes external requirements on the individual components.