
CMIP5 / HI-RES AMIP, and an update on WGNE/WGCM Performance Metrics Panel



Presentation Transcript


  1. CMIP5 / HI-RES AMIP, and an update on WGNE/WGCM Performance Metrics Panel Peter J. Gleckler Program for Climate Model Diagnosis and Intercomparison (PCMDI) Lawrence Livermore National Laboratory Presented to 26th Session of the Working Group on Numerical Experimentation Tokyo, Japan 18-22 October 2010

  2. What made the difference in CMIP3? Investment in infrastructure and development of standards • Community-developed metadata conventions: the Climate and Forecast (CF) metadata convention • Software to ensure data complies with the conventions: the Climate Model Output Rewriter (CMOR) • State-of-the-art data delivery methods: the Earth System Grid (ESG)
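As an illustration of what the CF convention buys, here is a minimal sketch of writing CF-style metadata with the netCDF4-python library. It stands in for CMOR (which automates and enforces this against the official variable tables); the file name, grid, and attribute values are illustrative assumptions, not taken from an actual CMIP5 table.

```python
# Minimal sketch of CF-style metadata for a model output file.
# netCDF4-python is used here as a stand-in for CMOR; attribute
# values are illustrative, not from an official CMIP5 table.
import numpy as np
from netCDF4 import Dataset

nc = Dataset("tas_example.nc", "w")
nc.Conventions = "CF-1.4"          # declare the convention being followed
nc.title = "Example AMIP surface air temperature"

nc.createDimension("lat", 90)
nc.createDimension("lon", 180)

lat = nc.createVariable("lat", "f4", ("lat",))
lat.standard_name = "latitude"     # CF standard names make variables
lat.units = "degrees_north"        # unambiguous across models
lat[:] = np.linspace(-89, 89, 90)

lon = nc.createVariable("lon", "f4", ("lon",))
lon.standard_name = "longitude"
lon.units = "degrees_east"
lon[:] = np.linspace(0, 358, 180)

tas = nc.createVariable("tas", "f4", ("lat", "lon"))
tas.standard_name = "air_temperature"
tas.units = "K"
tas[:] = 288.0 + np.zeros((90, 180))  # placeholder field

nc.close()
```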

  3. CMIP5 output will be served by federated centers around the world and will appear to be a single archive

  4. CMIP5 Atmosphere-Only Experiments (targeted at computationally demanding and NWP models)
  • AMIP ensemble (1979-2008)
  • Future “time-slice” ensemble (2026-2035)
  • AMIP SSTs with 4xCO2
  • Aqua planet (clouds)
  • Patterned ΔSST (clouds)
  • Uniform ΔSST (clouds)
  Core: 40 yrs; Tier 1: ≥185 yrs; Tier 2: 30 yrs. A sketch of the ΔSST perturbations follows below.
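To make the ΔSST experiments concrete, here is a minimal sketch of constructing the two perturbed SST boundary conditions; the +4 K magnitude, the array shapes, and the mean-preserving scaling of the patterned anomaly are illustrative assumptions, not the official experiment specification.

```python
# Illustrative construction of perturbed AMIP SST boundary conditions.
# Placeholder fields stand in for the real AMIP SSTs and a
# coupled-model warming pattern.
import numpy as np

sst_amip = 295.0 + 5.0 * np.random.rand(90, 180)  # placeholder AMIP SSTs (K)
delta_pattern = np.random.rand(90, 180)           # placeholder warming pattern

# Uniform perturbation: the same warming added everywhere.
sst_uniform = sst_amip + 4.0

# Patterned perturbation: keep the spatial structure of the
# coupled-model response, rescaled so its mean warming is also 4 K
# (an assumption for this sketch).
sst_patterned = sst_amip + 4.0 * delta_pattern / delta_pattern.mean()
```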

  5. For each suite of experiments: model participation and model latitude resolution. Requested output: 2.3 Pbytes

  6. CMIP5 participation - of potential interest to WGNE? A theme related to resolution, led by NWP contributions? • AMIP simulations: • an ‘ultra-high’ resolution run(s) • an ensemble of ‘traditional climate’ resolution (~CMIP5 coupled) • Aqua-planet experiments • Transpose AMIP

  7. CMIP5 participation - of potential interest to WGNE? • Reminder: AMIP, aqua-planet intercomparison and AMIPT origins are with WGNE • Opportunity to engage with the coupled modeling community, and have an important impact (high-resolution importance?) • Substantial infrastructure to exploit • Data would be highly visible to thousands of diagnosticians • It's reasonable to think beyond AR5 deadlines • A 'champion' is needed to coordinate NWP center contributions – not a big effort

  8. Climate Model Performance Metrics

  9. IPCC Meeting on Assessing and Combining Multi-Model Climate Projections, Boulder, Jan 2010 • Objective: to summarize methods used in assessing the quality and reliability of climate model simulations and in combining results from multiple models • Impact: • Intended as a guide for future IPCC Lead Authors and scientists using results from model intercomparisons • Provides “good practice” guidance on using multi-model ensembles for detection and attribution, model evaluation, and global climate projections, as well as regional projections relevant for impact and adaptation studies • Illustrates the potential for, and limitations of, combining models for selected applications. IPCC moving beyond “one model, one vote”

  10. PCMDI-NASA meeting, Oct 12-13: Making NASA satellite data more useful for climate model evaluation. Expert description of products relevant to climate models: • CERES – radiative fluxes (N. Loeb, LaRC) • CloudSat – cloud profiles and properties (G. Stephens, JPL) • MISR – clouds, winds, aerosols (D. Diner, JPL) • CALIPSO – clouds / aerosols (D. Winker, LaRC) • AIRS, AMSU-A Aqua – water vapor and temperature (E. Fetzer, JPL) • MODIS, Aqua/Terra – cloud products (S. Platnick, GSFC) • TOMS, SBUV, OMI – ozone (P. Bhartia, GSFC) • MLS, Aura – atmosphere profiles (M. Santee, JPL) • TES, Aura – atmospheric composition (K. Bowman, JPL)

  11. PCMDI-NASA meeting, Oct 12-13: Making NASA satellite data more useful for climate model evaluation. PCMDI recommendations to NASA teams: • Expert judgment in the choice of product versions • Technical alignment with the CMIP5 data structure • Documentation relevant to model evaluation • Transparency – from dataset version to documentation • Data to be made available alongside CMIP5 model output • Effort focused on datasets directly comparable to CMIP5

  12. An update of the WGNE/WGCM Climate Model Metrics Panel • Panel initiated by WGNE, now a joint WGNE/WGCM effort • Members, selected for relevant and diverse experience and the potential to liaise with key WCRP activities: • Beth Ebert (BMRC) – JWGV WWRP/WGNE • Veronika Eyring (DLR) – WGCM/SPARC/AC&C • Pierre Friedlingstein (U. Exeter) – IGBP (new member) • Peter Gleckler (PCMDI), panel chair – CMIP5, WGNE • Robert Pincus (NOAA) – GEWEX/GCSS • Karl Taylor (PCMDI) – CMIP5, WGCM • Helene Hewitt (Met Office) – WGOMD (new member)

  13. Questions motivating routine climate model metrics
  Of direct concern to the WGNE/WGCM metrics panel:
  • Are models improving? How rapidly?
  • Are some models more realistic than others?
  Other research drivers for climate model metrics:
  • How does skill in simulating observed climate relate to the credibility of projections?
  • Can we justify weighting models, based on metrics of skill, to optimize the use of multi-model ensembles in making projections of climate change? (An illustrative weighting sketch follows below.)
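One way to pose the weighting question quantitatively is to downweight models with larger errors against observations. Below is a minimal sketch assuming per-model RMSE scores are already in hand; inverse-MSE weights are one illustrative choice among many, not a panel recommendation, and the projection values are placeholders.

```python
# Minimal sketch of skill-based ensemble weighting.
# RMSE values loosely echo the ta850 ANN column of the sample table;
# the projected changes are invented placeholders.
import numpy as np

rmse = np.array([1.68, 4.69, 1.35, 1.64])     # per-model RMSE vs observations
projections = np.array([2.1, 3.4, 1.9, 2.6])  # per-model projected change (K)

weights = 1.0 / rmse**2     # downweight models with larger errors
weights /= weights.sum()    # normalize so weights sum to 1

weighted_mean = np.sum(weights * projections)
equal_mean = projections.mean()  # the "one model, one vote" baseline
print(weighted_mean, equal_mean)
```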

  14. Metrics panel terms of reference (working version)
  OBJECTIVES: Identify and promote a limited set of routine metrics in an attempt to establish performance benchmarks for climate models
  Standard set:
  • Based on comparison with carefully selected observations
  • Easy to calculate, reproduce and interpret
  • Established in the peer-reviewed literature
  • Covering a diverse suite of climate characteristics
  • Emphasizing large- to global-scale measures of mean climate (and limited variability?)
  An expanded set:
  • Facilitate research & development of increasingly in-depth metrics via coordination with other WCRP activities

  15. Straw-man pathway: a metrics hierarchy
  Standard annual cycle metrics (nearing a beta version):
  • Where appropriate, adhering to documented WMO NWP and seasonal prediction verification standards
  • ~10-20 large- to global-scale statistical or “broad-brush” metrics
  • Domains: global, tropical, NH/SH extra-tropics
  • 30-year climatologies: annual mean, 4 seasons
  • Metrics: centered RMS, bias, MSE, correlation, standard deviation (see the sketch below)
  • Field examples: OLR, T850, SST…
  • Observations: work in progress
  Extended set of metrics: more targeted, quantifying skill in simulating important processes. Examples stemming from other WCRP metrics efforts:
  • Ocean (CLIVAR basin panels)
  • ENSO (CLIVAR Pacific Panel)
  • MJO (Task Force)
  • GCSS/CFMIP
  • …
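A minimal sketch of the listed statistics, assuming model and reference climatologies on a shared latitude-longitude grid: area weighting and the "centered" convention (removing each field's weighted mean before computing RMS and correlation) follow standard practice, but the details here are illustrative rather than the panel's exact specification.

```python
# Minimal sketch of the large-scale statistics named above, for
# model and reference climatologies on the same lat-lon grid.
import numpy as np

def area_weights(lat_deg, nlon):
    """cos(latitude) weights, broadcast to the 2-D grid, normalized to sum to 1."""
    w = np.cos(np.deg2rad(lat_deg))[:, None] * np.ones((1, nlon))
    return w / w.sum()

def metrics(model, ref, w):
    bias = np.sum(w * (model - ref))          # area-weighted mean error
    mse = np.sum(w * (model - ref) ** 2)      # area-weighted mean squared error
    # Centered statistics remove each field's area-weighted mean first.
    m = model - np.sum(w * model)
    r = ref - np.sum(w * ref)
    centered_rms = np.sqrt(np.sum(w * (m - r) ** 2))
    std_model = np.sqrt(np.sum(w * m**2))
    std_ref = np.sqrt(np.sum(w * r**2))
    corr = np.sum(w * m * r) / (std_model * std_ref)  # pattern correlation
    return dict(bias=bias, mse=mse, centered_rms=centered_rms,
                corr=corr, std_model=std_model, std_ref=std_ref)

# Placeholder fields on a 2-degree grid, standing in for real climatologies.
lat = np.linspace(-89, 89, 90)
w = area_weights(lat, 180)
model = 280 + np.random.randn(90, 180)
ref = 280 + np.random.randn(90, 180)
print(metrics(model, ref, w))
```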

  16. Targeting WCRP benchmark experiments
  Panel to focus its purview on metrics for WGNE/WGCM benchmark experiments. Emphasis is on:
  • CMIP “historically forced” (“20C”) and AMIP simulations
  But the panel will consider metrics for:
  • Historical ESM experiments (e.g., 20th century [CO2])
  • Initial condition experiments: evaluation in “NWP mode” (AMIPT)

  17. SAMPLE Metrics – ABSOLUTE SCORES
  Air temperature at 850 hPa (ta850); UPDATED: Thu Sep 9 10:57:40 PDT 2010
  GLOBAL CLIMATOLOGICAL ANNUAL CYCLE (1980-1999); Root Mean Square (REF = ERA40)

  MODEL              DJF     MAM     JJA     SON     ANN
  gfdl_cm2_0         2.047   1.986   1.566   1.822   1.682
  iap_fgoals1_0_g    4.451   5.957   6.173   4.492   4.693
  cccma_cgcm3_1      1.774   1.738   1.827   1.418   1.348
  ncar_pcm1          2.784   2.288   1.784   1.440   1.644
  giss_model_e_h     2.482   2.224   1.830   1.838   1.868
  giss_aom           2.011   1.587   1.784   1.546   1.422
  miroc3_2_hires     1.726   1.479   1.516   1.545   1.408
  …

  18. Metrics panel terms of reference (working version) – IMPLEMENTATION
  • Ensure these metrics are applied in CMIP5 operationally (PCMDI has agreed to do this with panel oversight)
  • Results accessible via the CMIP5 data portal
  • Transparency: codes and observations to be made publicly available
  Metrics group web site (under development):
  • Post results for all CMIP3 and CMIP5 simulations
  • Clear description of the value/limitations
  • Description of the panel's terms of reference, etc.
  • Relevant research and references for future metrics

  19. Timelines
  Early 2011:
  • Panel identifies its “standard set” and observations to be used (beta)
  • Modeling groups offered the opportunity to provide feedback
  • Coordinate with other WCRP activities; begin work on the “extended set”
  • Standard metrics coded (PCMDI) and applied to the CMIP3 database
  • Panel webpage launched (initially permission-based)
  Mid 2011:
  • WGNE/WGCM standard set “finalized” (WGNE/WGCM-Std-Metrics-V1.)
  • Code and observations made available
  • CMIP5 results computed as simulations are submitted, and made public after each modeling group has reviewed its results

  20. One important target – the IPCC AR5. Journal article submission deadline: 31 July 2012
  What can we practically expect from the metrics panel effort?
  • BAMS article describing the panel effort and results of the standard metrics for CMIP5 (versus CMIP3)
  • A limited number of ‘extended’ metrics: ENSO, MJO, WGOMD ocean metrics? …

  21. WGCM discussed the metrics panel two weeks ago… • Get back to the modeling groups on why a grade is good/bad (i.e., find out the reasons) • Danger of overemphasis on the few diagnostics that are published • Publish the results on the metrics panel website, not on the CMIP5 website • Let the model groups comment before release! • Focus on process-oriented diagnostics/metrics as much as possible • Conclusion: password-protected website; iterate with the model groups first before releasing to the public
