© Crown copyright 2005 Page 1

Using metrics to assess ocean and sea ice simulations

Helene Banks, Cath Senior, Jonathan Gregory

Alison McLaren, Michael Vellinga

+ input from many Hadley Centre colleagues

WGOMD August 2007

© Crown copyright 2005 Page 2

Outline

History of model assessment at the Hadley Centre: is it representative of other centres?

Proposed way forward: is it consistent with other community initiatives?

© Crown copyright 2005 Page 3

HadCM3: ‘Ad-hoc’ analysis

Main focus on top of the atmosphere balance, SST drifts and heat transports

Analysis was ad-hoc

Model is still being used for lots of applications

SST drifts, years 81-120 (Gordon et al., 2000)

© Crown copyright 2005 Page 4

HadGEM1: More formal acceptance criteria

Acceptance criteria introduced for scientific credibility, e.g. conservation of mass, energy and water

Scientific benchmarks relevant to the ocean (a minimal check against these thresholds is sketched after the list):

Net TOA flux in balance in the control run to better than 0.5 W/m2

Surface air temperature and SST drifts comparable to HadCM3

Global mean SST error less than 0.5 K; local SST errors less than 2 K except in regions of sharp gradients; overall SST and SSS errors superior to HadCM3.

Oceanic circulation stable, with accurate THC strength (NATHC = 20 +/- 5 Sv).

Oceanic poleward heat transport within 20% of observed estimates.

Wintertime sea ice extents within 20% of observed estimates in both hemispheres.

Oceanic water mass (T and S) drifts better than HadCM3.
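
As a concrete illustration of how benchmarks like these can be checked automatically, the sketch below tests a few of the numerical thresholds listed above. It is a minimal, hypothetical example: the diagnostic names and the example numbers are assumptions, and real values would come from the control run diagnostics.

```python
# Illustrative check of HadGEM1-style ocean acceptance criteria.
# The keys of `diag` are hypothetical; real assessments would fill them
# from model and observational diagnostics.

def check_ocean_criteria(diag):
    """Return pass/fail flags for a few of the numerical benchmarks above."""
    return {
        "toa_balance":      abs(diag["net_toa_flux_wm2"]) < 0.5,
        "global_sst_error": abs(diag["global_mean_sst_error_k"]) < 0.5,
        "nathc_strength":   abs(diag["nathc_sv"] - 20.0) <= 5.0,
        "heat_transport":   abs(diag["poleward_heat_transport_pw"]
                                - diag["obs_heat_transport_pw"])
                            <= 0.2 * diag["obs_heat_transport_pw"],
        "nh_ice_extent":    abs(diag["nh_winter_ice_extent_mkm2"]
                                - diag["obs_nh_winter_ice_extent_mkm2"])
                            <= 0.2 * diag["obs_nh_winter_ice_extent_mkm2"],
    }

# Example with made-up numbers:
flags = check_ocean_criteria({
    "net_toa_flux_wm2": 0.3, "global_mean_sst_error_k": 0.4,
    "nathc_sv": 18.0,
    "poleward_heat_transport_pw": 1.9, "obs_heat_transport_pw": 2.0,
    "nh_winter_ice_extent_mkm2": 14.5, "obs_nh_winter_ice_extent_mkm2": 15.0,
})
print(flags)
```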

Introduction of the Climate Prediction Index (CPI)

© Crown copyright 2005 Page 5

CPI used for HadGEM1 (Johns et al., 2006)
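
Broadly, the CPI combines errors across many climate fields into a single number by normalising each field's RMS error so that different quantities can be averaged together (Johns et al., 2006). Below is a minimal sketch of such a normalised-error index; the function names, the choice of normalisation and the area weighting are illustrative assumptions, not the exact Hadley Centre formulation.

```python
import numpy as np

def normalised_rmse(model, obs, sigma_obs, weights):
    """Area-weighted RMS error of one field, normalised by sigma_obs
    (e.g. a measure of observed variability)."""
    err2 = weights * (model - obs) ** 2
    rmse = np.sqrt(err2.sum() / weights.sum())
    return rmse / sigma_obs

def cpi_like_index(fields):
    """Average the normalised RMS errors over all fields.

    `fields` is a list of (model, obs, sigma_obs, weights) tuples;
    smaller values indicate a better simulation.
    """
    return np.mean([normalised_rmse(*f) for f in fields])
```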

© Crown copyright 2005 Page 6

HadGEM2: Beginning to use metrics more formally as part of model development

Focussed on improving ENSO

Acceptance criteria: ENSO criteria

Measures of the tropical mean basic atmosphere and ocean state and ENSO performance to place the model within the pack of leading (IPCC AR4) models in most if not all respects (judged relative to observations)

ENSO simulation judged to be competitive with HadCM3 and GloSea including its skill in idealised predictability experiments (as judged by the seasonal forecasters)

CPI criteria:

No CPI elements judged to be significantly worse than HadGEM1 (i.e. key scientific improvements identified in HadGEM1 over HadCM3 and captured within the CPI should be preserved)

Overall CPI judged to be as good as or better than HadGEM1

Other scientific criteria:

TOA radiative imbalance, energy/mass/freshwater budgets, and magnitude of resultant coupled drifts in the control run judged to be no worse than HadGEM1

Any other key scientific improvements identifiable in HadGEM1 over HadCM3 judged to be preserved [e.g. MJO]

Model to be judged at least equally suitable for general climate variability studies as HadCM3

Substantially increased rainfall over the Indian subcontinent in the summer monsoon compared to HadGEM1 (in the mean) is desirable

© Crown copyright 2005 Page 7

‘Guilyardi’ plot

Sarah Ineson

© Crown copyright 2005 Page 8

HadGEM3: Requirements

HadGEM3 is being developed now

Take a top-down approach to assessment

Requirements for the model defined

These requirements are translated into assessment areas:

1. Conservation

2. Global circulation: atmosphere, radiative balance, hydrological cycle, ocean, sea ice, land surface, stratosphere

3. Regional variability: regional predictions, monsoon, ENSO, MJO, NAO, extremes

4. Seasonal to decadal

© Crown copyright 2005 Page 9

Assessment criteria

In each assessment area, we are defining metrics to assess the simulations and the underlying processes (i.e., is it the right answer for the right reasons?)

As far as possible, use ‘community’ metrics

The metrics are generally not ‘new’

This approach provides a framework for objectively defining the assessment

Move towards a common presentation: summarise each area in (for example) a bar chart to allow a full assessment of the model
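
As an illustration of the common presentation, the sketch below draws one such summary bar chart with matplotlib; the metric names, scores and comfort-zone band are made-up placeholder values, not real assessment results.

```python
import matplotlib.pyplot as plt

# Hypothetical scores (0-100) for one assessment area; higher is better.
metrics = ["Extent", "Area", "Thickness", "Transport"]
scores = [85, 70, 45, 60]

fig, ax = plt.subplots()
ax.bar(metrics, scores, color="steelblue")
# Shaded band indicating the 'comfort zone' (placeholder limits).
ax.axhspan(50, 80, color="lightgrey", alpha=0.5, label="Comfort zone")
ax.set_ylabel("Score")
ax.set_ylim(0, 100)
ax.legend()
fig.savefig("assessment_area_summary.png")
```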

© Crown copyright 2005 Page 10

What is good enough?

Assess against best observations (or leading models in absence of observations)

No pass/fail level

‘Comfort zone’ defined: a fuzzy level based on uncertainties in observations/the pack of leading models

[Example bar chart: Extent, Area, Thickness and Transport each scored 0-100 and classified as High/Medium/Low against a shaded comfort zone]
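
A minimal sketch of how such a ‘comfort zone’ classification could work, assuming each metric is compared with an observed value and its uncertainty; the band widths and the High/Medium/Low labels follow the figure above, but the exact thresholds are assumptions.

```python
# Illustrative 'comfort zone' classification: a metric scores High if it
# falls within the observational uncertainty band, Medium if within a
# widened (fuzzy) band, and Low otherwise. The band widths are assumptions.

def comfort_zone_score(model_value, obs_value, obs_uncertainty, fuzz=2.0):
    """Classify a single metric relative to observations.

    obs_uncertainty: uncertainty on the observed value.
    fuzz: factor by which the band is widened for a 'Medium' score.
    """
    error = abs(model_value - obs_value)
    if error <= obs_uncertainty:
        return "High"
    if error <= fuzz * obs_uncertainty:
        return "Medium"
    return "Low"

# Example: NH winter ice extent of 14.2 million km2 vs observed 15.0 +/- 0.5
print(comfort_zone_score(14.2, 15.0, 0.5))   # -> "Medium"
```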

© Crown copyright 2005 Page 11

HadGEM1 sea ice spin-up

© Crown copyright 2005 Page 12

Proposed sea ice variables to assess

March/Sept NH/SH ice extent & area

Month of maximum/minimum NH/SH ice extent

Seasonal amplitude of NH/SH ice extent

RMS difference of winter ice concentration

Ice extent in selected regions

RMS of central Arctic ice thickness

Gradient of Arctic ice thickness

Maximum ice thickness in NH/SH

RMS ice speed in NH/SH winter

Transport across selected straits

Annual mean northwards ice transport in SH

Summary in a bar chart

cf. McLaren et al., 2006 (an illustrative extent/area/RMS-difference calculation is sketched below)
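
For the extent, area and RMS-difference entries above, the sketch below shows one common way these sea ice metrics are computed from gridded concentration fields; the 15% concentration cutoff for extent is the conventional choice, and the array layout is an assumption for illustration.

```python
import numpy as np

def ice_extent_and_area(conc, cell_area, threshold=0.15):
    """Sea ice extent (total area of cells with concentration above the
    threshold) and ice area (concentration-weighted area), in the units
    of cell_area. Both inputs are 2-D arrays on the same grid."""
    mask = conc > threshold
    extent = cell_area[mask].sum()
    area = (conc * cell_area).sum()
    return extent, area

def rms_concentration_diff(model_conc, obs_conc, cell_area):
    """Area-weighted RMS difference of ice concentration (e.g. winter mean)."""
    w = cell_area / cell_area.sum()
    return np.sqrt((w * (model_conc - obs_conc) ** 2).sum())
```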

© Crown copyright 2005 Page 13

Proposed ocean variables to assess

Temperature (SST, cross-sections/collocations)

Salinity (SSS, cross-sections/collocations)

Mixed layer

Currents (EUC, ACC, Arctic transports)

Upwelling (Equatorial, Basin upwelling)

Ocean transports (heat, freshwater; a heat transport sketch follows this list)

Water masses (T-S, formation)

Budgets (conservation, surface fluxes)

MOC (overflows, transports, etc)

SSH (mean, anomaly)

Mesoscale features (eddy KE, TIWs, Gulf Stream separation, Agulhas)
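
For the ocean heat transport entry, the sketch below shows the standard zonal-vertical integral of the temperature transport at one latitude; the array shapes, constant values and unit conventions are assumptions for illustration rather than any particular model's conventions.

```python
import numpy as np

RHO0 = 1026.0     # reference seawater density (kg m-3)
CP = 3990.0       # specific heat capacity of seawater (J kg-1 K-1)

def poleward_heat_transport(v, theta, dx, dz):
    """Northward ocean heat transport (PW) across one latitude circle.

    v:     meridional velocity (m s-1), shape (nz, nx)
    theta: potential temperature (degC or K), shape (nz, nx)
    dx:    zonal cell widths (m), shape (nx,)
    dz:    layer thicknesses (m), shape (nz,)
    """
    areas = np.outer(dz, dx)                       # cell face areas (m2)
    flux = RHO0 * CP * (v * theta * areas).sum()   # W
    return flux / 1e15                             # convert to PW
```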

© Crown copyright 2005 Page 14

Issues with ocean assessment

Drift is an issue, especially for assessing T-S properties of water masses

Runs of different lengths

Can we make an ‘educated guess’ on whether there will be a long-term impact of drifts?

The use of neutral densities in observational estimates: not easily applied to models

How to combine a large number of criteria into something ‘digestible’: possible use of skill scores (under discussion: Vellinga, Williams, Sexton); an illustrative combination is sketched below
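
One simple way to collapse many criteria into a single number: each metric is first expressed as a normalised error (0 = perfect), mapped to a score, and averaged with optional weights. This is purely illustrative and not the skill-score formulation under discussion.

```python
import numpy as np

def skill_score(normalised_errors, weights=None):
    """Combine normalised errors (0 = perfect) into a single 0-1 skill score.

    Each error is mapped to exp(-error), so zero error gives a score of 1
    and large errors decay towards 0; a weighted mean is then taken.
    """
    e = np.asarray(normalised_errors, dtype=float)
    w = np.ones_like(e) if weights is None else np.asarray(weights, dtype=float)
    scores = np.exp(-e)
    return float((w * scores).sum() / w.sum())

print(skill_score([0.2, 0.5, 1.0]))  # roughly 0.6
```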

© Crown copyright 2005 Page 15

Other community efforts

WGNE workshop on metrics (San Francisco, early 2007)

PCMDI effort???

GODAE (Eric)

GSOP (Detlef)

© Crown copyright 2005 Page 16

Ideas that might be useful for WGOMD/SOPHOCLES

Who are the users and what do they need the model to get right? E.g., carbon uptake in the Southern Ocean

What are the appropriate metrics? Metrics should assess processes as well as the final answer

What are the levels of acceptability?

How to present assessment in an objective manner?

