1
Intercomparisons Working Group activities
Prepared by F. Hernandez K. Lisaeter, L. Bertino,
F. Davidson, M. Kamachi, G. Brassington, P. Oke,
A. Schiller, C. Maes, J. Cummings, E. Chassignet,
H. Hulburt, P. Hacker, J. Siddorn, M. Martin, S.
Dobricic, C. Regnier, L. Crosnier, N. Verbrugge,
M. Drévillon, J-M Lellouche
  • Status of the intercomparison exercise
  • Some examples of diagnostics based on Class 1/2

2
Status of the intercomparison exercise
  • Methodology decided
  • Compare operational/dedicated hindcasts from the
    February-March-April period
  • Consistency and quality assessment (not
    performance)
  • Intercomparison based on Class 1 and Class 2
    metrics, and on reference data
  • Files shared via OPeNDAP/FTP; assessment performed
    by different teams on dedicated ocean basins
  • Preliminary work performed
  • Intercomparison plan endorsed
  • Technical implementation documents (metrics
    definitions) written and distributed

3
The validation "philosophy"
  • Basic principles, defined for ocean hindcasts and
    forecasts (Le Provost 2002, MERSEA Strand 1):
  • Consistency: verifying that the system outputs
    are consistent with the current knowledge of the
    ocean circulation and with climatologies
  • Quality (or accuracy of the hindcast): quantifying
    the differences between the system's best results
    (analysis) and the sea truth, as estimated from
    observations, preferably independent (not
    assimilated) observations
  • Performance (or accuracy of the forecast):
    quantifying the short-term forecast capacity of
    each system, i.e., answering the questions: do we
    perform better than persistence? better than
    climatology?
  • A complementary principle, to verify the interest
    for the customer (Pinardi and Tonani, 2005, MFS):
  • Benefit: end-user assessment of the quality level
    that has to be reached before the product is
    useful for an application

4
Metrics definition (MERSEA heritage)
  • CLASS 1: regular grid at a few depths, daily
    averages
  • Comparison of the 2D model surface fields with
  • - SST
  • - SLA
  • - SSM/I ice concentration and drift for the
    Arctic and Baltic areas
  • Comparison of each model's (T,S) with
    climatological (T, S, mixed layer depth) at
    several depths (0 m, 100 m, 500 m, 1000 m, ...)
  • CLASS 2: high-resolution vertical sections
    and moorings
  • Comparison of model sections with climatology
    and with WOCE/CLIVAR/other/XBT hydrographic
    sections
  • Comparison of the model SLA at tide-gauge
    locations, and of model (T,S,U,V) at fixed mooring
    locations
  • CLASS 3: physical quantities derived from
    model variables
  • Comparison of model volume transports with
    available observations (Florida cable
    measurements)
  • Assessment through integrated/derived quantities:
    Meridional Overturning Circulation, warm water
    heat content, etc.
  • CLASS 4: assessment of forecasting
    capabilities
  • Comparison between climatology, forecast,
    hindcast, analysis and observations
  • Comparison in 15x15-degree boxes/dedicated boxes
    of each model with
  • T/S from CORIOLIS, SSM/I sea-ice concentration,
    tide gauges
  • High-resolution SST?
  • AVISO SLA?

5
Class 2/3 MERSEA/GODAE GLOBAL METRICS Online
Systematic Diagnostics
6
  • Compute Class 4 statistics
  • per geographical box, or in regular 5x5-degree
    boxes
  • per vertical layer (0-100 m, 100-500 m, 500-5000 m?)
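The per-box, per-layer statistics described above can be sketched as follows. The box size, layer bounds, and (model, observation) pairs are illustrative placeholders, not GODAE data:

```python
import math

# Hypothetical collocated pairs: (lon, lat, depth_m, model_T, obs_T).
pairs = [
    (12.3, 47.1,   50.0, 14.2, 14.0),
    (13.9, 46.2,  250.0, 11.8, 12.1),
    (27.5, 48.8,   80.0, 13.5, 13.9),
    (28.1, 49.0, 1200.0,  4.1,  4.0),
]

LAYERS = [(0, 100), (100, 500), (500, 5000)]  # metres, as on the slide

def box_key(lon, lat, size=5.0):
    """5x5-degree geographical box index (lon_index, lat_index)."""
    return (math.floor(lon / size), math.floor(lat / size))

def layer_key(depth):
    """Vertical layer containing this depth, or None if below all layers."""
    for lo, hi in LAYERS:
        if lo <= depth < hi:
            return (lo, hi)
    return None

# Accumulate squared model-minus-observation errors per (box, layer).
acc = {}
for lon, lat, depth, model, obs in pairs:
    key = (box_key(lon, lat), layer_key(depth))
    s, n = acc.get(key, (0.0, 0))
    acc[key] = (s + (model - obs) ** 2, n + 1)

rms = {key: math.sqrt(s / n) for key, (s, n) in acc.items()}
for key, value in sorted(rms.items()):
    print(key, round(value, 3))
```

The same accumulation works for any elementary box patchwork: only `box_key` changes when dedicated (irregular) boxes replace the regular 5x5-degree grid.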

Elementary box patchwork
7
Class 4 based on sea ice in the Barents Sea
TOPAZ sea ice vs SSM/I data: RMS of the ice
concentration error (model minus observation) over a
box in the Arctic Ocean. The analysis is compared to
the forecast and to persistence over a 10-day window.
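The forecast-versus-persistence comparison behind this diagnostic can be sketched as follows, assuming ice-concentration fields stored as flat lists and persistence defined as the day-0 analysis held fixed; all values are made up:

```python
import math

def rms(field_a, field_b):
    """Root-mean-square difference between two equally sized fields."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(field_a, field_b))
                     / len(field_a))

# Illustrative daily series of ice-concentration fields (3 grid points each);
# in practice these would cover the 10-day window and the full box.
obs         = [[0.80, 0.60, 0.10], [0.78, 0.58, 0.12], [0.75, 0.55, 0.15]]
forecast    = [[0.80, 0.60, 0.10], [0.77, 0.59, 0.11], [0.76, 0.57, 0.13]]
persistence = [obs[0]] * len(obs)  # day-0 analysis held fixed

for day in range(len(obs)):
    e_fc = rms(forecast[day], obs[day])
    e_pe = rms(persistence[day], obs[day])
    print(f"day {day}: forecast RMS={e_fc:.3f}  persistence RMS={e_pe:.3f}")
```

A forecast system adds value on this metric when its RMS curve stays below the persistence curve as the lead time grows.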
8
Status of the intercomparison exercise
definition of metrics
  • New description of Class 1, Class 2 and Class 3
    metrics
  • Regional areas revisited to fit the recommendations
  • Complete description of moorings, sections, etc.
  • Upgraded NetCDF file definitions to be consistent
    with the COARDS/CF-1.2 conventions
  • Sea-ice variables included in the definitions
  • Half the storage capacity saved by using
    compressed NetCDF files (data written as shorts
    instead of floats, using scale factors)
  • Proposition of a set of reference data
    (availability, access)
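The short-integer compression mentioned above follows the CF packed-data convention, where readers reconstruct `value = packed * scale_factor + add_offset`. A minimal sketch, with illustrative temperature values and range:

```python
# Pack floats into 16-bit integers (half the storage of 32-bit floats).
# The value range here is illustrative, not from the GODAE file definitions.

def pack(values, vmin, vmax):
    """Pack floats into the int16 range with linear scale_factor/add_offset."""
    scale_factor = (vmax - vmin) / (2**16 - 2)  # keep -32768 free as _FillValue
    add_offset = (vmax + vmin) / 2.0
    packed = [round((v - add_offset) / scale_factor) for v in values]
    return packed, scale_factor, add_offset

def unpack(packed, scale_factor, add_offset):
    """Reconstruct the floats a NetCDF reader would see."""
    return [p * scale_factor + add_offset for p in packed]

temps = [2.5, 14.75, 29.9]  # degrees C, hypothetical sea temperatures
packed, sf, off = pack(temps, -2.0, 35.0)
restored = unpack(packed, sf, off)
print(packed, [round(r, 3) for r in restored])
```

Because a 16-bit integer quantizes the range uniformly, the round-trip error is at most half of `scale_factor`; for fields spanning several orders of magnitude (such as the vertical eddy diffusivity in the Class 1 list), LOG10 is stored instead so the quantization error is relative rather than absolute.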

9
Status of the intercomparison exercise
definition of metrics
  • Class 1 definition (provided with Fortran
    programs)

Class 1 standard depths (m): 0, 30, 50, 100, 200,
400, 700, 1000, 1500, 2000, 2500, 3000
10
Status of the intercomparison exercise
definition of metrics
  • Class 1 definition (provided with Fortran
    programs)
  • 2D fields
  • The zonal and meridional wind stress (Pa) on top
    of the ocean
  • The total net heat flux (including the relaxation
    term) (W/m2) into the sea water
  • The surface solar heat flux (W/m2) into the sea
    water
  • The freshwater flux (including the relaxation
    term) (kg/m2/s) into the ocean
  • The Mixed Layer Depth (henceforth MLD) (m). Two
    kinds of MLD diagnostic are provided, to be
    compliant with de Boyer Montégut et al. (2004)
    and D'Ortenzio et al. (2005): a temperature
    criterion (the depth where temperature differs
    from the ocean surface value by 0.2 C), and a
    surface potential density criterion (the depth
    where the surface potential density differs from
    the surface value by 0.03 kg/m3)
  • The Sea Surface Height (SSH) (m)
  • 3D fields
  • The potential temperature (K) and salinity (psu)
  • The zonal and meridional velocity fields (m/s)
  • The vertical eddy diffusivity (kz, in m2/s); if
    compressed, stored as LOG10 first
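The two MLD criteria above reduce to the same threshold search on a vertical profile; a minimal sketch, with an illustrative profile (not observed data):

```python
# MLD as the first depth where a profile value departs from its surface value
# by more than a threshold: 0.2 C for temperature, 0.03 kg/m3 for surface
# potential density, as in the Class 1 definition.

def mld(depths, values, threshold):
    """First depth at which |value - surface value| reaches the threshold."""
    surface = values[0]
    for z, v in zip(depths, values):
        if abs(v - surface) >= threshold:
            return z
    return depths[-1]  # mixed down to the bottom of the profile

depths = [0, 10, 20, 30, 50, 75, 100]                        # m
temp   = [20.0, 20.0, 19.95, 19.9, 19.7, 18.9, 17.5]         # deg C
sigma0 = [25.10, 25.10, 25.11, 25.14, 25.16, 25.30, 25.60]   # kg/m3

print("MLD(T, 0.2 C):", mld(depths, temp, 0.2))        # -> 50
print("MLD(sigma0, 0.03):", mld(depths, sigma0, 0.03)) # -> 30
```

As the example shows, the two criteria need not agree: a salinity-stratified layer can cap the density-based MLD well above the temperature-based one, which is why both diagnostics are requested.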

11
Status of the intercomparison exercise
definition of metrics
  • Class 1 definition (provided with Fortran
    programs)
  • 2D fields (for ARC, ACC, NAT, NPA and GLO)
  • Sea-ice thickness (m)
  • Sea-ice concentration (fraction)
  • Sea-ice x and y velocities (m/s)
  • Surface snow thickness over sea ice (m)
  • Sea-ice downward x and y stress (Pa)
  • Tendency of sea-ice thickness due to
    thermodynamics (m/s)
  • Surface downward heat flux in air (W/m2)
  • Ancillary data
  • The Mean Dynamic Topography (henceforth MDT) (m)
    used as a reference sea level during the
    assimilation procedure. MDT is also called Mean
    Sea Surface Height (MSSH).
  • Climatologies of Sea Surface Temperature (SST)
    (K), of surface current (m/s), of MLD (m).
  • Climatology of potential temperature (K) and
    salinity (psu) fields from (T,S) used as a
    reference.

12
Status of the intercomparison exercise
definition of metrics
  • Class 2 mooring/sections
  • potential temperature (K) and salinity (psu).
  • zonal and meridional velocity fields (m/s).
  • Sea Surface Height (SSH) (m).

13
Status of the intercomparison exercise
definition of metrics
Straight sections (yellow), XBT sections (brown),
glider sections (purple), tide gauges (blue),
and other moorings (red).
78 vertical levels (WOA and GDEM 3.0 standard
levels)
14
Status of the intercomparison exercise
definition of metrics
  • Class 3 definition (transports)

In black: sections without a specific class
decomposition on the vertical. Transports computed
in classes of temperature (red), salinity
(yellow), density (blue) and depth (green).
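A transport-in-classes computation of this kind can be sketched as follows; the section geometry, velocities, and class bounds are illustrative, not taken from any of the systems:

```python
# Class 3 sketch: volume transport through a section, accumulated in
# temperature classes (the red case on the slide). Each section cell
# contributes width * thickness * normal velocity (m3/s).

SV = 1e6  # 1 Sverdrup = 1e6 m3/s

# One cell per entry: (width m, thickness m, normal velocity m/s, temperature C)
cells = [
    (10_000.0, 100.0,  0.50, 22.0),
    (10_000.0, 100.0,  0.30, 18.0),
    (10_000.0, 400.0,  0.10,  8.0),
    (10_000.0, 400.0, -0.05,  4.0),
]

# Temperature classes: lower bound inclusive, upper bound exclusive.
classes = [(-2.0, 5.0), (5.0, 15.0), (15.0, 30.0)]

transport = {c: 0.0 for c in classes}
for width, thick, vel, temp in cells:
    for lo, hi in classes:
        if lo <= temp < hi:
            transport[(lo, hi)] += width * thick * vel
            break

total = sum(transport.values())
for (lo, hi), t in transport.items():
    print(f"{lo:5.1f}..{hi:5.1f} C: {t / SV:6.3f} Sv")
print(f"total: {total / SV:.3f} Sv")
```

Binning by salinity, density, or depth classes only changes which cell property is tested against the class bounds; the accumulation is identical.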
15
Status of the intercomparison exercise
assessment through Class 1-2-3 metrics
  • Consistency: monthly averaged fields compared to
  • WOA2005, Hydrobase, CARS, MEDATLAS, Janssen
    climatologies
  • de Boyer Montégut MLD climatology
  • SST climatology
  • Quality: daily fields compared to
  • In situ data (Coriolis data server)
  • Dynamic topography, or SLA (AVISO products)
  • SST (depending on the group)
  • SSM/I sea-ice concentration and drift products
  • Surface currents (DBCP data, OSCAR, SURCOUF
    products)

16
Status of the intercomparison exercise: where are
we?
  • Partners involved / status

System | Status of 3-month hindcast products | OpenDAP or FTP address | Main regions of interest
MER | Produced | TBC | NAT, TAT, SPA, TPA, IND, ARC
UKM | TBC | TBC | TBC
TOP | TBC | TBC | TBC
BLK | Produced | BlueLink OpenDAP (TBC) | IND, SPA, TBC
MFS | TBC | TBC | TBC
MRI | Produced at MRI, provided to the University of Hawaii | Univ. of Hawaii OpenDAP (TBC) | NPA
HYC | TBC | TBC | TBC
CNF | Produced, needs transition to the new NetCDF format | FTP server of CNOOFS | Northwest Atlantic
17
Status of the intercomparison exercise: where are
we?
  • Agenda
  • Shift (one month so far) in the availability of
    products
  • No clear view of the intercomparison effort in the
    different areas (i.e., how many groups plan
    dedicated work looking at more than their own
    hindcast?)
  • Target: define a deadline so as to be prepared for
    the Symposium
  • Validation and intercomparison of analysis and
    forecast products:
  • F. Hernandez (Mercator-Ocean), G. Brassington
    (BoM), J. Cummings (NRL), L. Crosnier
    (Mercator-Ocean), F. Davidson (DFO), S. Dobricic
    (ICMCC), P. Hacker (Univ. of Hawaii), M. Kamachi
    (JMA), K. A. Lisæter (NERSC), M. Martin (UK Met
    Office)
  • Availability of products (end of July?)
  • Availability of intercomparison results (mid
    October?)
  • Managing the outcomes
  • How do we benefit from the feedback?
  • An initiative to keep this activity going?

18
Assessment diagnostics
[Figure: SST, SST minus WOA05, NOAA RTG SST, and SST minus RTG maps]
19
Assessment diagnostics
20
Assessment diagnostics
[Figure: salinity, salinity minus WOA05, and surface currents compared to drifters]