
Unified Model Documentation Paper C02

Coupled Models Technical Overview

UM Version : 10.1
Last Updated : 2014-12-05 (for vn10.0)
Owner : Richard Hill

Contributors:
R. Hill, C. Harris, D. Pearson

Met Office
FitzRoy Road
Exeter
Devon EX1 3PB
United Kingdom

© Crown Copyright 2015


This document has not been published; permission to quote from it must be obtained from the Unified Model
system manager at the above address.

Contents
1  Introduction
2  Component Models
3  Couplers and NEMO-CICE
4  Compilation
5  Data structures
6  Coupling fields
   6.1 Ocean to Atmosphere
   6.2 Atmosphere to Ocean
7  Coupling transformations
8  MPP Issues
   8.1 Data Structure
   8.2 Coupling Transformations
   8.3 MPI Issues
9  Code structure
10 Atmosphere Grid Considerations
11 Multiple-category ice thickness distribution (ITD) scheme
12 What happens at run time
13 Couplers and the UM IO Server
   13.1 OASIS3 and the UM IO Server
   13.2 OASIS3-MCT and the UM IO Server
14 Date handling and calendar considerations
15 Climate meaning and restarts
16 Error Handling in Coupled Components
17 Figures
   17.1 Figure 1. Top Level Atmos - NEMO-CICE Ocean coupling code
   17.2 Figure 2. Sequence of events in coupled model
18 Tables
   18.1 Table 1. Coupling CPP keys in the UM
   18.2 Table 2. NEMO-CICE Ocean to atmosphere coupling fields
   18.3 Table 3. Atmosphere to ocean coupling fields
19 References


1 Introduction

This paper is an overview of the method used to couple atmosphere and ocean models involving the Unified
Model system. Descriptions are confined to technical aspects of the software components; no scientific issues or
rationale for the adopted coupling technique are given here.

A requirement of the Unified Model system is that the atmosphere model can run independently, or in a parallel
coupled context. For independent integrations, state variables such as sea-surface temperature in the atmo-
sphere model are supplied as ancillary fields derived from external files. In a coupled model context, these
constitute coupling fields which are supplied from the ocean model integration and can be derived from either
prognostic or diagnostic variables.

Coupling to external ocean models is achieved using the OASIS family of couplers. At UM9.1 this covers the
OASIS3 pseudo-parallel and mono-process coupling approaches and the OASIS3-MCT (vn2.0) parallel cou-
pler. Coupling requires the concurrent running of separate atmosphere and NEMO-CICE executables and, in
the case of OASIS3, a separate OASIS3 executable. OASIS3-MCT does not require its own executable since all
the coupling operations are performed in the interface library routines attached to component models. Although
the UM is in effect simply an atmosphere-only model, coupled models make use of the UM control systems
(Rose user interface, UM compilation system, UM job submission and control) to control the running of the
model. The atmosphere and NEMO-CICE components exchange coupling data via the intermediate OASIS3
transformer or OASIS3-MCT coupling interface.

Whereas prior to UM7.0 a coupled model consisted of code from a single repository, modern UM coupled models
potentially consist of code from several separate repositories: the UM atmosphere and its associated compilation
and job control functionality, the JULES land surface scheme, the FLAKE lake model, the OASIS coupler, the
NEMO ocean model and the CICE sea-ice model.

The coupling interface supports the grid types of both the atmosphere New Dynamics and ENDGAME dynamical
cores.

Although OASIS3-MCT is now available for use in the UM, it is still relatively new and OASIS3 is still the coupler
used most commonly. OASIS3-MCT is expected to be adopted more widely for very high resolution models or
for models which exchange large amounts of coupling data as the potential for standard OASIS3 to become a
bottleneck increases.

The UM IO Server code may be run in conjunction with the OASIS3 coupler, with certain limitations: IO server
processes must be defined in a single contiguous range of PEs after all the atmosphere processes have been
defined. Since the IO server processes would optimally be distributed amongst the atmosphere PEs, this
application is of limited, if any, use. At UM8.5, support was introduced to allow the UM IO server code to run in
conjunction with the OASIS3-MCT coupler.

Support is provided in UM to couple to either NEMO 3.2 (and in theory also 3.0 although this is no longer rou-
tinely tested or supported) or NEMO 3.4 (and in theory also 3.3.1 although this version has never been used
operationally).

From UM version 9.1 onwards, if using OASIS3-MCT as the coupler, then version 2.0 onwards must be
employed, to cater for the option of exchanging gradient terms in 2nd-order conservative regridding.

2 Component Models

The UM atmosphere, JULES, OASIS and NEMO/CICE ocean/sea-ice code reside in completely separate repos-
itories. NEMO, CICE and OASIS3 development and release cycles are in no way connected to UM (or JULES
or FLAKE) release cycles, or indeed to each other. The common factors are that they may all use the same
coupler and that they may all be controlled in terms of compilation and running through the UM Rose system.

3 Couplers and NEMO-CICE

The coupler libraries and, in the case of OASIS3, the driver executable, must be built and ready for use before
the UM Rose suite is submitted; i.e. the coupler code is not compiled as part of the UM build and submission
system.

Ordinarily, there should be no need for users to build the coupler libraries or executables, since centrally built
versions, compatible with each UM release, are available. Rose allows the user to specify the location of the
particular coupler build which is to be employed by means of the “prism_path” setting in the component “Include
and library path”.

NEMO-CICE component configuration details are specified in Rose entries, along with any driving namelists
and netCDF input files.

4 Compilation

Coupling code in the UM atmosphere is controlled by run-time logicals (e.g. L_OASIS) or protected by CPP keys:

#if defined(OASIS3)
#endif

#if defined(MCT)
#endif

Other compile (and run-time) switches within coupling routines are given in Table 1.
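
As a minimal sketch of how the two kinds of switch combine (with a hypothetical diagnostic message standing in for real coupling work), a coupling-specific section typically looks like:

#if defined(OASIS3)
      IF (l_oasis) THEN
        ! Only compiled into coupled builds, and only executed when
        ! coupling is enabled at run time via the L_OASIS logical.
        WRITE(6,*) 'Performing coupling exchange'
      END IF
#endif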

The compilation requires the OASIS and netCDF libraries (and, for the UM atmosphere, the GCOM libraries). In
addition to specifying the location of the coupler libraries under “prism_path”, the netCDF library location
“netcdf_lib_path” and include file location “netcdf_inc_path” should be set to employ netCDF version 3.6.0-p1 (or
some other suitable build of netCDF). These items are, for the UM, set in:

fcm_make_um => env => Include and library paths for the build

(e.g. when running on the Met Office IBM system, using /opt/netcdf/netcdf-3.6.0-p1_ec/lib and
/opt/netcdf/netcdf-3.6.0-p1_ec/include).

In addition to the UM atmosphere component executable, compilation of the separate NEMO-CICE executable
is controlled through the Rose suite. The equivalent “prism_path”, “netcdf_lib_path” and “netcdf_inc_path” settings
should be specified for NEMO-CICE in the same way in the Rose entries:

fcm_make_ocean => env => Include and library paths for the build

It is important that these match the settings used by the UM atmosphere.


5 Data structures

State variables for a coupled model are held in the native file format for each component model. i.e. UM format
model dumps for the atmosphere component, netCDF restart files for the NEMO component and binary restart
files for the CICE data. Refer to NEMO and CICE source code and documentation at the relevant release for
details of restart file structure and contents (see references for web addresses).

The atmosphere dumps define primary initial fields for the start of the integration, consisting of prognostic and
ancillary fields. Forecast dumps after integration are written out periodically with the same format, except that
accumulated, mean and other diagnostic fields are written at the end of each file if required to allow restarts to
be initiated from interim dumps. The complete set of fields is accompanied by a description of each field, held
in lookup tables.

Data from the initial atmosphere dump is read sequentially as a series of horizontal fields into a large array - the
D1 array.

The D1 array in the atmosphere model contains primary and diagnostic fields from the corresponding dump, but
also secondary space for intermediate fields, which can include coupling variables. A complete set of pointers
to each field within D1 is calculated by the STASH addressing routines. Access to coupling fields is obtained by
finding the appropriate pointers to D1. Coupling fields derived from prognostic variables are simply accessed
from the primary pointers calculated in routine SET_ATM_POINTERS. Typically these are incoming fields, that is,
coupling fields sent to the atmosphere (via the coupler) from the NEMO-CICE ocean. For diagnostic variables,
access to coupling fields is achieved through searching a STASH array (STLIST) for locations of the correct
fields. Typically these are outgoing fields, that is, coupling fields sent from the atmosphere (via the coupler) to
the NEMO-CICE ocean or elsewhere. These fields are usually introduced into the job by the addition of what in
UMUI terms would have been referred to as a STASH macro for the coupling fields. Under Rose, this is simply
achieved by the addition of extra STASH items and appropriate domains to any existing list of diagnostic fields. A
STASH tag (10 for Atmosphere->Ocean) ensures unique identification and distinction of “coupling diagnostics”
from “standard diagnostics” [see UMDP-C04: STASH]. Note that “coupling diagnostics” are effectively prognostic
fields since they hold values which will be used to drive ocean and other components, which in turn
will feed back to the UM atmosphere.
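
The schematic routine below illustrates the tag-based STASH search described above. It is a sketch only: the real UM performs this search in FINDPTR, and the column positions assumed here for the STASH list entries are hypothetical, not the actual UM declarations.

SUBROUTINE find_coupling_field(stlist, len_stlist, totitems, isec, item, d1_addr)
  ! Schematic only: scan the STASH list for a diagnostic carrying the
  ! Atmosphere->Ocean coupling tag and return its D1 start address.
  IMPLICIT NONE
  INTEGER, INTENT(IN)  :: len_stlist, totitems
  INTEGER, INTENT(IN)  :: stlist(len_stlist, totitems)
  INTEGER, INTENT(IN)  :: isec, item      ! STASH section and item sought
  INTEGER, INTENT(OUT) :: d1_addr         ! -1 if the field is not found
  ! Column positions within a STASH list entry: assumed for illustration.
  INTEGER, PARAMETER :: st_macrotag = 1, st_sect_no = 2
  INTEGER, PARAMETER :: st_item_no = 3, st_output_addr = 4
  INTEGER, PARAMETER :: tag_a2o = 10      ! tag marking Atmosphere->Ocean fields
  INTEGER :: i
  d1_addr = -1
  DO i = 1, totitems
    IF (stlist(st_macrotag, i) == tag_a2o .AND. &
        stlist(st_sect_no, i) == isec .AND.     &
        stlist(st_item_no, i) == item) THEN
      d1_addr = stlist(st_output_addr, i)
      EXIT
    END IF
  END DO
END SUBROUTINE find_coupling_field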

It is also important to note that alternative coupling fields, representing ostensibly the same physical quantity,
can be employed, dependent on the atmosphere model’s scientific configuration. This means that the source
or destination of a particular coupling field can be one of several configuration dependent fields or areas of D1.
For instance, the surface wind stresses sent by the atmosphere to the coupler are completely different STASH
fields, dependent on whether coastal tiling is switched on or off.

All atmosphere fields involved in coupling are defined on the full horizontal domain, i.e. they are not compressed,
reduced in precision or packed in any way.

6 Coupling fields

6.1 Ocean to Atmosphere

A typical list of NEMO-CICE ocean to UM atmosphere coupling fields is given in Table 2. Fields used in coupling
tend to be prognostics or compound fields specially calculated for coupling in the NEMO-CICE model. Equiva-
lent variables in the atmosphere model component are primary variables. There is a one to one correspondence
between fields sent by the ocean and fields received by the atmosphere.

In global models, special processing is required by the atmosphere for incoming fields defined on the North
polar singularity. Under New Dynamics, all values at T points (i.e. scalar values) of each incoming coupling field
are explicitly set to a single mean value on the North polar row to avoid numerical problems. Additionally, fields
defined on U points (typically only surface currents), which are therefore also at the North polar row, are derived
from V values half a grid cell to the South. This is because OASIS does not handle regridding of values onto
the polar row well when left to its own devices.
Under ENDGAME, only fields on V points are defined at the polar singularity. Hence incoming surface currents
are derived from U values half a grid cell to the South. These polar row adjustments are not necessary in the
Southern hemisphere of global models, since no coupling data is exchanged over Antarctica (or, perhaps more
accurately, any coupling data exchanged over land will be ignored).
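
A minimal sketch of the New Dynamics T-point polar treatment described above follows; which array index holds the North polar row is configuration dependent, so placing it at index rows here is an assumption.

SUBROUTINE polar_row_mean(field, row_length, rows)
  ! Replace every T-point value on the North polar row with the row
  ! mean, so the singularity carries one consistent value.
  IMPLICIT NONE
  INTEGER, INTENT(IN) :: row_length, rows
  REAL, INTENT(INOUT) :: field(row_length, rows)
  REAL :: polar_mean
  polar_mean = SUM(field(:, rows)) / REAL(row_length)  ! mean of the polar row
  field(:, rows) = polar_mean
END SUBROUTINE polar_row_mean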

The number of fields involved in coupling, the names of fields and the way each field corresponds to internal
model fields is controllable by means of specially annotated namcouple files. See “What happens at run time”.

6.2 Atmosphere to Ocean

A typical list of atmosphere to ocean coupling fields is given in Table 3. In contrast to the above, most outgoing
coupling fields are derived from atmosphere submodel diagnostics processed through the STASH meaning system.
Certain fields are used in combination to derive compound fields which are then used as output coupling fields.
For instance, moisture and heat fluxes are calculated based on several separate component fields. Hence there
is not always a direct one-to-one correspondence between UM diagnostic or prognostic fields and the fields
received by NEMO-CICE.

7 Coupling transformations

Coupling transformations are defined, when using OASIS3 or OASIS3-MCT, by the contents of the “namcouple”
file(s). When employing the pseudo-parallel OASIS3 case, there must be a namcouple file present for each
instance of OASIS3, indexed by processor rank; e.g. if using four OASIS3 processes, coupling fields would
need to be distributed between four files, typically named namcouple_0, namcouple_1, namcouple_2 and
namcouple_3. These files instruct OASIS3 which fields to exchange, the frequency of exchanges, the type of
transformations, etc. The file(s) must be created prior to the model run. It is important that the file contents match
the fields and operations defined in the atmosphere and NEMO component models. From UM 8.3 onwards the
UM is able to interrogate these namcouple files at run time in order to ascertain and define which fields need to
be present in the run.
For atmosphere-ocean models, coupling only occurs at the ocean surface, therefore only 2-D surface fields are
involved.

Transformations are required to:


• Provide horizontal interpolation from the atmosphere to ocean grid and vice versa.
• Mask land points from the coupling with coastal adjustment of fields if needed.
• Combine multiple diagnostics into a single field.
• Change units (e.g. sign changes in precipitation, conversion from Kelvin to °C); a sketch follows this list.
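
The illustrative routine below sketches the final item; which fields need converting, and the sign convention, are configuration dependent, and the names here are hypothetical.

SUBROUTINE convert_for_coupling(sst, precip, n)
  ! Local unit changes applied before fields are handed to the coupler.
  IMPLICIT NONE
  INTEGER, INTENT(IN) :: n
  REAL, INTENT(INOUT) :: sst(n)     ! sea surface temperature
  REAL, INTENT(INOUT) :: precip(n)  ! precipitation rate
  REAL, PARAMETER :: zerodegc = 273.15
  sst    = sst - zerodegc           ! Kelvin to degrees Celsius
  precip = -precip                  ! reverse the precipitation sign convention
END SUBROUTINE convert_for_coupling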

8 MPP Issues

8.1 Data Structure

Each processor has a D1 array for the limited area of the domain dedicated to that processor. When using
OASIS3 it is possible to switch between performing atmosphere coupling through a single master processor or
directly through each sub-domain processor. There is no consistently clear evidence that either approach
is superior in all configurations from a performance point of view. When using OASIS3-MCT, coupling may only
be done in parallel, through each sub-domain. There are no limitations on the MPP decompositions which may
be used for any of the components employed in coupling; i.e. components may employ a 1xM, Nx1, or MxN
decomposition.

Note that there is no requirement for the CICE and NEMO components to use the same decomposition configuration
(i.e. it is perfectly possible for NEMO to use a 2x3 decomposition while CICE uses a 6x1 decomposition),
but for large decompositions most configurations will choose to use the same decomposition for NEMO and
CICE to avoid internal coupling bottlenecks.

The UM currently supports the facility to run with NEMO versions 3.2, 3.3.1 and 3.4. Versions 3.2 and 3.4 have
been tested and used extensively. Version 3.3.1 has received little attention in the context of coupled model
testing and most operational models have jumped or will jump straight from vn3.2 to using vn3.4.

8.2 Coupling Transformations

OASIS3 effectively performs coupling transformations on a single CPU. If data is sent to it in parallel from sub-
domains of the atmosphere, this simply implies that OASIS3 has to perform an internal gather operation before
regridding and sending, potentially scattering, data to the ocean. In fact, we generally choose to gather data
explicitly in the atmosphere component prior to sending to OASIS3. The net overhead of this is minimal, but
it facilitates debugging since entire global fields are available for analysis prior to sending. Similarly, when re-
ceiving incoming coupling data, the atmosphere usually deals with global fields and explicitly scatters each field
across the appropriate sub-domains.
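
The fragment below sketches this explicit master-PE gather (the L_COUPLE_MASTER style of coupling from Table 1), assuming the OASIS3 PSMILe send routine prism_put_proto, equal-sized subdomains and a simple 1-D field layout; it is illustrative, not the UM's actual gather/send code.

SUBROUTINE master_put(local_field, local_size, n_global, my_rank, &
                      local_comm, var_id, msec)
  ! Gather a decomposed field onto rank 0 of the communicator
  ! prescribed by the coupler, then send one full global field.
  USE mod_prism_put_proto   ! OASIS3 PSMILe send interface
  USE mpi
  IMPLICIT NONE
  INTEGER, INTENT(IN) :: local_size, n_global, my_rank, local_comm
  INTEGER, INTENT(IN) :: var_id     ! from an earlier prism_def_var_proto
  INTEGER, INTENT(IN) :: msec       ! model time in seconds
  REAL, INTENT(IN)    :: local_field(local_size)
  REAL :: global_field(n_global)
  INTEGER :: ierror
  CALL MPI_Gather(local_field, local_size, MPI_REAL,  &
                  global_field, local_size, MPI_REAL, &
                  0, local_comm, ierror)
  IF (my_rank == 0) CALL prism_put_proto(var_id, msec, global_field, ierror)
END SUBROUTINE master_put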

The use of multiple OASIS3 instances is supported when using a pseudo-parallel OASIS3 build. In this case,
each coupling field is assigned to a particular OASIS3 process in order to spread the cost of coupling. Hence
the theoretical maximum number of OASIS3 instances which could be employed is equal to the total number
of coupling fields. Even when using a single OASIS3 instance, the pseudo-parallel build of OASIS3 is usually
employed.

When using OASIS3-MCT, there is no implied (or explicit) need to gather or scatter data to or from the global
domains.

8.3 MPI Issues

OASIS3 and OASIS3-MCT effectively manage the coupled model components in terms of defining local MPI
communicators for use by each of the components. It is important therefore that no component (or library used
by a component) should directly refer to the MPI_COMM_WORLD communicator in its own internal communications.
Each component must limit its operations to the communicator prescribed by the coupler in order to avoid
deadlock conditions. This important and easily overlooked point needs to be given due consideration when
importing new functionality into the coupled model components.
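
A minimal sketch of the correct start-up pattern, assuming the OASIS3 PSMILe interface ('toyatm' is a hypothetical component name):

PROGRAM component_init
  USE mod_prism_proto       ! OASIS3 PSMILe initialisation interface
  IMPLICIT NONE
  INTEGER :: comp_id, local_comm, ierror
  CALL prism_init_comp_proto(comp_id, 'toyatm', ierror)
  ! Retrieve the component-local communicator prescribed by the coupler.
  CALL prism_get_localcomm_proto(local_comm, ierror)
  ! ... every internal MPI call now uses local_comm, never
  ! MPI_COMM_WORLD ...
  CALL prism_terminate_proto(ierror)
END PROGRAM component_init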

Both couplers are only ever built using MPI1. OASIS3-MCT only supports MPI1 functionality and the
pseudo-parallel functionality of OASIS3 is only compatible with MPI1.

9 Code structure

Routines relevant to atmosphere-ocean coupling are shown in Figure 1.


Coupler interfacing initialisation and shutdown is called from UM_SHELL. OASIS_INITA2O, called via
OASIS_INITIALISE from INITIAL, sets up the STASH pointers to the diagnostics which will be used for outgoing
atmosphere coupling fields. Pointers for diagnostic fields are found using the generic FINDPTR routine to locate
STASH tags. OASIS_INITA2O also allocates arrays used to hold coupling data during the model run.

OASIS_READ_TRANSLIST, called from OASIS_INITIALISE, reads a dynamically created namelist in order to
identify which coupling fields are required by the atmosphere component and how those fields relate to prognostic
and diagnostic model fields. This process, along with OASIS_PROCESS_TRANSLIST and
OASIS_POINT_TRANSLIST, also identifies the grid (U, V or T) on which each coupling field is defined, whether polar meaning
is required on a particular input field (based on grid type) and whether it is an input or output field.
OASIS3_GRID sets up the atmosphere grid definitions for OASIS3 or OASIS3-MCT. Separate grids need to be
defined for T, U and V points. It also defines the OASIS transient variables which are to be used to hold the sent
and received coupling data.

Inside the main atmosphere time step loop, the first thing that happens before ATM_STEP is called is that
coupling takes place, via OASIS3_GETO2A and OASIS3_PUTA2O. This means that the incoming NEMO-CICE
ocean data must be sufficient to initialise all the necessary driving fields for calculations to be performed on that
time step. The implication of this is that the NEMO-CICE component must start up with all the necessary data to
send to the atmosphere before it has performed any time stepping. Similarly, the atmosphere component must
start up with all necessary data to send to the NEMO-CICE ocean before it has performed any time stepping.

OASIS_UPDATECPL copies coupling fields to prognostics prior to any potential dump creation. This ensures
dumps will contain the necessary data to restart and complete the first coupling exchange as detailed above.

OASIS_TIDY is performed at the end of the run to deallocate the arrays which were originally allocated in
OASIS_INITA2O.


10 Atmosphere Grid Considerations

The UM atmosphere currently uses an Arakawa ‘C’ style grid. For a global grid, using the New Dynamics
dynamical core, these points are arranged as follows:
latitude

t u t .. u t +90.0

v v ..

t u t .. u t

.. .. .. .. .. ..

t u t .. u t 0.

.. .. .. .. .. ..

t u t .. u t

v v .. v

t u t .. u t -90.0

longitude 0. 360.

t=p,rho,theta,[w]


For a global grid, using the ENDGAME dynamical core, grid points are arranged as follows:
latitude

v v v +90.0

u t u t .. t u

v v .. v

.. .. .. .. .. .. ..

v v v 0.

.. .. .. .. .. .. ..

v v .. v

u t u t .. t u

v v v -90.0

longitude 0. 360.

t=p,rho,theta,[w]

Fields defined on u, v and t points must be defined on different grids as far as OASIS is concerned, with sepa-
rate land-sea masks and areas where appropriate.

Coupling fields, when sent to OASIS, do not have any MPP halos. Although it is possible for OASIS to cater for
the eventuality where halos are present, it seems highly unlikely that a need for this will ever arise.

11 Multiple-category ice thickness distribution (ITD) scheme

Coupling to NEMO-CICE currently assumes that multi-category ice fields are always involved. Further, it as-
sumes five ice categories in all the relevant fields. Much of the coupling code is hard coded to assume this
- partly because of the need to define hard coded field names in OASIS control files. The UMUI atmosphere
allows the number of categories (NICE) to be set in the Sea Surface panel under the Boundary Layer and
Surface Processes options. Currently, this NICE value must be set to 5 in coupled models. If the ITD scheme
is not used, NICE defaults to 1 by definition. This NICE value is also passed as a CPP key (NICECAT) to the
NEMO-CICE component compilation in an attempt to ensure that the atmosphere and NEMO-CICE settings
are consistent. However, the fact that CICE uses static allocation means that consistency can only be achieved
by recompiling the NEMO-CICE component whenever the NICE value changes (in normal circumstances,
NICE never needs to be changed).
Developments are planned to allow for limited area coupled models which do not assume the presence of ice
model fields.


12 What happens at run time

There is a critical stage of pre-processing performed by the UM control scripts before the Fortran executables
can run. As well as performing general tidying up of any old files or directories left over from a previous run of
the same job, the script OASIS_fields interrogates the namcouple file(s) in order to compose a list of transient
fields which will be involved in coupling. The details of these fields are written to a dynamically created namelist
file which is read at run time by the Fortran executable(s) and used to define which fields are needed by the
coupling, the order in which they should be exchanged and the internal UM (or other component) field to which they relate.

In order for this to work correctly, all namcouple files must be annotated with special information. The line
preceding the definition of each transient in the $STRINGS section of the namcouple file must contain notation in
keeping with the following:

# TRANSDEF: srcX trgY OO RR M ###


• # TRANSDEF: is just a standard string for identifying the special line. To OASIS, this line appears as a
comment and is ignored.
• src indicates the source component (OCN or ATM)
• X indicates the source grid type (T, U or V)
• trg indicates the target component (ATM or OCN)
• Y indicates the target grid type (T, U or V)
• OO indicates the order of field exchange - i.e. “16” appearing here would indicate that this would be the
16th field to be exchanged. This value must be unique. A negative value (e.g. -16) switches this field off,
excluding it from coupling definitions and exchanges. If this value is positive, it must have a unique value
within the full set of transients used by any of the namcouple files for a particular model.
• RR is an integer indicating the field reference with which the component models will link the transient field
to a particular internal model field. RR must have a unique value within the full set of active transients
used by any of the namcouple files for a particular model.
• M is an optional additional integer indicating whether 2nd order conservative coupling is to be employed
for this particular field when using the OASIS3-MCT coupler. If the OASIS3-MCT coupler is not being
used, then any value set here has no effect. If the OASIS3-MCT coupler is being used, then a value of “2”
indicates that 2nd order coupling is required and the UM will generate gradient terms which are employed
in the coupling exchanges. Any other value, or a complete absence of this value when OASIS3-MCT is
used, indicates that 2nd order terms will NOT feature in the coupling exchanges.
For example:
# TRANSDEF: ATMT OCNT 1 71
heatflux HEATFLUX 5 10800 2 atmos_restart.nc EXPOUT
atm3 tor1 SEQ=+2
P 0 P 2
LOCTRANS SCRIPR
INSTANT
CONSERV LR SCALAR LATLON 50 DESTAREA FIRST
This indicates that the field named “heatflux” has its source in the atmosphere (src=“ATM”) on the T grid (X=“T”).
It is sent to a field named “HEATFLUX” whose target is the ocean (trg=“OCN”) on the T grid (Y=“T”). The field
is active (because OO has a positive value) and is the first to be exchanged (OO=“1”). It has a field reference
of 71 which is used internally by the component models to link this transient with a particular field. There is
no “M” value set here, indicating that no 2nd order gradient terms will be calculated or exchanged even if the
OASIS3-MCT coupler is employed.

Note that the number of transient fields given by the $NFIELDS value in each namcouple file should include
all transient fields listed in that namcouple file regardless of whether each field is switched off by virtue of a
negative order (OO) value.


At run time, the separate executables - OASIS3 driver/coupler (if using OASIS3), UM atmosphere and NEMO-
CICE ocean - are started simultaneously. See Figure 2 for a schematic of the sequence of events and the
interactions between the executables of the coupled model run. On IBM powerN systems, poe is used with
“-pgmmodel mpmd” to indicate a multiple program multiple data paradigm and “-cmdfile” to specify the instances
of each component to be executed. On NEC SX systems, mpiexec is used. (Different commands may apply on
different platforms.) The number of PEs for each component is specified at this stage. (OASIS3 theoretically
allows us to spawn processes but we do not use this since it offers no advantages in normal batch scheduling
situations.) Refer to the OASIS3/OASIS3-MCT, NEMO and CICE user guides for details of the output files
expected. In particular, note that the couplers have the ability to generate netCDF files containing copies of
fields exchanged in the coupling process. These are very useful for debugging work, but they are large and are not
recommended for use in operational or long runs since, aside from consuming large volumes of disk space,
their production will slow the model down considerably.

Note also that for each coupled remapping transformation (e.g. Atmos T points => NEMO T points, NEMO U
points => Atmos U points, etc.) a separate remapping weights file is required. If these are not present at the start
of the run, then OASIS3/OASIS3-MCT will attempt to generate them as and when it first needs them. Allowing
the couplers to do this at run time is inadvisable. The process can be slow, the time taken being dependent on
the model resolutions and grid types involved. Additionally, the results are often less than satisfactory due to
the difficulty the underlying SCRIP process has when dealing with the NEMO tri-polar grid. For both these reasons,
it is usual practice to generate remapping weights files off-line, by hand, prior to a run. This ensures accurate
results and reduces run time considerably.

The location of pre-generated weights files may be specified in the Rose user interface, which ensures they are
linked to the appropriate UM job directory at run time. It is assumed that these files always conform to the rmp*
naming convention, although with OASIS3-MCT it is possible to ascribe any name to a weights file. When using
the OASIS3-MCT “MAPPING” option, the weights file names must be specified in the namcouple and these files
must exist prior to the start of the run.


13 Couplers and the UM IO Server

The OASIS3 and OASIS3-MCT couplers may be employed in conjunction with the UM IO server (IOS) code.
However there are certain technical features and limitations which are worth noting, particularly when develop-
ing or modifying code associated with either the IOS or the couplers.

13.1 OASIS3 and the UM IO Server

OASIS3 demands that all component processes be arranged contiguously. One cannot normally interleave
atmosphere model processes with ocean model processes. When running with the IOS, one would usually
wish to position IOS processes at regular intervals throughout the atmosphere processes (e.g. one on each
atmosphere node) for optimal performance. However, this is not possible when running with OASIS3. In this
case all IOS processes must appear after the atmosphere processes; i.e. a valid process configuration would
be:

Global rank   Component name
 0            OASIS3
 1            OASIS3
 2            ATMOS
 3            ATMOS
 4            ATMOS
 5            ATMOS
 6            ATMOS
 7            IOS
 8            IOS
 9            IOS
10            IOS
11            OCEAN
12            OCEAN
13            OCEAN
14            OCEAN
The following type of arrangement is NOT possible with OASIS3:

Global rank   Component name
 0            OASIS3
 1            OASIS3
 2            ATMOS
 3            ATMOS
 4            IOS
 5            ATMOS
 6            ATMOS
 7            IOS
 8            ATMOS
 9            ATMOS
10            IOS
11            OCEAN
12            OCEAN
13            OCEAN
14            OCEAN
There is no requirement for the IOS processes to issue calls to prism_def_partition, prism_def_var, prism_enddef
or puts and gets, since they are not involved in any coupling operations other than OASIS3 startup and
termination. One must take care, however, to ensure that the namcouple contents under $CHANNEL are correctly
set in order to define the number of processes involved in coupling operations as distinct from the total number
of processes on which the entire UM (atmosphere+IOS) is running.


13.2 OASIS3-MCT and the UM IO Server

When employing OASIS3-MCT, it is possible to position IOS processes either at regular intervals among the
atmosphere processes or, as in the OASIS3 case above, at the end of the atmosphere processes. (However, it
is not currently possible to position atmosphere IOS processes among the ocean processes.)

OASIS3-MCT, however, demands that prism_def_partition, prism_def_var and prism_enddef calls be performed
collectively. That is, they must be called on all processes, even those which are not involved in coupling. This is
achieved by creating alternative paths through the routine OASIS3_GRID: one path is taken by those processes
which are involved in coupling, and one path is taken by processes which are not involved in coupling, merely to
facilitate the null calls to prism_def_partition, prism_def_var and prism_enddef. OASIS3-MCT has to be informed
of the appropriate communicator to use in coupling via prism_create_couplcomm (puts and gets are not collective,
so non-coupling PEs do not call them). See the OASIS3-MCT user guide for more details; a sketch of this
two-path start-up follows.
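
The sketch below reflects our reading of the OASIS3-MCT 2.0 user guide and should be checked against the release in use; the partition, field name and shape arguments are placeholders.

SUBROUTINE define_coupling(is_coupling_pe, local_comm, ig_paral, var_shape)
  ! Two-path start-up: only coupling PEs join the coupling communicator,
  ! but the definition calls are made collectively on all PEs.
  USE mod_prism             ! OASIS3-MCT prism-compatible interface
  USE mpi
  IMPLICIT NONE
  LOGICAL, INTENT(IN) :: is_coupling_pe
  INTEGER, INTENT(IN) :: local_comm
  INTEGER, INTENT(IN) :: ig_paral(:)      ! partition description (placeholder)
  INTEGER, INTENT(IN) :: var_shape(:)     ! field bounds (placeholder)
  INTEGER :: icpl, couplcomm, part_id, var_id, ierror
  INTEGER :: var_nodims(2)
  ! Assumed convention: coupling PEs pass 1, others MPI_UNDEFINED.
  icpl = MPI_UNDEFINED
  IF (is_coupling_pe) icpl = 1
  CALL prism_create_couplcomm(icpl, local_comm, couplcomm, ierror)
  ! Collective on ALL processes, coupling or not:
  CALL prism_def_partition(part_id, ig_paral, ierror)
  var_nodims = (/ 2, 1 /)                 ! field rank; bundle size
  CALL prism_def_var(var_id, 'heatflux', part_id, var_nodims, &
                     PRISM_Out, var_shape, PRISM_Real, ierror)
  CALL prism_enddef(ierror)
END SUBROUTINE define_coupling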

14 Date handling and calendar considerations

Coupled climate models generally employ a 360-day calendar in order to define each month and season with a
standard length.

The coupled model may also be run using the Gregorian calendar, as is typically done in seasonal forecast
models.
The run length specified in the UM Rose interface is written directly to the UM namelist (&NLSTCALL). The
NEMO and CICE namelists are edited by the scripts qsnemosetup and/or qscicesetup which apply the run
length using specially defined environment variables.

These scripts use UMScr_Time2Days, UMScr_Days2Time and UMScr_CalcIncrementDays to calculate the run
length in the current calendar (360-day, 365-day or Gregorian). These scripts are based on the UM version 7.9
FORTRAN routines time2sec.F90 and sec2time.F90. They use units of days to avoid integer overflows in the
shell script calculations (a signed 32-bit count of seconds overflows after about 68 years).

When converting a run from a 360-day calendar to a Gregorian calendar there are a few things which must be
considered:
• Ancillaries are generated for either a 360-day or a Gregorian calendar; the UM will fail with an error if the
calendar in the job and the ancillary do not match. The header of the ancillary can be changed to designate
a new calendar, however the time series of the data may then not be valid (e.g. a 360-day ancillary may
include the 30th of February).
• The UM must be set to daily dumping to allow climate meaning to function correctly.
NEMO is not able to calculate Gregorian calendar monthly means. These must be calculated post-model-run
from daily data. (The exception to this is when a run starts on the first day of a month and resubmits every
month, in which case it should in theory be possible, although this is a largely unproven area of functionality.)

CICE can calculate Gregorian (or other) calendar monthly means, provided there is no mid-month restart in the
run.

15 Climate meaning and restarts

Standard restart behaviour is for the atmosphere model always to try to restart from the most recent 10-day
dump, even when longer-period climate means are active. For the standard NEMO set-up, which only allows
ocean climate meaning on the resubmission period, this potentially causes problems for jobs running with
1-month resubmission (because the appropriate monthly means won’t be correctly written in the case of a
mid-month restart). To mitigate against this, the qsnemosetup script backs up the atmosphere restart dump and
history files at the start of each month. Attempting to restart part way through a month will generate an error
and suggest these back-up files are used to restart the run instead.

Functionality has been provided in NEMO which, provided an appropriate NEMO branch (for example the
commonly used “vn3.2_separate_mean” branch) is applied, will allow NEMO climate meaning in periods which differ
from the resubmission period. This is activated by setting the “NEMO_SEPARATE_MEAN” entry in the coupled
model “Environment variable configuration” entries. The user should also make sure both NEMO and CICE are set
to write restart dumps every 10 days, with diagnostics written as 10-day means. It is then possible to restart
correctly regardless of resubmission period (assuming a multiple of 10 days), and in this case qsnemosetup no
longer backs up atmosphere dumps and history files at the start of each month as these should not be required.

16 Error Handling in Coupled Components

Independently owned and developed component model codes are at liberty, and likely, to employ their own
methods of error handling. These may, for instance, involve the use of MPI_ABORT, a simple FORTRAN STOP
statement or, in the case of the UM, GC_ABORT.

However, having typically been developed with a view to running each component in isolation, these mech-
anisms often pay little regard to the fact that there may be other components running concurrently within an
MPMD configuration. These error handling mechanisms may merely deal with terminating processes owned
by that particular component. Under such circumstances, other components will remain effectively deadlocked
until they run out of time. This can then lead to confusion about the cause of model failure, with the apparent
reason being reported superficially as a lack of CPU time when in fact the real cause is something more tangible.

So in order to shut down coupled configurations cleanly when an error condition is detected, and to avoid the
unnecessary consumption of resources by deadlocked components, it is necessary to ensure that each compo-
nent is capable of shutting down all processes belonging to all components.

The OASIS3 and OASIS3-MCT coupler PSMILes provide routines called prism_abort_proto and oasis_abort
which may be used by any component to voluntarily abort all components at the same time. Essentially these
calls act as a wrapper for MPI_ABORT to terminate all processes defined by the MPI_COMM_WORLD
communicator. (See the OASIS3 or OASIS3-MCT user guide for details.)

In practice, the UM has no need for these as it is able to shut down all components via the GCOM call
GC_ABORT, provided GCOM version 4.6 or later is employed. This operation acts on MPI_COMM_WORLD, not just
the local internal UM communicator.

NEMO base code, at least up until version 3.6, simply performs MPI_FINALIZE or even just a FORTRAN STOP,
both of which are undesirable as they will simply leave other concurrent components deadlocked. For preference,
in coupled model systems, NEMO should employ suitable code modifications to employ prism_abort_proto
(applicable to OASIS3 and OASIS3-MCT) or oasis_abort (applicable to OASIS3-MCT only).

CICE base code tends to employ MPI_ABORT with the MPI_COMM_WORLD communicator via the ice_exit
routine and as such would normally be sufficient to shut down all components cleanly. However, as with NEMO,
there are instances of the use of “STOP” in the CICE code which should be replaced with calls to ice_exit to
ensure a clean and swift shut down.

Typically, it is intended that suitable code modifications will be provided via standard configuration branches to
ensure that NEMO and CICE shut down all components cleanly and simultaneously if necessary.
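
As a sketch of the recommended pattern, the wrapper below routes a fatal error through the coupler abort call named above, so that all components terminate together rather than deadlocking; under OASIS3-MCT, oasis_abort takes the same three arguments.

SUBROUTINE coupled_abort(comp_id, routine, message)
  ! Abort EVERY component in the MPMD run, not just the caller.
  USE mod_prism_proto       ! OASIS3 PSMILe; provides prism_abort_proto
  IMPLICIT NONE
  INTEGER, INTENT(IN)          :: comp_id   ! from prism_init_comp_proto
  CHARACTER(LEN=*), INTENT(IN) :: routine, message
  ! A bare STOP or MPI_FINALIZE here would leave the other components
  ! deadlocked until they ran out of wallclock time.
  CALL prism_abort_proto(comp_id, routine, message)
END SUBROUTINE coupled_abort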


It is recommended that each new component introduced to an MPMD coupled system be checked and modified
accordingly to ensure that if an error condition is encountered and the component wishes to stop, then it is able
to halt all other components at the same time.


17 Figures

17.1 Figure 1. Top Level Atmos - NEMO-CICE Ocean coupling code

um_main
|
- UM_SHELL (initialise addresses, pointers)
|
|- OASIS_INITIALISE
| |
| - OASIS3_ATMOS_INIT (Initialise)
| |
| - OASIS_READ_TRANSLIST (Read namelist of required coupling fields)
|
|- OASIS3_GRID (Synchronisation call for IO server processors only)
|
|- U_MODEL
| |
| |- INITIAL
| | |- INITDUMP (Read dump)
| | - OASIS_INITIALISE_2
| | |
| | |- OASIS3_GRID (Main call for non IO server processors only.
| | | Set up grid and coupling field definitions)
| | - OASIS_INITA2O (Initialise coupling field pointers
| | | and allocate arrays for coupling fields)
| | |
| | |- OASIS_PROCESS_TRANSLIST (Set polar meaning flags
| | | field by field.)
| | |
| | - OASIS_POINT_TRANSLIST (Set up pointers from generic
| | coupling field arrays to specific
| | local coupling fields.)
| |<=== Main timestep loop begins
| |
| |- OASIS3_GETO2A (Receive incoming fields from coupler)
| |- OASIS3_PUTA2O (Send outgoing fields to coupler)
| |- OASIS3_ADVANCE_DATE (Increment coupler date/time)
| |- ATM_STEP
| |- OASIS_UPDATECPL (Move coupling fields to prognostics for restartability)
| |- DUMPCTL (Write dump if required)
| |
| |<=== Main timestep loop ends
| |
| |_ OASIS_TIDY (Deallocate arrays used in coupling)
|
|_ OASIS_FINALISE (Shut down coupler and MPI)


17.2 Figure 2. Sequence of events in coupled model

Initialise Atmosphere

|
|
UM atmos_______v______________________________________________________________________End
| ^
| Timestep control for Atmosphere |
| |
v______________________________________|
^ | |
| | ----------> Write dump
| v if required
Exchange coupling
OASIS__________________________ fields at start of_____________________________________End
timestep through
coupler
^ |
| | ---------> Write dump
___|_v______________________________|_ if required
Initialise NEMO-CICE ^ |
| | Timestep control for NEMO-CICE |
| | |
NEMO-CICE_______v________________|______________________________________v_______________End


18 Tables

18.1 Table 1. Coupling CPP keys in the UM

Compile-time CPP keys (#if defined(...)):


• OASIS3: Implement coupling using the OASIS3 or OASIS3-MCT interface.
• MCT: Complementary to the OASIS3 key above. Implement coupling code specific to the OASIS3-MCT
interface.

Run-time logicals and related variables:

• L_OASIS: TRUE if OASIS3 or OASIS3-MCT coupling is used.
• L_CTILE: Coastal tiling.
• L_COUPLE_MASTER: Perform coupling via the master PE, using full global fields - implies explicit gathering
and scattering in the UM. When false, implies gathering/scattering is done by the coupler.
• OASIS_COUPLE_FREQ: Coupling frequency in hours. This must be an integer - i.e. the minimum coupling
frequency is 1 hour.
• L_OASIS_ICECALVE: An iceberg calving ancillary file will be read in and included in the freshwater and
latent heat fluxes.
• L_OASIS_TIMERS: Use UM timers to measure coupling operations elapsed times. Synchronisation is
enforced on entrance and exit of put and get operations.


18.2 Table 2. NEMO-CICE Ocean to atmosphere coupling fields

Fields generated by the NEMO-CICE ocean submodel received as coupling fields into the UM atmosphere
model. These fields are typically instantaneous values, though there is no technical restriction on the use of
meaned values.

Coupling field             NEMO field name or component description           UM STASH      UM STASH
                                                                              item (without item (with
                                                                              coastal       coastal
                                                                              tiling)       tiling)
------------------------------------------------------------------------------------------------------
Sea surface temperature    tn (vn3.0, vn3.2); jps_toce (vn3.4)                24            507

Zonal sea currents         un_send (vn3.0, vn3.2); jps_ocx1 (vn3.4)           28            28
                           (output field derived from ocean/ice currents
                           rotated onto a geographic grid)

Meridional sea currents    vn_send (vn3.0, vn3.2); jps_ocy1 (vn3.4)           29            29
                           (output field derived from ocean/ice currents
                           rotated onto a geographic grid)

Sea ice fraction           naicetn (vn3.0, vn3.2); jps_fice (vn3.4)           413           413
(multi-category)

Sea ice depth              cpl_hicen (vn3.0, vn3.2); jps_hice (vn3.4)         414           414
(multi-category)

Snow depth                 cpl_snowthickn (vn3.0, vn3.2); jps_hsnw (vn3.4)    416           416
(multi-category)


18.3 Table 3. Atmosphere to ocean coupling fields

Diagnostic fields generated by the atmosphere submodel. These fields are typically daily mean values, though
there is no technical restriction on the use of instantaneous values.

Coupling field               UM STASH code  UM STASH code  NEMO field name or component
                             (without       (with          description
                             coastal        coastal
                             tiling)        tiling)
---------------------------------------------------------------------------------------------
Iceberg calving field        0,190          0,190          Added to riverout and heatflux in
                                                           the atmosphere model.
MSLP                         0,193          0,193          jpr_mslp (requires NEMO code
                                                           change at present).
Surface u wind stress        3,219          3,392          um_tx_in_u (vn3.0, vn3.2);
components                                                 jpr_otx1 (vn3.4) (subsequently
                                                           rotated to the NEMO grid with
                                                           um_ty_in_v / jpr_oty1)
Surface v wind stress        3,220          3,394          um_ty_in_v (vn3.0, vn3.2);
components                                                 jpr_oty1 (vn3.4) (subsequently
                                                           rotated to the NEMO grid with
                                                           um_tx_in_u / jpr_otx1)
10m wind                     3,230          3,230          jpr_w10m (vn3.4) (not required
                                                           for vn3.2)
Net downward integrated      1,203          1,203          Component of heat flux calculation
solar radiation                                            => qt (vn3.0, vn3.2) / heat flux
                                                           component (vn3.4)
Net downward penetrative     1,204          1,260          qsr (vn3.0, vn3.2) (not required
solar radiation                                            for vn3.4)
Net downward longwave        2,203          2,203          Component of heat flux calculation
radiation                                                  => qt (vn3.0, vn3.2) / heat flux
                                                           component (vn3.4)
Surface evaporation          3,232          3,232          evap, also component of heat flux
weighted by leads                                          calculation => qt (vn3.0, vn3.2) /
                                                           heat flux component (vn3.4)
Sensible heat flux over      3,228          3,228          Component of heat flux calculation
open sea                                                   => qt / heat flux component (vn3.4)
Sublimation rate             3,231          N/A            Component of latent flux calculation
                                                           without coastal tiling => qla_ice
                                                           (vn3.0, vn3.2) / evaporation over ice
Sublimation rate             N/A            3,353          Component of latent flux calculation
                                                           with coastal tiling and single ice
                                                           category => qla_ice (vn3.0, vn3.2) /
                                                           evaporation over ice
Sublimation rate             N/A            3,509          Component of latent flux calculation
                                                           with coastal tiling and multiple ice
                                                           categories => qla_ice (vn3.0, vn3.2)
                                                           / evaporation over ice
Bottom melting of sea ice    3,256          3,256          fcondtop (vn3.0, vn3.2) / jpr_botm
                                                           (vn3.4) (without sea-ice multi layers)
Bottom melting of sea ice    3,510          3,510          fcondtop (vn3.0, vn3.2) / jpr_botm
                                                           (vn3.4) (with sea-ice multi layers)
Top melting of sea ice       3,257          3,257          topmeltn (vn3.0, vn3.2) / jpr_topm
                                                           (vn3.4)
Large scale rainfall rate    4,203          4,203          Component of rprecip (vn3.0, vn3.2)
                                                           / liquid precipitation (vn3.4)
Large scale snowfall rate    4,204          4,204          Component of sprecip (vn3.0, vn3.2)
                                                           / solid precipitation (vn3.4)
Convective rainfall rate     5,205          5,205          Component of rprecip (vn3.0, vn3.2)
                                                           / liquid precipitation (vn3.4) [1]
Convective snowfall rate     5,206          5,206          Component of sprecip (vn3.0, vn3.2)
                                                           / solid precipitation (vn3.4) [2]
River outflow fresh water    26,004         26,004         runoff (vn3.0, vn3.2) / Runoffs
into coastal ocean                                         (vn3.4)
outflow points

[1] Not used in fine-grid models such as the UKV, which explicitly resolve convection, as
controlled by the switch l_param_conv in the namelist nlstcatm: .TRUE. to parametrize,
.FALSE. not to.
[2] Footnote 1 applies here, also.


19 References

• S. Valcke, 2008: OASIS3 User Guide (OASIS3 3). PRISM Support Initiative No 3, 68 pp.
• S. Valcke, T. Craig, L. Coquart, May 2013: OASIS3-MCT User Guide (OASIS3-MCT 2.0). CERFACS/CNRS,
54 pp.
• Madec, G., Delecluse P., Imbard M., and Lévy C., 1998: OPA 8.1 Ocean General Circulation Model
reference manual. Note du Pole de modélisation, Institut Pierre-Simon Laplace (IPSL), France, No11,
91pp.
• Madec G. 2008: “NEMO ocean engine”. Note du Pole de modélisation, Institut Pierre-Simon Laplace
(IPSL), France, No 27 ISSN No 1288-1619.
• NEMO web site: http://www.nemo-ocean.eu/
• Los Alamos Sea Ice Model (CICE) web site: http://gcmd.nasa.gov/records/LANL-CICE.html
