Coupling OASIS CROCO


Documentation for coupling with OASIS

in CROCO, WRF, WW3


Swen JULLIEN, Gildas CAMBON
March 7, 2018

Contents

1 Coupling: how does it work?
 1.1 What is OASIS?
 1.2 OASIS functions
 1.3 Coupling sequence
 1.4 Interpolations
 1.5 Coupled variables
 1.6 Detailed OASIS3-MCT implementation in each code
  1.6.1 In CROCO
  1.6.2 In WW3
  1.6.3 In WRF

2 Working architecture

3 Download
 3.1 OASIS
 3.2 CROCO
 3.3 WW3
 3.4 WRF

4 Coupling tools scripts provided
 4.1 Coupling tools contents
 4.2 Coupling tools usage

5 Compile
 5.1 Set up your environment
 5.2 Tips in case you encounter errors during compilation
 5.3 Some information on compilation options
 5.4 OASIS
 5.5 CROCO
 5.6 WW3
 5.7 WRF
  5.7.1 Uncoupled compilation
  5.7.2 Coupled compilation

6 Pre-processing before run
 6.1 CROCO
  6.1.1 Simple climatological configuration setup
  6.1.2 Nested configuration setup
  6.1.3 Rivers
  6.1.4 Interannual simulations
 6.2 WW3
 6.3 WRF - WPS

7 Run
 7.1 Uncoupled run
  7.1.1 CROCO
  7.1.2 WW3
  7.1.3 WRF
 7.2 Coupled run
  7.2.1 OASIS input files
  7.2.2 CROCO inputs
  7.2.3 WW3 inputs
  7.2.4 WRF inputs
  7.2.5 Tips in case of error during a coupled run

8 Example files

1 Coupling: how does it work?
1.1 What is OASIS?

WRF, WW3, and CROCO are coupled through the OASIS-MCT (Ocean-Atmosphere-Sea-Ice-Soil Model Coupling Toolkit) coupler developed by CERFACS (Toulouse, France). This coupler allows the atmospheric, oceanic and wave models to run concurrently in parallel; it exchanges variables between them and performs grid interpolations and time transformations when requested. OASIS is not an executable, but a set of libraries providing functions that are called from within the models themselves. The variables exchanged by the coupler as well as the grid interpolations are specified in a namelist file (called "namcouple").

OASIS-MCT libraries are:

• psmile for coupling

• mct (Argonne National Laboratory) for parallel exchanges

• scrip (Los Alamos National Laboratory) for interpolations

1.2 OASIS functions

Functions provided by the OASIS-MCT framework are:

• Initialization and creation of a local communicator for internal parallel computation in each model (note: oasis / prism are the new / old function prefixes; both are kept for backward compatibility and are usable)

– oasis_init_comp / prism_init_comp_proto
– oasis_get_localcomm / prism_get_localcomm_proto

• Grid data definition for exchanges and interpolations

– oasis_write_grid
– oasis_write_corner
– oasis_write_area
– oasis_write_mask
– oasis_terminate_grids_writing

• Partition and exchanged variables definition

– oasis_def_partition / prism_def_partition_proto
– oasis_def_var / prism_def_var_proto
– oasis_enddef / prism_enddef_proto

• Exchange of coupling fields

– oasis_get / prism_get_proto
– oasis_put / prism_put_proto

• Finalization

– oasis_terminate / prism_terminate_proto

These OASIS3-MCT intrinsic functions are called in each model involved in the coupling. The Initialization, Definition, and Finalization phases are called only once per simulation, while the Exchange phase is called every time step. However, values are actually exchanged only at the coupling frequency, even though the exchange functions are called every model time step. The coupling frequency is controlled through the OASIS3-MCT namcouple.

1.3 Coupling sequence

A typical coupled run is defined by a coupling time step, at which the fields are exchanged between the models. To work correctly, it must be a multiple of each model's time step. An example coupling sequence is pictured in Fig. 1. In this example, the coupling time step is set to 360 s for both models. The wave model time step is 90 s, so it exchanges every 4 time steps. The ocean model time step is 180 s, so it exchanges every 2 time steps.
Another coupling parameter defined in the namcouple is the lag. It is used by the OASIS coupler to synchronize the send and receive functions. The lag of each model must be set to the same value as its own time step. For instance:

• WAVE to OCEAN lag = dt_wave = 90

• OCEAN to WAVE lag = dt_ocean = 180

Therefore, the receive and send functions can be placed at the same point in time in the model codes; OASIS sends the fields at the appropriate time thanks to the lag defined in the namcouple.
The coupling sequence in each model is:

initialization: oasis_time = 0
reception of coupled fields: rcv(oasis_time)
model time stepping: computation t -> t+dt
sending of coupled fields: snd(oasis_time)
increment of the coupling time: oasis_time = oasis_time + dt

OASIS exchanges fields (get/put) when the time corresponds to a coupling time step, i.e., when:

• oasis_time corresponds to a coupling time step, for get

• oasis_time+lag corresponds to a coupling time step, for put

In the model        In OASIS
receive(date)   =>  get(date)
send(date)      =>  put(date+lag)

OASIS is also able to store fields from a model if a time transformation is requested in the namcouple (keyword LOCTRANS + type of transformation, see next section). OASIS will store the fields until a coupling time step is reached, then apply the time transformation, interpolate the field spatially as specified in the namcouple, and exchange the field with the other model.

Details about the first exchange: since the reception of coupled fields is called before the model computation, you need to create "restart files" for the coupler, containing the initial fields to exchange, with variable names corresponding to the coupled fields declared in the OASIS namcouple. In the example of Fig. 1, the initial files for OASIS are named oasis_oce.nc and oasis_wave.nc (oce_ini and wave_ini are not related to OASIS; they are the usual initialization or restart files of your oceanic and wave models; e.g., in CROCO, oce_ini is croco_ini.nc, and in WW3, wave_ini is restart.ww3).
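
A simple sanity check before launching a coupled run is to verify that the variable names contained in these OASIS restart files match the symbolic field names declared in your namcouple, for example:

# list the variables of the OASIS restart files; the names must match the namcouple fields
ncdump -h oasis_oce.nc
ncdump -h oasis_wave.nc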

[Figure 1 illustrates this sequence for an ocean-wave (OW) coupling with a LOCTRANS operation applied to the fields before each exchange: dt_wave = 90 s (WAVE => OCE lag = dt_wave = 90 s), dt_oce = 180 s (OCE => WAVE lag = dt_oce = 180 s), and dt_coupling = 360 s for both models. The model time lines run from 0 to 3600 s; wave_ini and oce_ini initialize the models, while oasis_wave.nc and oasis_oce.nc initialize the coupler on the OASIS time lines for the ocean and wave fields.]
Figure 1: Coupled sequence schematic picture

Summary of restart files:

• oasis_oce.nc, oasis_wave.nc => restart files for OASIS; you need to create them at the beginning of the run, OASIS will overwrite them at the end of the run and they will be available for restart

• oce_ini, wave_ini (corresponding to croco_ini.nc and restart.ww3) => your ocean and wave model initial or restart files

Practical example of the coupling sequence pictured in Fig. 1:

oasis_time = 0
1 - rcv(0) => in oasis: get(0)
#1 => get field from oasis_wave.nc
2 - t = 0+dt = 0+180 = 180
#2 => timestepping
3 - snd(0) => in oasis: put(0+lag) = put(0+180) = put(180)
#3 => 180 is not a coupling time step, do nothing
oasis_time = oasis_time+dt = 0+180 = 180
4 - rcv(180) => in oasis: get(180)
#4 => 180 is not a coupling time step, do nothing
5 - t = 180+dt = 180+180 = 360
#5 => timestepping
6 - snd(180) => in oasis: put(180+lag) = put(180+180) = put(360)
#6 => 360 is a coupling time step, put field

1.4 Interpolations

The OASIS3-MCT coupler can apply time transformations and 2D spatial interpolations to the exchanged fields.
The 2D spatial interpolation, required if the models have different grids, is performed by the scrip library and requested with the SCRIPR keyword in the namcouple. Available interpolation types are:

• BILINEAR performs an interpolation based on a local bilinear approximation

• BICUBIC performs an interpolation based on a local bicubic approximation

• CONSERV performs 1st or 2nd order conservative remapping

• DISTWGT performs a distance-weighted nearest-neighbour interpolation (N neighbours)

• GAUSWGT performs an N nearest-neighbour interpolation weighted by distance and a gaussian function

See the OASIS manual for detailed information.

Time transformations can be performed by OASIS using the LOCTRANS keyword in the namcouple. Available transformations are:

• INSTANT: no time transformation, the instantaneous field is transferred

• ACCUMUL: the field accumulated over the previous coupling period is exchanged

• AVERAGE: the field averaged over the previous coupling period is transferred

• T_MIN: the minimum value of the field at each source grid point over the previous coupling period is transferred

• T_MAX: the maximum value of the field at each source grid point over the previous coupling period is transferred
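
As an illustration, the coupling period, the lag, and the LOCTRANS and SCRIPR transformations all come together in a single field entry of the namcouple. The following fragment is a sketch only, built on the time steps of Fig. 1: the symbolic field names (WW3___HS, CROCO_HS), the grid labels (wavt, ocnt) and the interpolation parameters are placeholders, and the exact syntax is given in the OASIS manual and in the namcouple.base files provided in the Coupling tools:

# field_src  field_dst  id  cpl_period  nb_trans  restart_file   status
WW3___HS  CROCO_HS  1  360  2  oasis_wave.nc  EXPORTED
wavt ocnt LAG=+90
R 0 R 0
LOCTRANS SCRIPR
AVERAGE
BILINEAR LR SCALAR LATLON 1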

1.5 Coupled variables

The variables that can be exchanged are:

CROCO to WRF and WW3:

# Zonal current velocity
0.5*(u(1:Lmmpi,1:Mmmpi,N,nnew)+u(2:Lmmpi+1,1:Mmmpi,N,nnew)) # units m/s

# Meridional current velocity
0.5*(v(1:Lmmpi,1:Mmmpi,N,nnew)+v(1:Lmmpi,2:Mmmpi+1,N,nnew)) # units m/s

CROCO to WRF:

# SST - in the code:
t(1:Lmmpi,1:Mmmpi,N,nnew,itemp) + 273.15 # units K

CROCO to WW3:

# Water level (i.e., sea surface height)
zeta(1:Lmmpi,1:Mmmpi,nnew) # units m

WW3 to WRF:

# Charnock coefficient
CHARN # dimensionless

WW3 to CROCO:

# Significant wave height
HS # units m

# Mean wave period
T0M1 # units s

# Cosine of mean wave direction
COS(THM) # trigonometric convention

# Sine of mean wave direction
SIN(THM) # trigonometric convention

# Zonal wind stress to the waves
TAUWIX # units m2/s2

# Meridional wind stress to the waves
TAUWIY # units m2/s2

# Zonal wave stress to the ocean
TAUOX # units m2/s2

# Meridional wave stress to the ocean
TAUOY # units m2/s2

WRF to CROCO:

# Net surface solar heat flux (short-wave flux) - in the code:
GSW # units W/m2

# Net surface non-solar heat flux (long-wave - latent - sensible) - in the code:
GLW-STBOLT*EMISS*SST**4-LH-HFX # units W/m2

# Evaporation-precipitation flux - in the code:
QFX-(RAINCV+RAINNCV)/DT # units kg.m-2.s-1 == mm.s-1

# Wind stress module - in the code:
taut=rho*ust**2 # units N/m2

# Zonal wind stress - in the code:
taut*(u_phy-uoce)/sqrt((u_phy-uoce)**2+(v_phy-voce)**2) # units N/m2

# Meridional wind stress - in the code:
taut*(v_phy-voce)/sqrt((u_phy-uoce)**2+(v_phy-voce)**2) # units N/m2

WRF to WW3:

# Zonal wind speed at first level - in the code:
u_phy-uoce # units m/s

# Meridional wind speed at first level - in the code:
v_phy-voce # units m/s

1.6 Detailed OASIS3-MCT implementation in each code

1.6.1 In CROCO

The following routines are specifically built for coupling with OASIS and contain calls to OASIS intrinsic functions:

• cpl_prism_init.F : manages the initialization phase of OASIS3-MCT: local MPI communicator

• cpl_prism_define.F : manages the definition phase of OASIS3-MCT: domain partition, names of the exchanged fields as read in the namcouple

• cpl_prism_grid.F : manages the definition of the grids for the coupler

• cpl_prism_put.F : manages the sending of arrays from CROCO to the OASIS3-MCT coupler

• cpl_prism_getvar.F : manages the generic reception from OASIS3-MCT

• cpl_prism_get.F : manages the specifics of each received variable: C-grid position and field unit transformations

They are called in the following routines of the code:

• main.F : initialization and finalization phases

• get_initial.F : definition phase

• zoom.F : initialization phase for AGRIF nested simulations

• step.F : exchanges (sending and reception) of coupling variables

Other CROCO routines that have been modified to introduce coupling:

• testkeys.F : to enable automatic linking to the OASIS3-MCT libraries during compilation with jobcomp

• cppdefs.h : definition of the OA_COUPLING and OW_COUPLING cpp-keys, and of the other related and required cpp-keys, such as MPI

• set_global_definitions.h : definition of cpp-keys in the coupled case (undef OPENMP; define MPI; define MPI_COMM_WORLD ocean_grid_comm: the generic MPI_COMM_WORLD MPI communicator is redefined as the local MPI communicator ocean_grid_comm; undef BULK_FLUX: no bulk OA parameterization)

• mpi_roms.h : newly added to define variables related to OASIS3-MCT operations. It manages the MPI communicator, using either the generic MPI_COMM_WORLD or the local MPI communicator created by OASIS3-MCT

• read_inp.F : does not read the atmospheric forcing files (croco_frc.nc and/or croco_blk.nc) in OA coupled mode
A schematic picture of the calls in CROCO is:

# main.F
if !defined AGRIF
  call cpl_prism_init
else
  call Agrif_MPI_Init
endif
...
call read_inp
...
call get_initial
  # get_initial.F
  ...
  call cpl_prism_define
    # cpl_prism_define.F
    call prism_def_partition_proto
    call cpl_prism_grid
    call prism_def_var_proto
    call prism_enddef_proto
    oasis_time=0
# main.F
...
DO 1:NT
  call step
    # step.F
    if ( (iif==-1).and.(oasis_time>=0).and.(nbstep3d<ntimes) ) then
      call cpl_prism_get(oasis_time)
        # cpl_prism_get.F
        call cpl_prism_getvar
    endif
    call prestep3d
    call get_vbc
    ...
    call step2d
    ...
    call step3d_uv
    call step3d_t
    iif = -1
    nbstep3d = nbstep3d + 1
    if (iif==-1) then
      if (oasis_time>=0.and.(nbstep3d<ntimes)) then
        call cpl_prism_put(oasis_time)
        oasis_time = oasis_time + dt
      endif
    endif
# main.F
END DO
...
call prism_terminate_proto
...

1.6.2 In WW3

The following routines have been specifically built for coupling with OASIS:

• w3oacpmd.ftn : main coupling module, with the calls to the OASIS intrinsic functions

• w3agcmmd.ftn : module for coupling with an atmospheric model

• w3ogcmmd.ftn : module for coupling with an ocean model

The following routines have been modified for coupling with OASIS:

• w3fldsmd.ftn : routine that manages the input fields (and therefore the fields received from the coupler)

• w3wdatmd.ftn : routine that manages the data structure of the wave model, and therefore the time for coupling

• w3wavemd.ftn : the actual wave model; the sending of coupled variables is placed here

• ww3_shel.ftn : main routine managing the wave model; the definition/initialisation/partition phases are placed here

A schematic picture of the calls in WW3 is given in Fig. 2.

Figure 2: Schematic picture of coupling implementation in WW3

1.6.3 In WRF

The routines specifically built for coupling are:

• module_cpl_oasis3.F

• module_cpl.F

Implementation of the coupling with the ocean implies modifications in the following routines:

• phys/module_bl_mynn.F

• phys/module_bl_ysu.F

• phys/module_pbl_driver.F

• phys/module_surface_driver.F

• phys/module_sf_sfclay.F

• phys/module_sf_sfclayrev.F

Implementation of the coupling with waves implies modifications in the following routines:

Registry/Registry.EM_COMMON # CHA_COEF added
dyn_em/module_first_rk_step_part1.F # CHA_COEF=grid%cha_coef declaration added
frame/module_cpl.F # rcv CHA_COEF added
phys/module_sf_sfclay.F and ..._sfclayrev.F # introduction of the wave coupled case isftcflx=5, as follows:

! SJ: change charnock coefficient as a function of waves, and hence roughness length
IF ( ISFTCFLX.EQ.5 ) THEN
  ZNT(I)=CHA_COEF(I)*UST(I)*UST(I)/G+0.11*1.5E-5/UST(I)
ENDIF

phys/module_surface_driver.F # CHA_COEF added in the calls to sfclay and sfclayrev, and "CALL cpl_rcv" added for CHA_COEF

A schematic picture of the WRF architecture and of the calls to the coupling dependencies:

# main/wrf.F
CALL wrf_init
  # main/module_wrf_top.F
  CALL wrf_dm_initialize
    # frame/module_dm.F
    CALL cpl_init( mpi_comm_here )
  CALL cpl_abort( 'wrf_abort', 'look for abort message in rsl* files' )
  CALL cpl_defdomain( head_grid )

# main/wrf.F
CALL wrf_run
  # main/module_wrf_top.F
  CALL integrate ( head_grid )
    # frame/module_integrate.F
    CALL cpl_defdomain( new_nest )
    CALL solve_interface ( grid_ptr )
      # share/solve_interface.F
      CALL solve_em ( grid, config_flags ... )
        # dyn_em/solve_em.F
        curr_secs2 # time for the coupler
        CALL cpl_store_input( grid, config_flags )
        CALL cpl_settime( curr_secs2 )
        CALL first_rk_step_part1
          # dyn_em/module_first_rk_step_part1.F
          CALL surface_driver( ... )
            # phys/module_surface_driver.F
            CALL cpl_rcv( id, ... )
            u_phytmp(i,kts,j)=u_phytmp(i,kts,j)-uoce(i,j)
            v_phytmp(i,kts,j)=v_phytmp(i,kts,j)-voce(i,j)
            CALL SFCLAY( ... cha_coef ...)
              # phys/module_sf_sfclay.F
              CALL SFCLAY1D
              IF ( ISFTCFLX.EQ.5 ) THEN
                ZNT(I)=CHA_COEF(I)*UST(I)*UST(I)/G+0.11*1.5E-5/UST(I)
              ENDIF
            CALL SFCLAYREV( ... cha_coef ...)
              # phys/module_sf_sfclayrev.F
              CALL SFCLAYREV1D
              IF ( ISFTCFLX.EQ.5 ) THEN
                ZNT(I)=CHA_COEF(I)*UST(I)*UST(I)/G+0.11*1.5E-5/UST(I)
              ENDIF
          CALL pbl_driver( ... )
            # phys/module_pbl_driver.F
            CALL ysu( ... uoce,voce, ... )
              # phys/module_bl_ysu.F
              call ysu2d ( ... uox,vox, ...)
              wspd1(i) = sqrt( (ux(i,1)-uox(i))*(ux(i,1)-uox(i)) + (vx(i,1)-vox(i))*(vx(i,1)-vox(i)) )+1.e-9
              f1(i,1) = ux(i,1)+uox(i)*ust(i)**2*g/del(i,1)*dt2/wspd1(i)
              f2(i,1) = vx(i,1)+vox(i)*ust(i)**2*g/del(i,1)*dt2/wspd1(i)
            CALL mynn_bl_driver( ... uoce,voce, ... )
              # phys/module_bl_mynn.F
              d(1)=u(k)+dtz(k)*uoce*ust**2/wspd
              d(1)=v(k)+dtz(k)*voce*ust**2/wspd
        # dyn_em/solve_em.F
        CALL first_rk_step_part2
    # frame/module_integrate.F
    CALL cpl_snd( grid_ptr )

# Check where this routine is called...
# frame/module_io_quilt.F # for the IO server (used with namelist variable: nio_tasks_per_group)
CALL cpl_set_dm_communicator( mpi_comm_local )
CALL cpl_finalize()

# main/wrf.F
CALL wrf_finalize
  # main/module_wrf_top.F
  CALL cpl_finalize()

2 Working architecture

The following working architecture is suggested (and used in the scripts provided in the Coupling tools directory of the croco_tools):

HOME # model sources:
 - wrf
 - croco
 - ww3
 - oasis
 # configuration settings:
 - CONFIGS
   - YOURCONFIG
     - wrf_in
     - inputs_wrf
     - ww3_in
     - inputs_ww3
     - croco_in
     - oasis_in
     - inputs_oasis
     - toy_in

WORKDIR # config. input and output files, working dir.:
 - CONFIGS
   - YOURCONFIG
     - wrf_files
     - WPS_DATA
     - ww3_files
     - croco_files
     - oasis_files
     - toy_files
     - outputs_wps
     - outputs_real
     - outputs_frc_croco
     - outputs_cpl_ow
     - ...
 # data directory:
 - DATA
   - CFSR_grib
   - CFSR_netcdf
   - DATASETS
3 Download

3.1 OASIS

OASIS3-MCT is now available in version 3. Version 3 is mandatory for running CROCO in coupled mode, because CROCO calls the OASIS grid-generation function in parallel mode (which is available only since version 3). If you want to use an older version, you need to create your grids.nc and masks.nc files first, and comment out the call to cpl_prism_grid in cpl_prism_define.F.

To download OASIS3-MCT version 3:

cd oasis
svn checkout https://oasis3mct.cerfacs.fr/svn/branches/OASIS3-MCT_3.0_branch

3.2 CROCO

First you need to request an account on the Inria Gitlab (only if not already registered): https://gitlab-account.inria.fr
Once logged in, request access to the projects:

• For CROCO: https://gitlab.inria.fr/croco-ocean/croco

• For CROCO_TOOLS: https://gitlab.inria.fr/croco-ocean/croco_tools

Add your machine's ssh rsa key to your Gitlab account: click on your login avatar, then Settings, and on the left: "SSH keys". If you do not have an rsa key, first create it on your machine:

cd .ssh
ssh-keygen -t rsa

Finally get the code:

• For CROCO:

 – Log in to: https://gitlab.inria.fr/croco-ocean/croco

 – See information there: https://gitlab.inria.fr/croco-ocean/croco/wikis/home

• For CROCO_TOOLS:

 – Log in to: https://gitlab.inria.fr/croco-ocean/croco_tools

 – See information there: https://gitlab.inria.fr/croco-ocean/croco_tools/wikis/home

mkdir $HOME/croco
cd $HOME/croco
git clone https://gitlab.inria.fr/croco-ocean/croco.git
mkdir $HOME/croco_tools
cd $HOME/croco_tools
git clone https://gitlab.inria.fr/croco-ocean/croco_tools.git

3.3 WW3

cd ww3
svn checkout --username USERNAME https://forge.ifremer.fr/svn/ww3/trunk

3.4 WRF

Download WRF from: http://www2.mmm.ucar.edu/wrf/users/download/get_source.html. Copy the tar file to your cluster, then:

mv WRFV3.7.1.TAR.gz wrf/.
cd wrf
tar -zxvf WRFV3.7.1.TAR.gz
mv WRFV3 WRFV3.7.1

4 Coupling tools scripts provided

Coupling tools scripts are provided in the croco_tools. They have been developed to help you build coupled configurations with CROCO, WRF and WW3.

4.1 Coupling tools contents

* create_config
  script to create your configuration directory tree and to copy all the following scripts and files into the appropriate directories

* scripts_run
  run_env
    source script defining a set of useful environment variables for your run; it needs to be edited carefully when you start working on your configuration, and it is called at the beginning of all the other scripts!
  run_cpl
    script to launch a coupled run
  run_frc_croco
  run_frc_wrf
  run_frc_ww3
    scripts to launch each model in forced mode

* scripts_croco
  readme_download_CFSR
  Process_CFSR_files_for_CROCO.sh
    readme and script to download and pre-process CFSR files to be used for the CROCO ONLINE interpolation of BULK forcing
  readme_preprocess
  crocotools_param.m
  start.m
    readme and scripts to use the classic croco_tools pre-processing (in matlab)
  make_ww3_grd_input_files_from_croco_grd.m
    script to generate the coordinate and bathymetry files for WW3 from the croco_grd.nc file
  cppdefs.h.frc
  cppdefs.h.frc.cfsr.online
  cppdefs.h.oa
  cppdefs.h.ow
  cppdefs.h.owa
    examples of cppdefs.h files for the different forced or coupled cases; the relevant one is linked to cppdefs.h during compilation (make_CROCO_compil)
  make_CROCO_compil
    script to define some variables and launch CROCO's jobcomp
  croco.in.base
    CROCO namelist file in which some settings (time step, etc.) will be replaced by the run_frc_croco or run_cpl scripts

* scripts_ww3
  make_WW3_compil
    script to define some variables and launch the WW3 compilation
  script_make_CFSR_wind_for_ww3.sh
  script_make_WRF_wind_for_ww3.sh
  script_make_CROCO_current_and_level_for_ww3.sh
  UV2T.sh
    scripts to create the wind, current, and level input files for WW3

  inputs_ww3
    switch_aw_BENGUELA
    switch_ow_BENGUELA
    switch_owa_BENGUELA
    switch_frc_BENGUELA
      switches for the different modes
    ww3_grid.inp.base
      grid input file in which some settings (time steps, etc.) will be replaced by the run_frc_ww3 or run_cpl scripts
    ww3_prnc.inp.wind
    ww3_prnc.inp.level
    ww3_prnc.inp.current
      prnc input files for preparing the ww3 input files
    ww3_strt.inp
      strt input file for running ww3_strt
    ww3_shel.inp.base.frc
    ww3_shel.inp.base.Afrc
    ww3_shel.inp.base.ow
    ww3_shel.inp.base.aw
    ww3_shel.inp.base.owa
      shel input files for the different modes, in which some settings (dates, etc.) will be replaced by the run_frc_ww3 or run_cpl scripts
    ww3_ounf.inp.base
      ounf input file in which the dates will be replaced

* scripts_wrf
  readme_download_CFSR_data
  readme_wps
  readme.Vtable
    some useful readmes for WPS
  configure.namelist.wps_BENGUELA
    configure file to edit for running WPS and real
  run_wps.bash
    script to run wps (wrf pre-processing)
  run_real.bash
    script to run real (wrf pre-processing)
  make_WRF_compil
    script to compile wrf

  inputs_wrf
    configure.wrf.coupled
    configure.wrf.uncoupled
      examples of configure files for compiling wrf in forced and coupled modes
    readme.Vtable
    Vtable.CFSR_sfc_flxf06
    Vtable.CFSR_press_pgbh06
      Vtables for CFSR data
    namelist.input.base.complete
      namelist base in which settings will be replaced when running real
    namelist.input.prep.BENGUELA.aw
      namelist example for a coupled AW run
    README.namelist
      readme listing all the namelist options available
    myoutfields.txt
      example of a file that can be prescribed in the wrf namelist to add/remove variable outputs

* scripts_oasis
  create_oasis_grids_for_wrf.sh
    script to create the grids.nc and masks.nc files for OASIS for WRF (the calls to the OASIS grid-writing functions are not implemented in WRF yet). Will be called in run_cpl
  create_oasis_restart_from_calm_conditions.sh
  create_oasis_restart_from_preexisting_output_files.sh
  from_croco.sh
  from_ww3.sh
  from_wrf.sh
  to_wrf_stag_grid.sh
    scripts and associated functions that will be called to create restart files for OASIS from calm conditions or pre-existing model files. Can be called in run_cpl
  create_oasis_toy_files.sh
    script to create the files that will be used by the toy model to mimic another model

  inputs_oasis
    make.ada
    make.DATARMOR
      example files for the OASIS compilation on the ADA and DATARMOR clusters
    namcouple.base.aw / .oa / .owa / .ow
      namelist files for the different coupled modes, in which settings will be replaced by the run_cpl script
    namcouple.base.aw.debug
      example of a namelist file with debug options
    namcouple.base.aw.nointerp
      example of a namelist with a given interpolation file
    namcouple.base.aw.toyatm
      namelist file in which settings will be replaced by the run_cpl script when running with a toy model

* toy_model
  toy model sources and namelists. This toy model can be used to test the coupling: it mimics a model by exchanging prescribed fields with OASIS. For example, if you are using CROCO and want to couple it with a wave model, you should first try to couple it with the toy wave model.

4.2 Coupling tools usage

• First you have to create your configuration file system: edit and run create_config

• Then create your environment: edit and source run_env

• Compile OASIS first, and then your models: detailed explanations are given below. Some scripts are also provided to help you:

 – in oasis_in: examples of make.YOURMACHINE
 – in croco_in: make_CROCO_compil and cppdefs.h files for the different cases
 – in ww3_in: make_WW3_compil
 – in wrf_in: make_WRF_compil and configure.wrf examples for the different cases

• Pre-processing for each model: some scripts are also provided:

 – in croco_in: Process_CFSR_files_for_CROCO.sh, start.m and crocotools_param.m
 – in ww3_in: the script_make_XXXX_for_ww3.sh scripts, and make_ww3_grd_input_files_from_croco_grd.m in croco_in
 – in wrf_in: readme_wps, run_wps.bash, run_real.bash. Note: you will need to have WPS compiled before
 – in oasis_in: create_oasis_grids_for_wrf.sh and possibly the create_oasis_restart_from_... scripts

• Running: the run_frc_XXX and run_cpl scripts are provided, with examples of jobs for the ADA and DATARMOR clusters.

• Note that readme files are placed in each directory to guide you!
5 Compile

5.1 Set up your environment

First, set up your environment (in your .bashrc or your current terminal session):
- choose your netcdf library (it needs to have been compiled with the compilers you will use to build the models) and set the following environment variables:

NETCDF=YOUR_NETCDFPATH
NETCDF_CONFIG=$NETCDF/bin/nc-config
OASISDIR=YOUR_OASIS_COMPILATION_DIR

- choose your compilers

Note that you need to compile OASIS before compiling the models.

5.2 Tips in case you encounter errors during compilation

In case of strange errors during compilation (e.g., "catastrophic error: could not find ..."), try one of these solutions:

• check that your home space is not full ;-)

• check your paths to compilers and libraries (especially the netcdf library)

• check that you have the right permissions, and that your executable files (configure, make...) are indeed executable

• check that the shell headers of your scripts (configure, makefile...) are correct, or add them if necessary (e.g., for bash: #!/bin/bash)

• try to log out of the machine, log back in, clean, and restart the compilation

Errors and tips related to the netcdf library:

• with netcdf 4.3.3.1: you need to add the -mt_mpi compilation flag for all models (for WRF, add this flag in CFLAGS_LOCAL and FCBASEOPTS_NO_G in configure.wrf). The error associated with a missing -mt_mpi flag is of this type: "/opt/intel//impi/4.1.1.036/intel64/lib/libmpi_mt.so.4: could not read symbols: Bad value"

• with netcdf 4.1.3: do NOT add the -mt_mpi flag

• with netcdf4, you need to place the hdf5 library path in your environment:

export LD_LIBRARY_PATH=YOUR_HDF5_DIR/lib:$LD_LIBRARY_PATH

• with netcdf 4, if you use the library split in two parts (a C part and a Fortran part), you need to place the links to the C library before the links to the Fortran library, and to put both paths in this same order in your LD_LIBRARY_PATH, as sketched below
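
A minimal sketch of this setup for a split netcdf installation (the *_DIR paths are placeholders for your installation):

# C library paths placed before the Fortran ones, in the same order everywhere
export LD_LIBRARY_PATH=YOUR_NETCDF_C_DIR/lib:YOUR_NETCDF_FORTRAN_DIR/lib:$LD_LIBRARY_PATH
NETCDFLIB="-LYOUR_NETCDF_C_DIR/lib -LYOUR_NETCDF_FORTRAN_DIR/lib -lnetcdff -lnetcdf"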

In case of a 'segmentation fault' error:

• try to allocate more memory with "ulimit -s unlimited"

• try to launch the compilation as a job (batch) with more allocated memory

5.3 Some information on compilation options

Very brief information on compilation options is given here. For further details, search the web or check with your cluster support team. Useful information can also be found on this page: http://www.idris.fr/ada/ada-comp_options.html.

• Optimization options:

 – -O0, -O1, -O2, -O3, -fast : optimization level. -O0 means no optimization; use it for debugging. -O3 and -fast are more aggressive optimization options that can lead to reproducibility problems in your runs (in particular, it is better to avoid -fast).
 – -xCORE-AVX2 : vectorization option
 – -fno-alias, -no-fma, -ip : other commonly used optimization options
 – -ftz : flushes denormal (very small) numbers to zero. It is set by default with -O1, -O2 and -O3 (can be a problem for calculation precision)

• Debug options: -O0 -g -debug -fpe-all=0 -no-ftz -traceback -check all -fbacktrace -fbounds-check -finit-real=nan -finit-integer=8888

• Precision and writing options:

 – -fp-model precise : important for the precision and reproducibility of your calculations
 – -assume byterecl : record lengths expressed in bytes instead of words
 – -convert big_endian : byte order for writing binaries (important for avoiding huge spurious values)
 – -i4, -r8 : sizes used for integers and reals (also important for reproducibility between different clusters)
 – -72 : specifies that the statement field of each fixed-form source line ends at column 72
 – -mcmodel=medium -shared-intel : do not limit data memory to 2 GB (useful for writing large output files)
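
As an illustration, a production-oriented combination of these options for the Intel compiler could look as follows (a sketch only; every flag comes from the lists above, but adapt the set to your compiler and cluster, and switch to the debug options when troubleshooting):

# example Fortran flags combining the options discussed above (Intel)
FFLAGS="-O3 -xCORE-AVX2 -fno-alias -no-fma -ip -fp-model precise -assume byterecl -convert big_endian -i4 -r8 -mcmodel=medium -shared-intel"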

5.4 OASIS

In the $HOME/oasis/OASIS3-MCT_3.0_branch/oasis3-mct folder you will find:

• doc: the OASIS documentation

• lib: the mct, psmile and scrip library folders

• util: the make_dir folder, containing TopMakefileOasis3, make.inc and the make.* files

cd $HOME/oasis/OASIS3-MCT_3.0_branch/oasis3-mct/util/make_dir

Edit or create make.YOURMACHINE: set the different paths and compilers. WARNING: set absolute full paths (do not use shortcuts like "~" or relative paths). Edit make.inc to point to your make.YOURMACHINE file (an absolute path is mandatory).
Then:

make realclean -f TopMakefileOasis3 > oasis_clean.out
make -f TopMakefileOasis3 > oasis_make.out

Errors during the OASIS compilation are often associated with:

• non-executable files (configure, etc., which need to be executable),

• wrong paths in make.YOURMACHINE,

• compilation options, which also have to be set carefully

5.5 CROCO

CROCO needs to be compiled for each configuration (domain, coupled or uncoupled, parameterizations...), i.e., each time you change something in cppdefs.h or param.h.
In your configuration directory CONFIGS/YOURCONFIG you will need to edit the following files before compilation:

• param.h

• cppdefs.h

In param.h:

Change the dimensions:

# elif defined YOURCONFIG
      parameter (LLm0=73, MMm0=60, N=32) ! YOURCONFIG

And choose the parallelization settings:

#ifdef MPI
      integer NP_XI, NP_ETA, NNODES
      parameter (NP_XI=1, NP_ETA=2, NNODES=NP_XI*NP_ETA)
      parameter (NPP=1)

In cppdefs.h, set:

/* Configuration Name */
# define YOURCONFIG
/* Parallelization */
# undef OPENMP
# define MPI

Note that MPI is mandatory for coupling, even if the run is launched on 1 CPU.

For coupling (or not) with the atmospheric model, set:

#define OA_COUPLING
or
#undef OA_COUPLING

For coupling (or not) with the wave model, set:

#define OW_COUPLING
or
#undef OW_COUPLING

Set the other parameterizations you need.

Note that coupling with waves has been tested with the KPP boundary layer scheme and works fine. It is also implemented with GLS_MIX2017 but has not been widely tested... If you want to use GLS, you should first compare your results with those given by KPP before drawing any conclusions. And please give feedback on your experience to the CROCO developer team!

To compile CROCO, edit and run make_CROCO_compil, or set the following in jobcomp and launch it (./jobcomp):

SOURCE=$HOME/croco/OCEAN
SCRDIR=$HOME/CONFIGS/YOURCONFIG/croco_in/Compile

# if you are using a linux architecture
LINUX_FC=your_fortran_compiler

# if needed, set your own NETCDF directories
NETCDFLIB="-LYOUR_NETCDF_PATH/lib -lnetcdff -lnetcdf"
NETCDFINC="-IYOUR_NETCDF_PATH/include"

# set MPI directories if needed
MPIF90="YOUR_MPI_COMPILER_PATH/bin_dir/your_mpi_compiler"
MPILIB="-LYOUR_MPI_COMPILER_PATH/lib_dir -lmpi"
MPIINC="-IYOUR_MPI_COMPILER_PATH/include_dir"

# set OASIS-MCT (or OASIS3) directories if needed
PRISM_ROOT_DIR=YOUR_OASIS_COMPILE_PATH

Note: if the compilation aborts, you may need to change the following lines in jobcomp:

LDFLAGS1="$LDFLAGS1 $LIBPSMILE"
CPPFLAGS1="$CPPFLAGS1 ${PSMILE_INCDIR}"
FFLAGS1="$FFLAGS1 ${PSMILE_INCDIR}"

to:

LDFLAGS1="$LDFLAGS1 $LIBPSMILE $NETCDFLIB"
CPPFLAGS1="$CPPFLAGS1 ${PSMILE_INCDIR} $NETCDFINC"
FFLAGS1="$FFLAGS1 ${PSMILE_INCDIR} $NETCDFINC"

And compile again.

5.6 WW3

To set up the WW3 compilation, you first need to edit your switch file: switch_YOURSWITCH (it can be switch_OASACM, switch_OASOCM, switch_NOCOUPL, or a switch file you edited).
Note that for coupling some switches are mandatory:

DIST MPI COU OASIS OASOCM and/or OASACM

In addition, the switches CRT and WNT, which control the current and wind forcing interpolation, have to be set to 0 in coupled cases, and also in a forced case if you want to compare it to an equivalent coupled case, as in the fragment below.
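
For instance, the coupling-related part of a switch file for a run coupled with both the atmosphere and the ocean could contain the following (a fragment only; a real switch file is a single line holding many more switches, see the switch_* files provided in the Coupling tools):

... DIST MPI COU OASIS OASACM OASOCM ... CRT0 WNT0 ...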

You can launch the compilation either by editing and launching make_WW3_compil, or by doing it manually, step by step:

• First you need to create and edit $HOME/.wwatch3.env:

# copy it from the ww3 model sources
cp $HOME/ww3/trunk/model/wwatch3.env $HOME/.wwatch3.env
# then edit the compilers and paths

• Then go to the ww3 bin directory, edit your switch file, and compile the model for each case: serial or parallel, uncoupled, coupled with the ocean, coupled with the atmosphere, coupled with both the atmosphere and the ocean. Note that in the following you have to set YOURFORTRANCOMPILER (e.g., Intel; check the different compilers available: comp.*). If not already done, you need to define your netcdf type and set up some environment variables before compiling:

export WWATCH3_NETCDF=NC4 # for NETCDF 4
export WWATCH3_NETCDF=NC3 # for NETCDF 3
export NETCDF=YOUR_NETCDFPATH
export NETCDF_CONFIG=$NETCDF/bin/nc-config # for NETCDF 4
export NETCDF_LIBDIR=$NETCDF/lib # for NETCDF 3
export NETCDF_INCDIR=$NETCDF/include # for NETCDF 3
export OASISDIR=YOUR_OASIS_COMPILATION_DIR
# go to the ww3 bin directory
cd $HOME/ww3/trunk/model/bin
# clean and set up for your switch case
./w3_clean -c
./w3_setup .. -c YOURFORTRANCOMPILER -s YOURSWITCH -q
# launch the compilation manually for the programs you need, or use make_MPI to compile all programs
./w3_make ww3_prnc ww3_grid ww3_bounc ww3_strt ww3_shel ww3_ounf
# move your executables to a dedicated directory
mkdir ../exe_YOURSWITCH
mv ../exe/* ../exe_YOURSWITCH

5.7 WRF

You need to compile the model for the coupled and uncoupled cases, but only once, not for each configuration (contrary to CROCO and WW3). First configure your compilation:

cd $HOME/wrf/WRFV3.7.1
./configure

Choose a distributed memory option (dm) and a compiler option in line with your machine setup.

Notes:

1. If you are going to create model output files larger than 2 GB, you should consider using the netCDF large file support. To activate it, set the environment variable WRFIO_NCD_LARGE_FILE_SUPPORT.
In a c-shell environment, do:

setenv WRFIO_NCD_LARGE_FILE_SUPPORT 1

In a bash environment, do:

export WRFIO_NCD_LARGE_FILE_SUPPORT=1

2. Since V3.2, WRF supports using multiple processors for compilation. The default number of processors used is 2. If you have any problem with the compilation, please try using one processor. To do this, set the following environment variable before compiling:
In a c-shell environment, do:

setenv J "-j 1"

In a bash environment, do:

export J="-j 1"

Note that the WRF compilation will take a while (about 1 h) and may take a lot of memory. You may need to launch the compilation in a job.

WRF is strict about netcdf dependencies, meaning that problems during compilation are often due to the netcdf settings. WRF uses:

• the NETCDF environment variable, which can be set before launching configure; otherwise configure will ask you to provide your full netcdf path

• the NETCDF4 environment variable, which can be set to 1 if you want to use netcdf-4 facilities (if your netcdf library allows it). When using a netcdf-4 library, check that all dependencies are properly set; they are usually found with the nf-config --flibs command

• always check all the lines associated with the netcdf library and its dependencies in the generated configure.wrf, e.g.:

#### NETCDF4 pieces

NETCDF4_IO_OPTS = -DUSE_NETCDF4_FEATURES -DWRFIO_NCD_LARGE_FILE_SUPPORT
GPFS =
CURL = # can be set to -lcurl
HDF5 = # can be set to -lhdf5 -lhdf5_hl
ZLIB = # can be set to -lz
DEP_LIB_PATH = # can point to the hdf5 library
NETCDF4_DEP_LIB = $(DEP_LIB_PATH) $(HDF5) $(ZLIB) $(GPFS) $(CURL)

INCLUDE_MODULES = # the last one is the netcdf include path

LIB_EXTERNAL = # the last one is the netcdf library and its dependencies

5.7.1 Uncoupled compilation

You can use the make_WRF_compil script provided in the Coupling tools of the croco_tools, or compile on your own, step by step:

./clean -a # clean before compiling
cp configure.wrf.backup configure.wrf # configure.wrf has been changed to configure.wrf.backup during clean
./compile em_real

If successful, this will create real.exe and wrf.exe in the main/ directory, and the appropriate executables will be linked into the run/ directory.
Copy your executables into a new directory:

mkdir exe_uncoupled
cp configure.wrf exe_uncoupled/.
cp main/*.exe exe_uncoupled/.

5.7.2 Coupled compilation

./clean -a # clean before compiling
cp configure.wrf.backup configure.wrf # configure.wrf has been changed to configure.wrf.backup during clean

Edit configure.wrf as described below:

# just before: #### Architecture specific settings ####, add for OASIS:

OA3MCT_ROOT_DIR = $(OASISDIR)

# in: #### Architecture specific settings ####, modify the following:

ARCH_LOCAL = -DNONSTANDARD_SYSTEM_FUNC -DWRF_USE_CLM -Dkey_cpp_oasis3 # for OASIS add -Dkey_cpp_oasis3

# in: # POSTAMBLE, add the includes and libraries associated with OASIS before the netcdf ones, as follows:

INCLUDE_MODULES = $(MODULE_SRCH_FLAG) \
 $(ESMF_MOD_INC) $(ESMF_LIB_FLAGS) \
 -I$(WRF_SRC_ROOT_DIR)/main \
 -I$(WRF_SRC_ROOT_DIR)/external/io_netcdf \
 -I$(WRF_SRC_ROOT_DIR)/external/io_int \
 -I$(WRF_SRC_ROOT_DIR)/frame \
 -I$(WRF_SRC_ROOT_DIR)/share \
 -I$(WRF_SRC_ROOT_DIR)/phys \
 -I$(WRF_SRC_ROOT_DIR)/chem -I$(WRF_SRC_ROOT_DIR)/inc \
 -I$(OA3MCT_ROOT_DIR)/build/lib/mct \
 -I$(OA3MCT_ROOT_DIR)/build/lib/psmile.MPI1 \
 -I$(NETCDFPATH)/include \

LIB_EXTERNAL = \
 -L$(WRF_SRC_ROOT_DIR)/external/io_netcdf -lwrfio_nf \
 -L$(OA3MCT_ROOT_DIR)/lib -lpsmile.MPI1 -lmct -lmpeu -lscrip \
 -L$(NETCDFPATH)/lib -lnetcdff -lnetcdf

Examples of configure.wrf.uncoupled and configure.wrf.coupled are provided in the Coupling tools.

Then you can compile WRF, once again using make_WRF_compil, or as follows.
CAUTION: compiling WRF with OASIS requires a lot of memory (> 3.5 GB). If needed, submit a job with an extra memory request for compiling.

./compile em_real

If successful, this will create real.exe and wrf.exe in the main/ directory, and the appropriate executables will be linked into the run/ directory.

Copy your executables into a new directory:

mkdir exe_coupled
cp configure.wrf exe_coupled/.
cp main/*.exe exe_coupled/.

6 Pre-processing before run

6.1 CROCO

The CROCO pre-processing tools (the former Roms_tools) have been developed under the Matlab software by IRD researchers.
Note: these tools were made to easily build regional configurations using climatological data. To use interannual data, some facilities are available (NCEP, CFSR, QuickScat data for the atmospheric forcing, SODA and ECCO for the lateral boundaries). However, to use other data, you will need to adapt the scripts.
All the utilities/toolboxes required by the matlab croco_tools programs are provided in the UTILITIES directory, which can be downloaded here: http://www.croco-ocean.org/download/utilities/. Further details are given in the croco_tools documentation available online: http://www.croco-ocean.org/documentation/

6.1.1 Simple climatological configuration setup

To run the CROCO pre-processing:

• Edit start.m and crocotools_param.m in your croco_in directory:

 – start.m has to be launched at the beginning of any matlab session to set the paths to the utilities and croco_tools routines. Edit mypath and myutilpath
 – crocotools_param.m defines all the parameters and paths needed to build the grid, forcing and boundary files. Edit the different sections

• The steps for creating a configuration are:

 – build the grid
 – build the atmospheric forcing (not necessary when coupling with an atmospheric model); 2 options are available:
  * create a forcing file with the wind stress (zonal and meridional components), surface net heat flux, surface freshwater flux (E-P), solar shortwave radiation, SST, SSS, and the surface net heat flux sensitivity to SST (dQdSST, used for the heat flux correction when nudging towards model SST and model SSS)
  * or create a bulk file which will be read during the run to perform a bulk parameterization of the fluxes using the COAMPS or Fairall 2003 formulation. This bulk file contains: surface air temperature, relative humidity, precipitation rate, wind speed at 10 m, net outgoing longwave radiation, downward longwave radiation, shortwave radiation, and surface wind speed (zonal and meridional components). It also contains the surface wind stress (zonal and meridional components), but this is not required and not used by the model (except for specific debugging work): the bulk formulation computes its own wind stress
 – build the lateral boundary conditions (3D currents, temperature and salinity, barotropic currents, surface elevation); 2 options are available:
  * interpolate the oceanic forcing fields over the whole domain: only the boundary points and the sponge/nudging layers will be used. Advantage: sponge/nudging layers at the boundaries. Disadvantage: a large amount of unused data
  * or interpolate the oceanic forcing fields at the boundary points only. Advantage: light files (useful for long simulations). Disadvantage: no sponge layers
 – build the initial conditions (3D currents, temperature and salinity, barotropic currents, surface elevation)

• To create a simple configuration from climatological files, execute in matlab:

start
make_grid
make_forcing % or make_bulk
make_bry % or make_clim
make_ini

• This will create: croco_grd.nc, croco_frc.nc (or croco_blk.nc), croco_bry.nc (or croco_clim.nc), croco_ini.nc

6.1.2 Nested configuration setup

Nesting is performed in the model through the AGRIF library.
To create a nested configuration:

• First build the parent domain configuration as in the previous section

• Then, in matlab, use the nestgui utility (type nestgui in matlab and the nestgui window will appear):

 – Click '1. Define child', choose your parent grid file, then edit imin, imax, jmin, jmax, and the refine coef to define your child grid
 – ( If you want to change the topography input file for the child domain, click 'new child topo', choose your new input topo file, and edit n-band, which is the number of grid points over which the parent and child topography are connected )
 – Click '2. Interp child' to create the child grid
 – Click '3. Interp forcing' or '3. Interp bulk' to interpolate the forcing or bulk file onto the child grid
 – ( If you have changed the topography, click 'Vertical interpolations' )
 – Click '4. Interp initial' or 'Interp restart' to create the initial or restart file
 – Click '5. Create croco.in' to create the croco.in file for the child domain
 – Click 'Create AGRIF_FixedGrids.in' to create the input file for AGRIF
 – ( Note: the 'Interp clim' button can be used to create a climatology file (i.e., boundary conditions) for the child domain, to test the child domain alone or to compare a 1-way online nested run with an offline nested run )

• This will create: croco_grd.nc.1, croco_frc.nc.1 (or croco_blk.nc.1), croco_ini.nc.1, croco.in.1, AGRIF_FixedGrids.in

6.1.3 Rivers

If you want to include river runoff in your simulation domain, you have 2 options.

• Indicate fixed river sources directly in croco.in:

psource: Nsrc Isrc Jsrc Dsrc Qbar [m3/s] Lsrc Tsrc
 2
 3 54 0 -1 200. T T 5. 0.
 3 40 0 -1 200. T T 5. 0.

where 2 is the number of rivers, and each following line describes a river: 3, 54 are the i, j indices where the river is positioned; 0, -1 indicate the orientation and direction (here zonal, towards the west); 200 is the runoff discharge in m3/s; 5 and 0 are respectively the temperature and salinity of the river. You can edit these parameters.

• Or you can use a runoff input file of your choice (one is given in RUNOFF_DAI), by running make_runoff in matlab. It will detect potential rivers in your domain and ask you for specifications for each river:

 – do you want to use it: y or n
 – what is the runoff orientation: zonal (0) or meridional (1)
 – what is the runoff direction:
  * if zonal: 1 towards the east, -1 towards the west
  * if meridional: 1 towards the north, -1 towards the south
 – Finally, it will create croco_runoff.nc
 – The program will also give you the few lines you need to add to your croco.in:

psource_ncfile: Nsrc Isrc Jsrc Dsrc qbardir Lsrc Tsrc runoff file name
 CROCO_FILES/croco_runoff.nc
 2
 25 34 0 -1 30*T 5 1
 31 19 0 -1 30*T 5 1

where 2 is the number of rivers, and each following line describes a river: 25, 34 are the i, j indices where the river is positioned; 0, -1 indicate the orientation and direction (here zonal, towards the west); 30*T are true/false flags for reading or not the following variables (here temperature and salinity); 5 and 1 are respectively the temperature and salinity of the river. You can edit these parameters; temperature and salinity can also be read from the runoff file.

 – Note: the runoff has a default vertical profile, defined in CROCO as an exponential vertical distribution of velocity. It is in analytical.F, subroutine ana_psource, if you need to change it

• If you have a nest and a river in your nest, edit make_runoff: below the crocotools_param line, add:

grdname = [CROCO_files_dir,'croco_grd.nc.1'];
rivname = [CROCO_files_dir,'croco_runoff.nc.1'];

and run make_runoff again to generate croco_runoff.nc.1

6.1.4 Interannual simulations

To prepare interannual simulations:

• edit the time settings (Yorig, Ymin, etc.), paths and prefixes (NCEP_dir, OGCM_dir, OGCM_prefix) in crocotools_param.m

• in matlab, launch: start, make_grid, make_CFSR (or make_NCEP) for the atmospheric forcing (this requires using bulk), and make_OGCM for the ocean boundary conditions and initialization

• you can also use the online interpolation of the forcing. For that you will first need to prepare your files so that they can be read by CROCO: use the script Process_CFSR_files_for_CROCO.sh, then link the files to your croco files directory and activate the BULK and ONLINE cpp keys (see the sketch below)
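
A minimal sketch of the corresponding cppdefs.h settings (BULK_FLUX is the bulk cpp-key mentioned in section 1.6.1; check your cppdefs.h for the exact key names):

# define BULK_FLUX /* bulk parameterization of the fluxes */
# define ONLINE /* online interpolation of the atmospheric forcing files */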

To run interannual simulations: use the script run_croco_inter.bash (edit the paths, time step, time settings and run command).

6.2 WW3

Pre-processing tools for WW3 have been developed. They are available in the GRIDGEN matlab package (a tutorial is available here: ftp://ftp.ifremer.fr/ifremer/ww3/COURS/WAVES_SHORT_COURSE/TUTORIALS/TUTORIAL_GRIDGEN/waves-workshop-exercise-gridgen.pdf).

Basic steps for regular grids:

• Define your grid parameters:

dx= ... % in degrees
dy= ... % in degrees
lon1d=[...:dx:...] % in degrees
lat1d=[...:dy:...] % in degrees
[lon,lat]=meshgrid(lon1d,lat1d);

• A coastline file (defined as polygons in coastal_bound....mat) and a bathymetry file (e.g., etopo1.nc) are used.

• Some threshold values are set up:

lim_wet=... ; % proportion of a cell from which it is considered "wet"
cut_off=0; % depth at which a cell is considered "wet"
dry_val=999; % value given to "dry" cells

• The grid can then be generated:

depth=generate_grid(lon,lat,ref_dir,'etopo1',lim_wet,cut_off,dry_val)

• Definition of the boundaries:

lon_start=min(min(lon))-dx;
lon_end=max(max(lon))+dx;
lat_start=min(min(lat))-dy;
lat_end=max(max(lat))+dy;
coord=[lat_start lon_start lat_end lon_end];
[b,n]=compute_boundary(coord,bound,1);

• Mask generation (using the bathymetry and coastline):

m=ones(size(depth));
m(depth==dry_val)=0;
b_split=split_boundary(b,5*max([dx dy])); % splitting to make the computation more efficient
lim_wet=0.5;
offset=max([dx,dy]);
% mask cleaning: remove lonely wet cells close to the coastline
m2=clean_mask(lon,lat,m,b_split,lim_wet,offset);
cell_limit=-1; % if this value is negative, all water bodies except the largest are considered dry (i.e., all lakes and closed seas are removed); if positive, it is the minimum number of cells for a water body to be considered as water
glob=0; % global domain or not
[m4,mask_map]=remove_lake(m2,cell_limit,glob);

To make a grid from another model grid:

• read the bathymetry and mask from your model file

• write the bathymetry using the write_ww3file function; note that WW3 expects negative depths in the ocean:

write_ww3file([data_dir,'/','bottomm2','.inp'],depth'.*(-1));

• build the mask for WW3: mask=1 is water, mask=0 is for points which won't be computed, mask=2 is for active boundary points

• write the mask file:

write_ww3file([data_dir,'/','mapsta','.inp'],mm');

An example program building the ww3 grid settings from a CROCO grid file is provided in the Coupling tools: make_ww3_grd_input_files_from_croco_grd.m

WARNING: do not set the mask to 0 all around your domain, as it will create problems in the OASIS interpolations. Set either 1 for sea points or 2 for boundary points instead.

Note: to use a wind forcing, prepare the wind forcing file with a valid time axis. A few scripts are provided for preparing ww3 forcing files from CROCO (current and water level), WRF (wind), and CFSR (wind) files already processed through Process_CFSR_files_for_CROCO.sh:

• script_make_CROCO_current_and_level_for_ww3.sh

• script_make_WRF_wind_for_ww3.sh

• script_make_CFSR_wind_for_ww3.sh

WW3 routines are named ww3_ROUTINENAME and take as input file, by default, ww3_ROUTINENAME.inp. You have to set the parameters in these .inp input files before running. Steps for the WW3 pre-processing:

./ww3_grid # to prepare the grid and run settings (NB: the time steps are defined in the ww3_grid.inp file)
./ww3_prnc # to prepare the wind forcing if you want to use one (not mandatory)
./ww3_strt # to prepare the initialisation (not mandatory; a default rest state is used if not run)
./ww3_bounc # to prepare the spectral boundary conditions (not mandatory; the initial state is used as boundary conditions if not run)

Notes on mask/mapsta and bathy in WW3


The input map status (MAPSTA) value in the mask file can be:

• -2 : excluded boundary points (sea points covered by ice)

• -1 : excluded sea points (sea points covered by ice)

• 0 : excluded points (land)

• 1 : sea points (ocean)

• 2 : active boundary points

• 3 : excluded

• 7 : ice

The final possible values of the output map status MAPSTA are:

• -5 : other disabled point

• -4 : point masked in the two-way nesting

• -3 : dry point covered by ice

• -2 : dry point, not covered by ice

• -1 : wet point covered by ice

• 0 : land point

• 1 : active sea point

• 2 : active boundary point

• 8 : excluded sea/ice point

• 7 : excluded sea point, considered iced

• 15 : excluded sea point, considered dried: can become wet

• 31 : excluded sea point, inferred in nesting

• 63 : excluded sea point, masked in 2-way nesting

The coastline limiting depth (m, negative in the ocean) defined in ww3_grid.inp will also affect your MAPSTA: points with depth values above this coastline limit will be transformed into land points and therefore considered as excluded points (they never become wet points, even if the water level grows over them).

In the model output, the depth (dpt) is defined as DEPTH = LEV - BATHY, in which the bathymetry is negative in the sea and positive on land, so the depth is positive in the sea and a fill value on land. When the input water level (LEV) increases, it increases the output depth (DPT) value. The input water level forcing value is stored in the WLV output variable, which makes it possible to retrieve the input bathymetry value at each grid point: BATHY = WLV - DPT (e.g., with a bathymetry of -50 m and a water level of 1 m, DPT = 1 - (-50) = 51 m and WLV = 1 m, so BATHY = 1 - 51 = -50 m).

6.3 WRF - WPS

First you need to have WRF compiled, and you must check that you have the required libraries (for grib2 use only):

• jasper

• libpng

• zlib

or you need to install them:

• http://www.ece.uvic.ca/~mdadams/jasper/ : go down to "Downloads"

• http://www.libpng.org/pub/png/libpng.html : go down to "source code"

• http://www.zlib.net/ : go down to "The current release is publicly available here"

Note that not all library versions work correctly with WPS. Versions already used successfully are, for example:

• jasper-1.900.1

• libpng-1.2.59

• zlib-1.2.8

Unzip or untar (tar -zxvf) and install:

cd $HOME/softs/jasper-VERSION
./configure --prefix=$HOME/softs/jasper-YOURJASPERVERSION/install
make
make install

cd $HOME/softs/libpng-VERSION
./configure --prefix=$HOME/softs/libpng-YOURLIBPNGVERSION/install
make check
make install

cd $HOME/softs/zlib-VERSION
./configure --prefix=$HOME/softs/zlib-YOURZLIBVERSION/install
make
make install

And define the following environment variables:


export JASPERLIB="$HOME/softs/jasper-YOURJASPERVERSION/install/lib -L$HOME/softs/libpng-YOURLIBPNGVERSION/install/lib -L$HOME/softs/zlib-YOURZLIBVERSION/install/lib"
export JASPERINC="$HOME/softs/jasper-YOURJASPERVERSION/install/include -I$HOME/softs/libpng-YOURLIBPNGVERSION/install/include -I$HOME/softs/zlib-YOURZLIBVERSION/install/include"
export LD_LIBRARY_PATH=$HOME/softs/libpng-YOURLIBPNGVERSION/install/lib:$LD_LIBRARY_PATH
export NETCDF=YOUR_NETCDFPATH

Note that if your libraries contain dynamic libraries (.so), you need to add their
paths to LD_LIBRARY_PATH.

Download WPS and the geographical data as you did for WRF from http://www2.
mmm.ucar.edu/wrf/users/download/get_source.html. The geographical data are
available following the "here" link under the WPS download section. You can
download the complete set, but note that not all topography files are included in
it; download the missing ones individually in addition.
Note that the geographical data set is a VERY LARGE file (~49 GB uncompressed).
Uncompress the archives (tar -xvjf or tar -zxvf).
Install WPS:
cd $YOUR_WORKDIR/wrf/WPSV3.7.1
./configure

Choose your compiler.


Then edit configure.wps:
WRF_DIR = YOUR_PATH_TO_WRFV3.7.1
DM_FC = YOUR_MPI_FORTRAN_COMPILER
DM_CC = YOUR_MPI_C_COMPILER
WRF_LIB # check your netcdf library flags for dependencies (e.g., hdf5)

Then:
./compile >& compile.out

If compilation is successful, you will find the following links in your current WPS
directory:
geogrid.exe -> geogrid/src/geogrid.exe
ungrib.exe -> ungrib/src/ungrib.exe
metgrid.exe -> metgrid/src/metgrid.exe

To run WPS, we have made a few scripts and namelists:


configure.namelist.wps_YOURCONFIG
run_wps.bash
namelist.real.base.complete
run_real.bash

• First download the boundary files you want to use for your WRF simulation
(e.g., CFSR, GDAS, ERA-I...) and place them in the WORKDIR/DATA directory
• Check if a Vtable exists for your boundary data in
YOURWPSDIR/ungrib/Variable_Tables, or adapt one for your data. Some
information about Vtables is available at: http://www2.mmm.ucar.
edu/wrf/users/download/free_data.html
• You can use the YOURWPSDIR/ungrib/g1print.exe utility to check the variables
and their grid codes in your data file; usage: ./g1print.exe YOURDATAFILE
• In HOME/CONFIGS/YOURCONFIG/wrf_in:
– set your domain configuration by filling
configure.namelist.wps_YOURCONFIG
– edit the user settings in run_wps.bash: paths, MPI settings... You can
perform each step of WPS individually, or all at once, by setting:
switch_geogrid, switch_ungrib, switch_metgrid

Note that geogrid.exe uses GEOGRID.TBL, which is defined for the default geograph-
ical data fields. If you use specific fields, e.g., MODIS, you need to check your data
indices because some fields have a different code (e.g., the MODIS water mask has
code 17 instead of 16, which is the default). Please read the README file in the
geog data directory and check your data field indices.
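
For illustration, the water-mask category is declared through landmask_water
entries in the land-use section of GEOGRID.TBL; the excerpt below is a sketch only
(the exact entry names and dataset resolutions depend on your WPS version and
data sets):

# Illustrative GEOGRID.TBL excerpt: per-dataset water category codes
name = LANDUSEF
        landmask_water =        modis_30s:17    # MODIS land use: water is category 17
        landmask_water =        default:16      # default (USGS) land use: water is category 16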

• Run WPS on NB_PROCS (number of CPUs):

./run_wps.bash configure.namelist.wps_YOURCONFIG NB_PROCS

If WPS is successful, you will obtain in WORKDIR/DATA/WPS_DATA/YOURCONFIG:


geo_em.d01.nc
geo_em.d02.nc
met_em.d01.....nc # numerous files where '...' are dates
met_em.d02.....nc # numerous files where '...' are dates
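
A quick sanity check of these outputs can be done from the command line; the
sketch below is indicative (it assumes the netcdf ncdump utility is available and
is run from the WPS output directory):

# Check the domain dimensions and that the met_em files cover the whole period
ncdump -h geo_em.d01.nc | grep -E 'west_east|south_north'
ls met_em.d01.*.nc | head -1
ls met_em.d01.*.nc | tail -1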

Then, you can prepare your input and boundary files for the WRF run using real.
• Edit the user settings in run_real.bash: paths, MPI settings...
Note: you need to use real.exe from the uncoupled compilation, even for a coupled run
• Edit namelist.real.base if necessary
• Run real on NB_PROCS (number of CPUs):

./run_real.bash configure.namelist.wps_YOURCONFIG NB_PROCS

If real is successful, you will obtain in WORKDIR/YOURCONFIG/wrf_files:


wrfinput_d01_DATE
wrfbdy_d01_DATE
wrflowinp_d01_DATE # if sst_update is set to 1
wrfdda_d01_DATE # if nudging is activated
wrf*_d02_DATE # if you have 2 domains

7 Run
A brief description of simple stand-alone simulations with each model is given
for your information. Please refer to each model's own documentation for more
details.

7.1 Uncoupled run

7.1.1 CROCO
Copy croco.in from the croco/OCEAN repository to your
$HOME/CONFIGS/YOURCONFIG/croco_in directory and edit it, in particular:

• run time (NTIMES: number of time steps) and time stepping (dt[sec]: baro-
clinic time step, NDTFAST: number of barotropic time steps in one baro-
clinic time step)

• vertical coordinate parameters (consistent with pre-processing): S-coord:
THETA_S, THETA_B, Hc (m)

• names of grid, forcing, bulk, climatology, boundary, restart, history, and
average files

• whether you are starting from an initial file (NRREC=1) or a restart file
(NRREC=X, where X is a positive number: start at the Xth time record of
the restart file)

• frequency of restart files (NRST: in number of time steps) and the way they
will be stored (NRPFRST=-1: overwrite old restarts at each restart time,
NRPFRST=0: store all restart times in one file, NRPFRST=X, where X is a
positive number: store X restart times per file). For example, with a baroclinic
time step dt=3600 s, NRST=24 writes a restart every simulated day.

• flag (LDEFHIS = T or F) and frequency of history outputs (NWRT: in num-
ber of time steps), and the way they will be stored (NRPFHIS=-1: overwrite
old history outputs at each output time, NRPFHIS=0: store all history
outputs in one file, NRPFHIS=X, where X is a positive number: store X
history outputs per file)

• starting time step for the accumulation of output time-averaged data (NTSAVG),
frequency of average outputs (NAVG: in number of time steps), and the way
they will be stored, as for history files

• and other croco.in settings (output flags, etc.)

To run CROCO, copy all input files (at least: croco_grd.nc, croco_bdy.nc, croco_ini.nc,
croco.in) into your work directory and launch the model:
./croco croco.in

where croco is your executable compiled with all your desired options and param-
eterizations, and croco.in is your CROCO namelist file.
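
If CROCO was compiled with MPI, launch it through your MPI wrapper instead;
a minimal sketch, assuming a 4-process run matching the NP_XI x NP_ETA
decomposition set in param.h:

# MPI launch: the number of processes must match the decomposition in param.h
mpirun -np 4 ./croco croco.in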

7.1.2 WW3
In your ww3 input directory you will find:
cd $HOME/CONFIGS/YOURCONFIG/ww3_in
ls
> ww3_grid.inp # model definition: frequencies and time steps, parameterizations, grid, bathy, masks, boundary points
> ww3_strt.inp # initial conditions
> ww3_shel.inp # run information: input fields, run time, output fields

Edit these input files.


Run the wave model:
./run_ww3

In run_ww3:

• PATHS to model sources, working directory, input and configuration files
are set

• the OUTPUTS directory is created

• executables are copied to the working directory

• input files are linked into the working directory

• the steps of the wave model are launched:

– ww3_grid:
* inputs: ww3_grid.inp, a bathy file at least, and the files requested to
define the grid (.inp or .bot ...)
* outputs: mod_def.ww3, mask.ww3, mapsta.ww3
– ww3_strt: not mandatory (if not performed, the model will search for
a restart.ww3 file, or will assume initialization from local wind or from
rest)
* inputs: ww3_strt.inp and mod_def.ww3, mask.ww3, mapsta.ww3
* outputs: restart.ww3
– ww3_prnc: if you want to use dynamical forcing fields. Not necessary if
you use homogeneous fields or existing forcing files (e.g., wind.ww3,
current.ww3, level.ww3, ice.ww3)
* inputs: ww3_prnc.inp and mod_def.ww3, mask.ww3, mapsta.ww3,
wind.nc (or another forcing file .nc)
* outputs: wind.ww3 (current.ww3, level.ww3, ice.ww3)
– ww3_shel: actual wave model launch
* inputs: all previous outputs
* outputs: log.ww3 (restartN.ww3, nestN.ww3; track_i.ww3 if track
output is requested; out_grd.ww3 if mean wave parameters are
requested; out_pnt.ww3 if point outputs are requested)
– ww3_ounf: converts the model output out_grd.ww3 to netcdf ww3DATE.nc

7.1.3 WRF
In your HOME/CONFIGS/YOURCONFIG/wrf_in directory, you will find
namelist.input.real.YOURCONFIG.
Edit your namelist to set your parameterizations and time step (NB: the time step
has to be 6*dx (dx in km) or lower; e.g., for a 10 km grid, use a 60 s time step or
less). Note that a readme file, README.namelist, describing the namelist variables
can be found in the WRF run directory.

Then, to run the model, you need to put in your working directory:
wrfinput_d01
wrfbdy_d01
wrflowinp_d01 # if sst_update is set to 1
wrfinput_d02 # if you have 2 domains
wrflowinp_d02 # if you have 2 domains
wrfdda_d01 # if nudging is activated
namelist.input
wrf.exe
and all the data files from the wrf run directory (except namelist.input
and the executables)

An example of launch script for running WRF is given in attachment.
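
Pending that example, a minimal launch sketch is given below; the number of MPI
processes and the use of mpirun are assumptions to adapt to your cluster and
scheduler:

#!/bin/bash
# Minimal WRF launch from the directory containing wrf.exe and its input files
ulimit -s unlimited                        # guard against stack-related segmentation faults
mpirun -np 16 ./wrf.exe
grep 'SUCCESS COMPLETE WRF' rsl.out.0000   # WRF writes its logs to the rsl.out.* / rsl.error.* files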

Tips in case of errors when running WRF:

• segmentation fault:

– can be due to a bad time step (CFL issue): check that your time step is
6*dx (dx in km) or lower, and try to decrease it if necessary
– can be due to memory issues: try to allocate more memory to your job,
and try to modify the stack limit: ulimit -s unlimited

7.2 Coupled run

7.2.1 OASIS input files


The OASIS namelist file that manages the exchanges for the coupler is namcouple.
It contains:

• the number of coupled fields

• the number and names of the model executables

• the total simulation time (in sec)

• the level of verbosity of the coupler (useful in debug mode)

• the details of each coupled field: its specific coupling name in each model,
the coupling frequency, the name of its oasis restart file, the grid names, and
the grid and time interpolations. A detailed example is given below with a
description of each input:

# ------------------------------------
# CROCO (crocox) ==> WW3 (wwatch)
# ------------------------------------
#~~~~~~~~~~~
# Field 1 : ssh : sea surface height (m)
#~~~~~~~~~~~

# First line:
# name of SSH coupled variable in CROCO
# name of SSH coupled variable in WW3
# unused number
# coupling frequency (in sec)
# number of field transformations (grid and time interpolations)
# name of oasis restart file for this variable
# field status: commonly used are EXPORTED (usual run) or EXPOUT
#   (debug mode only: the field will be written to a file at all
#   coupling time steps; NOTE: considerably increases computation time!)
SRMSSHV0 WW3__SSH 1 3600 1 r-cro.nc EXPOUT

# Second line:
# number of points in X on the sending model grid (here CROCO)
# number of points in Y on the sending model grid (here CROCO)
# number of points in X on the receiving model grid (here WW3)
# number of points in Y on the receiving model grid (here WW3)
# name of the sending model grid as appearing in grids.nc, masks.nc, and areas.nc
# name of the receiving model grid as appearing in grids.nc, masks.nc, and areas.nc
# LAG: lag time. Has to be equal to the sending model time step (here the CROCO time step)
95 104 89 94 rrn0 ww3t LAG=+3600

# Third line:
# sending model grid description: P=periodical, R=regional, and number of overlapping points
# same for the receiving model grid
R 0 R 0

# Fourth line: list of transformations that will be performed by the coupler.
# LOCTRANS for the time interpolation
# SCRIPR for the grid interpolation
LOCTRANS SCRIPR

# Next lines (one for each transformation): type and parameters of each transformation
# First: here, averaging over time
# Second: here, the grid interpolation:
#   type of interpolation (DISTWGT: distributed weight)
#   source grid type (LR, D, or U)
#   field type (note that VECTOR fields are not supported anymore,
#     so they will be treated as SCALAR)
#   search restriction type (LATLON or LATITUDE)
#   number of restriction bins
#   number of neighbours used
#   see the oasis manual for more details
AVERAGE
DISTWGT LR SCALAR LATLON 1 4

You also need to create netcdf restart files for OASIS, which will be used at the
first call to oasis_get. These files must contain the exchanged variables set in the
namcouple.
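
Such a restart file can, for instance, be built from a model grid file with the NCO
tool ncap2; the sketch below is illustrative only (it assumes a CROCO grid file
croco_grd.nc whose bathymetry variable h carries the dimensions of the coupled
field, and reuses the names of the namcouple example above):

# Create a 2D, zero-valued SSH restart field on the CROCO rho-grid
# (-v keeps only the variables defined in the script)
ncap2 -O -v -s 'SRMSSHV0=0.0*h' croco_grd.nc r-cro.nc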

7.2.2 CROCO inputs
To run CROCO coupled to another model, you just need the CROCO executable
(croco) compiled in coupled mode (see the Compile section).

7.2.3 WW3 inputs


To run WW3 coupled to another model:
• First you need to have the WW3 executable (ww3_shel) compiled in coupled
mode (see the Compile section)
• Edit ww3_shel.inp as follows: set to C the flag of each field you want to couple.
Note that for coupling with the ocean, both currents and water levels are
mandatory (it is not possible to couple just currents or just levels):
# for coupling with the ocean
C F Water levels
C F Currents
# for coupling with the atmosphere
C F Winds

• Enable the output type 7 to define the time (the time step defined here needs
to be equal to the global model time step TGLOB), the sent fields, and the
received fields:
# for coupling with the ocean
T0M1 OHS DIR # TAW TWO # if you defined them in the namcouple
SSH CUR
# for coupling with the atmosphere
CHA
WND

• WARNING: Set the run dates in ww3_shel.inp according to the run time in
the namcouple (if not, the run will not end correctly)

7.2.4 WRF inputs


To run WRF coupled to another model:
• First you need to have the WRF executable (wrf.exe) compiled in coupled
mode (see the Compile section)
• Edit the CPLMASK variable in wrfinput_d0X for all your coupled domains
(CPLMASK has to be set to 1 where you want coupling, and to 0 elsewhere);
a sketch is given after this list
• Edit the options in namelist.input:
– in &physics: isftcflx = 5 if you are coupling with a wave model
– in &physics: sst_update = 1 if you are coupling with an ocean model
– in &domains: num_ext_model_couple_dom = X: the number of domains
of the other model you are coupling to WRF
• You need to create the WRF grid file for OASIS. To do so, a script is provided:
edit and run create_oasis_grids_for_wrf.sh
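
For the CPLMASK edit, one option is the NCO tool ncap2; the sketch below is
illustrative (it enables coupling over the whole domain, whatever the CPLMASK
dimensions; restrict it with your own land/sea criterion if needed):

# Set CPLMASK to 1 everywhere in the d01 input file (in place)
ncap2 -O -s 'CPLMASK=CPLMASK*0+1' wrfinput_d01 wrfinput_d01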

7.2.5 Tips in case of error during a coupled run
• Strange geometrical patterns in your fields (diagonals, horizontal lines, etc.)
usually indicate a problem during the interpolation phase: check that no
model mask is set to 0 all around the domain (this creates problems during
the oasis interpolation). It can also be due to memory issues; try to allocate
more memory to your job, or to launch it on more CPUs.

• Coupling between CROCO and WW3 does not work: check that you cor-
rectly set up the coupled fields in ww3_shel.inp: both currents and levels
have to be set to C, and the Type 7 section has to be carefully filled.

• Coupling does not work: check the names of the variables in your namcou-
ple. For CROCO or WRF, check that the domain number is set properly;
for example, for the CROCO parent domain (0), the sent SST is SRMSSTV0,
and the SST received by the first WRF nest from the CROCO parent domain
is WRF_d02_EXT_d01_SST (WRF domain d02, the first nest, sees the SST from
the first domain of the coupled model, i.e., the parent domain of CROCO).

• If you get an error from OASIS such as "field and data mismatch": check
your masks.nc file: the mask variables have to be of integer type!

• Your coupled run does not start: check your OASIS restart files: they need
to contain all the coupled fields, and these have to be 2D variables: no time
dimension and no vertical dimension allowed!

• If your coupled run starts but stops in the middle with an error such as "ER-
ROR: coupling skipped at earlier time, potential deadlock" or "ERROR: model
timestep does not match coupling timestep": first check your time steps and
coupling time step. If they are set properly, it may be due to optimization
compilation options... Change your optimization to -O2 (instead of -O3 or
-fast), re-compile, and re-launch your run.

• Problems in WW3:

– prnc: needs the time variable of your netcdf file to be named "time", and
also needs a correct time attribute
– grid: if you have FLAGTR in your namelist, you need an obstruction
file
– ounf: if your ww3 netcdf output file does not look right (domain, time),
you probably have bad settings in ww3_ounf.inp
– open boundaries: if you are prescribing open boundaries, you need
to have the same spectral discretization (frequencies, directions) as
your boundary spectrum

• Non-reproducibility of your results, or empty variables in your CROCO out-
put files, can be due to problems with optimization. Try to change the com-
pilation options for less optimization but more robustness: change -O3 to
-O2.

• In general, the things to check if your coupled simulation is not running while
your forced simulations work properly (a few quick command-line checks
are sketched after this list):

– OASIS grids.nc and masks.nc files: names and sizes of the fields, values
of the mask
– OASIS restart files: names and sizes of the fields, only 2D fields, etc.
– dimensions and names of the grids in your namcouple and in your models
(the same grid for WW3, the staggered grids for WRF, the interior grid
for CROCO, i.e., as in param.h)
– names of the variables in your namcouple, restart files, and model
sources
– time steps: the coupling time step and the model time steps (the cou-
pling time step needs to be a multiple of the model time steps, and for
WW3 you need to set Type 7 outputs at each global time step)
– the run duration has to be the same in the namcouple and in the different
models
– check your model log files
– check your job error file
– check the OASIS debug files: nout.000000, and for each model: debug.root.01
or debug.02.000000
– clean your workdir and re-launch your run...
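
A few of these checks can be done quickly from the command line; the sketch
below is indicative (the restart and debug file names are those used in the examples
above):

# Coupled fields in the OASIS restart files must be 2D (no time or vertical dimension)
ncdump -h r-cro.nc
# Mask variables in masks.nc must be of integer type
ncdump -h masks.nc | grep -i int
# Scan the OASIS and model debug files for errors
grep -i error nout.000000 debug.root.01 debug.02.000000 2>/dev/null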

8 Example files
Scripts and some example files are provided in the croco_tools/Coupling_tools
repository (see details in section 4).
An example of coupled configuration is also given in the Documentation page of
the croco website: https://www.croco-ocean.org/documentation/.
