Build FVCOM4.1 (MPI) on ITO Subsystem-A using Intel Compilers


This page is a memorandum on building and running [cci]FVCOM4.1[/cci] with [cci]MPI[/cci] using Intel compilers on Subsystem-A of the ITO supercomputer at Kyushu University. The serial build was introduced in this page, and the build environment prepared for the serial version is assumed to be available. The source files are assumed to be in [cci]FVCOM4.1/FVCOM_source[/cci], and the test case is [cci]Estuary[/cci] in [cci]FVCOM4.1/Examples/Estuary[/cci].

Preparation for METIS

The [cci]METIS[/cci] library must be installed for the MPI build. First [cci]load[/cci] the Intel compilers:
$ module load intel/2018
Set the environment variables for the Intel C compiler as follows:
$ export CC=icc
$ export CPP="icc -E"
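To confirm the variables are set exactly as [cci]make[/cci] will see them, a quick check can be run in the same shell (plain shell, nothing ITO-specific; [cci]icc[/cci] itself need not be on [cci]PATH[/cci] for this):

```shell
export CC=icc
export CPP="icc -E"
# Show the variables the build will pick up from the environment
env | grep -E '^(CC|CPP)='
```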
Move to [cci]FVCOM4.1/METIS_source[/cci], where the [cci]METIS[/cci] source code exists, and extract it:
$ tar xf metis.tgz
$ cd metis
Open [cci]makefile[/cci] and edit [cci]MOPT[/cci] at line 12 as follows:
MOPT = -O3 -no-prec-div -fp-model fast=2 -xHost
The above [cci]MOPT[/cci] options are the values recommended by the vendor. This [cci]makefile[/cci] should read the [cci][/cci] in [cci]FVCOM4.1/FVCOM_source[/cci] that was prepared for the serial build (the [cci]FLAG[/cci]s set there do not matter here). For this purpose, correct the [cci]include[/cci] line in this [cci]makefile[/cci] as follows:
include ../../FVCOM_source/
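The relative path works because [cci]make[/cci] resolves [cci]include[/cci] paths against the directory it is invoked in. A toy sketch of the layout, runnable anywhere (the file name [cci]make.inc[/cci] and the variable are illustrative assumptions, not the actual FVCOM file names):

```shell
# Toy layout mirroring METIS_source/metis including a file from FVCOM_source.
# All file and variable names here are illustrative placeholders.
mkdir -p demo/FVCOM_source demo/METIS_source/metis
printf 'MOPT = -O3\n' > demo/FVCOM_source/make.inc
printf 'include ../../FVCOM_source/make.inc\nall:\n\t@echo $(MOPT)\n' \
    > demo/METIS_source/metis/makefile
( cd demo/METIS_source/metis && make -s )   # prints the value read via include
```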

Building METIS

To build [cci]METIS[/cci], simply run [cci]make install[/cci] in the source directory.
$ make install


The [cci][/cci] for the serial build created here requires slight modification of [cci]FLAG_4[/cci] and [cci]RANLIB[/cci] as follows:
PARLIB = -lmetis #-L/usr/local/lib -lmetis
Then comment out the Intel compiler settings for the serial build in [cci][/cci] and set up the Intel [cci]MPI[/cci] compilers as follows:
# Intel/MPI Compiler Definitions (ITO-A@kyushu-u)
CPP = icc -E
CC = mpiicc
CXX = mpiicpc
CFLAGS = -O3 -no-prec-div -fp-model fast=2 -xHost
FC = mpiifort
DEBFLGS = #-check all -traceback
OPT = -O3 -no-prec-div -fp-model fast=2 -xHost
This setting is for [cci]FLAT MPI[/cci], and the options are the values recommended by the vendor.

makefile for building FVCOM4.1

For the serial build, [cci]mod_esmf_nesting.F[/cci] was removed from [cci]makefile[/cci] to avoid link errors. For the [cci]MPI[/cci] build, that modified makefile still works, and the original one also causes no problem.
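If you want to apply the same removal non-interactively, a [cci]sed[/cci] one-liner can strip the entry from the objects list. The demo below edits a scratch file with made-up module names rather than the real [cci]makefile[/cci]:

```shell
# Scratch file standing in for the module list in makefile (names illustrative)
printf 'MODS = mod_main.F mod_esmf_nesting.F mod_other.F\n' > mods_demo.txt
# Delete the mod_esmf_nesting.F entry wherever it appears on the line
sed -i 's/ mod_esmf_nesting\.F//' mods_demo.txt
cat mods_demo.txt   # MODS = mod_main.F mod_other.F
```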

Minor correction of a source file

As in the serial case introduced here, the trailing comment at line 131 of [cci]wreal.F[/cci] should be deleted to avoid a warning during compilation. The original line
# endif !!ice_embedding yding
should be reduced to [cci]# endif[/cci].

Build FVCOM4.1 for MPI

In the directory of [cci]FVCOM_source[/cci], build [cci]FVCOM4.1[/cci] for [cci]MPI[/cci] as follows:
$ make
The build takes some time, and the executable [cci]fvcom[/cci] should be created.

Executing test case of Estuary as batch job

An [cci]MPI[/cci] run may only work as a batch job (not interactively). First move to [cci]FVCOM4.1/Examples/Estuary/run[/cci], open [cci]tst_run.nml[/cci], and change the number of rivers to [cci]0[/cci] (originally 3), which is a bug in the sample namelist.
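This edit can also be scripted. The parameter name [cci]RIVER_NUMBER[/cci] below is an assumption about the namelist's spelling — check [cci]tst_run.nml[/cci] for the actual name before relying on it. The demo works on a scratch copy:

```shell
# Scratch namelist fragment; group and parameter names are assumed examples
printf ' &NML_RIVER_TYPE\n RIVER_NUMBER = 3,\n /\n' > tst_run_demo.nml
# Set the river count to 0 regardless of its previous value
sed -i 's/\(RIVER_NUMBER *= *\)[0-9][0-9]*/\10/' tst_run_demo.nml
grep RIVER_NUMBER tst_run_demo.nml
```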
Copy the executable [cci]fvcom[/cci] to [cci]FVCOM4.1/Examples/Estuary/run[/cci], move into that directory, and create a script named, e.g., [cci][/cci] containing the following:
#PJM -L "rscunit=ito-a"
#PJM -L "rscgrp=ito-ss-dbg"
#PJM -L "vnode=1"
#PJM -L "vnode-core=36"
#PJM -L "elapse=10:00"
#PJM -j

module load intel/2018

NUM_CORES=36
NUM_PROCS=36
export I_MPI_FABRICS=shm:ofa

export I_MPI_HYDRA_BOOTSTRAP_EXEC=/bin/pjrsh

mpiexec.hydra -n $NUM_PROCS ./fvcom --casename=tst
This script is for [cci]FLAT MPI[/cci] and assumes one node with 36 cores/node. The number of cores per node should be specified in [cci]NUM_CORES[/cci], and [cci]NUM_PROCS[/cci] denotes [cci]the number of nodes × the number of cores per node[/cci]. Further information is given in this page (in Japanese).
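For example, extending the job to two nodes under the same convention gives the following (plain shell arithmetic, runnable anywhere; the node count is an illustrative assumption):

```shell
# NUM_PROCS = number of nodes x cores per node (flat MPI, one rank per core)
NUM_NODES=2      # would correspond to "vnode=2" in the PJM header
NUM_CORES=36     # cores per node on ITO Subsystem-A
NUM_PROCS=$((NUM_NODES * NUM_CORES))
echo "$NUM_PROCS"   # prints 72
```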
To submit the batch job, use the command of [cci]pjsub[/cci] as follows:
$ pjsub
To check the status of the job, invoke the [cci]pjstat[/cci] command in the terminal. Further information about batch jobs is given in this page and in this page (in Japanese).

