OpenMPI
To set up the environment for OpenMPI, load the corresponding module. Depending on the version, you might have to load additional modules before you can load OpenMPI:
module load GCC/11.3.0
module load OpenMPI/4.1.4
Available OpenMPI versions can be listed with module spider OpenMPI. Specifying a version will list the needed modules: module spider OpenMPI/4.1.4
This will set environment variables for further usage. The list of variables can be obtained with
module show OpenMPI/4.1.4
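As a quick sanity check, you can also inspect which of these variables are actually set in your session. A minimal sketch in Python (the variable names are the ones documented on this page; the helper function itself is just an illustration, not part of the module system):

```python
import os

def mpi_env_report(env=os.environ):
    """Return the MPI-related variables exported by the module system.

    Variable names are the ones documented above; anything not set
    (e.g. before loading the OpenMPI module) maps to '<not set>'.
    """
    names = ("MPICC", "MPIFC", "MPICXX", "MPIEXEC", "FLAGS_MPI_BATCH")
    return {n: env.get(n, "<not set>") for n in names}

if __name__ == "__main__":
    for name, value in mpi_env_report().items():
        print(f"{name} = {value}")
```

If a variable shows up as "<not set>", the OpenMPI module (and its prerequisite compiler module) has most likely not been loaded yet.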
The compiler drivers are mpicc for C, mpif77 and mpif90 (or, since v1.7, mpifort) for Fortran, and mpicxx and mpiCC for C++. MPI programs are started with mpiexec.
We strongly recommend using the environment variables $MPIFC, $MPICC, $MPICXX and $MPIEXEC set by the module system, in particular because the compiler driver variables always correspond to the most recently loaded compiler module. Example:
$MPIFC -c prog.f90
$MPIFC prog.o -o prog.exe
$MPIEXEC -n 4 ./prog.exe
To start your OpenMPI job, please use the $MPIEXEC and $FLAGS_MPI_BATCH environment variables:
$MPIEXEC $FLAGS_MPI_BATCH python3 my_mpi4py_script.py
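The contents of my_mpi4py_script.py are not shown above, so the following is a hypothetical minimal mpi4py program matching that name (with a serial fallback so the sketch also runs where mpi4py is not installed):

```python
# Hypothetical contents of my_mpi4py_script.py -- a minimal sketch, assuming
# mpi4py is available on the cluster. Each rank reports its rank and the
# total number of ranks.

def main():
    try:
        from mpi4py import MPI  # provided by the cluster's Python/MPI stack
        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()
    except ImportError:
        # No mpi4py available: behave like a single-rank run, which is
        # convenient for quick local testing.
        rank, size = 0, 1
    return rank, size

if __name__ == "__main__":
    rank, size = main()
    print(f"Hello from rank {rank} of {size}")
```

Launched with $MPIEXEC $FLAGS_MPI_BATCH python3 my_mpi4py_script.py as above, each MPI rank prints its own line.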
Further information / Known issues
Your application crashes from time to time on the new CLAIX nodes, but is known to run well on the old Bull nodes? Try adding
-x PSM2_KASSIST_MODE=none
to the command line:
$MPIEXEC -x PSM2_KASSIST_MODE=none $FLAGS_MPI_BATCH ./a.out
OpenMPI is not ABI compatible between major versions (e.g. 1.10.x and 3.1.0), and sometimes not even between minor releases. Starting an old binary with a newer (major) version of OpenMPI loaded results in undefined behaviour.