UCL / swiftsim

Modern astrophysics and cosmology particle-based code. Mirror of gitlab developments at https://gitlab.cosma.dur.ac.uk/swift/swiftsim
http://www.swiftsim.com
GNU Lesser General Public License v3.0

Settle on standard environment for Myriad #1

Open DanGiles opened 1 year ago

DanGiles commented 1 year ago

Test out various configurations.

Need to ensure the following are installed/enabled (via ./configure).

Run the test case (examples/EAGLE_low_z/EAGLE_25) with multiple MPI ranks; see the sketch below.
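
A minimal sketch of that workflow, assuming the module environment discussed in the comments below; the library prefixes, rank count, and thread count are placeholders, and getIC.sh is the usual way the SWIFT examples fetch their initial conditions:

./autogen.sh
./configure --with-tbbmalloc --with-gsl=<gsl-prefix> --with-metis=<metis-prefix>
make -j 8
cd examples/EAGLE_low_z/EAGLE_25
./getIC.sh                                   # download the EAGLE_25 initial conditions
mpirun -n 2 ../../../swift_mpi --cosmology --hydro --self-gravity --stars \
       --threads=18 eagle_25.yml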

themkots commented 1 year ago

For a first attempt to standardise the build process:

module purge
module load gcc-libs/4.9.2 cmake/3.21.1 flex/2.5.39 git/2.32.0 gerun nano/2.4.2 \
            dos2unix/7.3 giflib/5.1.1 userscripts/1.4.0 default-modules/2018 \
            python/3.8.0 compilers/intel/2018/update3 mpi/intel/2018/update3/intel \
            fftw/3.3.4-impi/intel-2017-update1 numactl/2.0.12 hwloc/1.11.12 \
            parmetis/4.0.3/intel-2015-update2 gsl/2.4/intel-2017 metis/5.1.0/intel-2018 \
            hdf/5-1.10.2-impi/intel-2018
make clean
./autogen.sh
./configure --with-tbbmalloc --with-gsl=/shared/ucl/apps/gsl/2.4/intel-2017 --with-metis=/shared/ucl/apps/metis/5.1.0/intel-2018

[ucaseko@login12.myriad:49 swiftsim]$ icc --version
icc (ICC) 18.0.3 20180410
Copyright (C) 1985-2018 Intel Corporation. All rights reserved.

$ mpicc --version
icc (ICC) 18.0.3 20180410
Copyright (C) 1985-2018 Intel Corporation. All rights reserved.

The above ./configure command produces the following configuration: intel-18.0.3-config.txt

make then builds swiftsim with no errors.
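
As an optional follow-up check (an assumption, not something run above): since the build is autotools-based, the unit tests should be runnable with the standard target.

make check   # builds and runs the test suite, if it was enabled at configure time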

[ucaseko@login12.myriad:49 swiftsim]$ ./swift

 Welcome to the cosmological hydrodynamical code
    ______       _________________
   / ___/ |     / /  _/ ___/_  __/
   \__ \| | /| / // // /_   / /
  ___/ /| |/ |/ // // __/  / /
 /____/ |__/|__/___/_/    /_/
 SPH With Inter-dependent Fine-grained Tasking

Version : 0.9.0
Revision: v0.9.0-1296-gdfc72358, Branch: master, Date: 2023-02-19 22:30:33 +0000
Webpage : www.swiftsim.com

Config. options: '--with-tbbmalloc --with-gsl=/shared/ucl/apps/gsl/2.4/intel-2017 --with-metis=/shared/ucl/apps/metis/5.1.0/intel-2018'

Compiler: ICC, Version: 18.0.20180410
CFLAGS  : '-O3 -ansi-alias -xCORE-AVX512 -pthread -fno-builtin-malloc -fno-builtin-calloc -fno-builtin-realloc -fno-builtin-free -w2 -Wunused-variable -Wshadow -Werror -Wstrict-prototypes'

HDF5 library version : 1.10.2
FFTW library version : 3.x (details not available)
GSL library version  : 2.4

Usage: swift [options] [[--] param-file]
   or: swift [options] param-file
   or: swift_mpi [options] [[--] param-file]
   or: swift_mpi [options] param-file
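
For reference, a hedged single-node (non-MPI) run of the same example, using the physics flags from the job script further down in this thread (the thread count is a placeholder):

cd examples/EAGLE_low_z/EAGLE_25
../../../swift --cosmology --hydro --self-gravity --stars --threads=36 eagle_25.yml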

themkots commented 1 year ago

hwloc link: https://www.open-mpi.org/projects/hwloc/doc/v2.4.1/a00366.php
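
Assuming the hwloc/1.11.12 module above puts the standard hwloc command-line tools on the PATH, the node topology and the current binding can be checked with, for example:

hwloc-ls           # text summary of sockets, cores, caches and NUMA nodes
hwloc-bind --get   # report the CPU set the calling process is bound to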

themkots commented 1 year ago

CPU affinity: https://git.ecdf.ed.ac.uk/dmckain/xthi
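
A hedged sketch of using xthi to verify pinning under the same Intel MPI environment (the file name and build line are assumptions; check the repository for its own build instructions):

git clone https://git.ecdf.ed.ac.uk/dmckain/xthi
cd xthi
mpicc -qopenmp -o xthi xthi.c           # hybrid MPI+OpenMP affinity reporter
OMP_NUM_THREADS=4 mpirun -n 2 ./xthi    # each thread prints its rank, thread id and core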

DanGiles commented 1 year ago

The environment and job script I have used to run on Myriad:

[ucakdpg@login13 swiftsim]$ module list
Currently Loaded Modulefiles:
  1) gcc-libs/4.9.2                       8) screen/4.9.0                        15) tmux/3.3a                           22) fftw/3.3.4-impi/intel-2017-update1
  2) cmake/3.21.1                         9) gerun                               16) mrxvt/0.5.4                         23) numactl/2.0.12
  3) flex/2.5.39                         10) nano/2.4.2                          17) userscripts/1.4.0                   24) hwloc/1.11.12
  4) git/2.32.0                          11) nedit/5.6-aug15                     18) rcps-core/1.0.0                     25) parmetis/4.0.3/intel-2015-update2
  5) apr/1.7.0                           12) dos2unix/7.3                        19) compilers/intel/2018/update3        26) gsl/2.4/intel-2017
  6) apr-util/1.6.1                      13) giflib/5.1.1                        20) mpi/intel/2018/update3/intel        27) metis/5.1.0/intel-2018
  7) subversion/1.14.1                   14) emacs/28.1                          21) default-modules/2018                28) hdf/5-1.10.2-impi/intel-2018

MPI job script:

#!/bin/bash -l

# Batch script to run an MPI parallel job under SGE with Intel MPI.

# Request two hours of wallclock time (format hours:minutes:seconds).
#$ -l h_rt=02:00:0

# Request 1 gigabyte of RAM per process (must be an integer followed by M, G, or T)
#$ -l mem=1G

# Request 15 gigabyte of TMPDIR space per node 
# (default is 10 GB - remove if cluster is diskless)
#$ -l tmpfs=15G

# Set the name of the job.
#$ -N EAGLE_25

# Request 36 slots: 2 MPI ranks with 18 OpenMP threads each.
#$ -pe mpi 36
export OMP_NUM_THREADS=18

# Set the working directory to somewhere in your scratch space.
# Replace "<your_UCL_id>" with your UCL user ID :
#$ -wd /home/ucakdpg/Scratch/swiftsim/examples/EAGLE_low_z/EAGLE_25
source /home/ucakdpg/Scratch/swiftsim/source_file
# Run our MPI job.  GERun is a wrapper that launches MPI jobs on our clusters.
#cat /proc/cpuinfo
#mpirun -n 2 ../../../swift_mpi  --cosmology --hydro --self-gravity --stars --threads=18 --pin eagle_25.yml 2>&1 | tee output.log
gerun /home/ucakdpg/Scratch/swiftsim/swift_mpi --cosmology --hydro --self-gravity --stars --threads=18 eagle_25.yml 2>&1 | tee output.log
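
For reference, the slot/thread arithmetic: 36 slots with OMP_NUM_THREADS=18 corresponds to 2 MPI ranks, which matches the commented-out mpirun line above. If gerun does not derive that split itself, a hedged explicit alternative would be:

NRANKS=$(( NSLOTS / OMP_NUM_THREADS ))   # 36 / 18 = 2 ranks; NSLOTS is set by SGE
mpirun -n "$NRANKS" /home/ucakdpg/Scratch/swiftsim/swift_mpi \
    --cosmology --hydro --self-gravity --stars --threads="$OMP_NUM_THREADS" \
    eagle_25.yml 2>&1 | tee output.log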

The source_file is as follows:

module add fftw/3.3.4-impi/intel-2017-update1
module add numactl/2.0.12
module add hwloc/1.11.12
module add parmetis/4.0.3/intel-2015-update2
module add gsl/2.4/intel-2017
module add metis/5.1.0/intel-2018
module add hdf/5-1.10.2-impi/intel-2018
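
Note that source_file only loads the library modules; the Intel compiler and MPI modules (compilers/intel/2018/update3 and mpi/intel/2018/update3/intel) are presumably already present from the default login environment, as in the module list above. The job script picks it up via the source line before launching swift_mpi.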