
what version of external library is required or advised for Fluidity/4.1.18? #332

Closed LeiZ2 closed 1 year ago

LeiZ2 commented 3 years ago

Hi, I am new to building Fluidity. I am trying to build the latest release, Fluidity/4.1.18, as a private module on a supercomputer cluster. The procedure is as follows: a) compile the external libraries as individual modules first, e.g. openmpi, gmsh, vtk, petsc, zoltan, etc.; b) compile fluidity/4.1.18 against the modules installed and loaded in a).

For step a), it is unclear to me which version of each library I should build for Fluidity/4.1.18. The manual (https://github.com/FluidityProject/fluidity/blob/main/manual/external_libraries.tex) does introduce the external libraries, but it may be out of date. There is also some discussion in #323 and #327, but it does not seem conclusive and does not cover all the external libraries Fluidity depends on. May I ask for a full list of external libraries and the recommended versions to compile for Fluidity/4.1.18? Many thanks!
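A minimal sketch of that two-step workflow, assuming the privately built modulefiles live under /home/$PROJECT_NAME/modules; the module names and versions below are placeholders for illustration, not Fluidity requirements:

    # Make privately built modulefiles visible (some sites use 'module load use.own' instead)
    module use /home/$PROJECT_NAME/modules
    # Load the external libraries built in step a)
    module load openmpi/4.1.0 petsc/3.14 zoltan/3.83 vtk/8.2.0 gmsh/4.4.1
    module list    # confirm everything resolved before configuring Fluidity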

Patol75 commented 3 years ago

Hi,

I can successfully build Fluidity on a cluster that runs CentOS 8.4.2105 using the module files and command listed below. I hope that can be of help.

Currently Loaded Modulefiles:
 1) pbs            3) intel-mkl/2020.3.304   5) udunits/2.2.26   7) vtk/8.2.0                9) cmake/3.18.2  11) fftw3/3.3.8   13) zoltan/3.83  
 2) hdf5/1.12.1p   4) python3/3.9.2          6) netcdf/4.8.0     8) openmpi/4.1.0(default)  10) gmsh/4.4.1    12) petsc/3.12.2  14) szip/2.1.1

./configure --enable-2d-adaptivity --with-blas=-L${INTEL_MKL_BASE}/lib/intel64 --with-lapack-dir=${INTEL_MKL_BASE}/lib/intel64 LDFLAGS="-lmkl_rt -lpthread" && make clean && make -j && make -j fltools
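As a quick sanity check after the build - a hedged suggestion, assuming an in-tree build where the executable lands in bin/ - one can confirm the binary really linked against MKL rather than a reference BLAS:

    ldd bin/fluidity | grep -i mkl    # should report libmkl_rt.so if MKL was picked up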

LeiZ2 commented 3 years ago

@Patol75 Many thanks for sharing your experience. It is very useful as a reference after comparing the versions you have used against those advised in the out-of-date manual (attached below). A lot of time saved!

[screenshot: the manual's table of suggested external-library versions]

I noticed that you only marked openmpi/4.1.0 as the default module version for Fluidity/4.1.18 - does this imply that building Fluidity is less sensitive to the other modules' versions, e.g. petsc?

Patol75 commented 3 years ago

(default) corresponds to the default version on the cluster I am using - it is not related to Fluidity. Additionally, I compiled Fluidity's main branch, but it is very similar to 4.1.18. Regarding PETSc, as you have already found out, you can use versions more recent than 3.12, but you may need some tweaks.
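A minimal sketch of building such a newer PETSc as a private module, assuming an in-source build; the prefix, the PETSC_ARCH name and the --download-hypre flag are illustrative assumptions, not a Fluidity-verified recipe:

    # Configure an optimised PETSc under a private prefix (placeholder paths)
    ./configure PETSC_DIR=$PWD PETSC_ARCH=linux-gnu-opt \
        --prefix=/home/$PROJECT_NAME/modules/petsc/3.14 \
        --with-debugging=0 \
        --download-hypre    # assumed here; pulls in hypre as a preconditioner package
    make PETSC_DIR=$PWD PETSC_ARCH=linux-gnu-opt all
    make PETSC_DIR=$PWD PETSC_ARCH=linux-gnu-opt install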

stephankramer commented 3 years ago

PETSc versions 3.8-3.14 are indeed expected to work (if you include my just-merged #327).
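A quick way to check which PETSc version a loaded module actually provides, assuming the module sets PETSC_DIR as most do:

    grep 'define PETSC_VERSION_' $PETSC_DIR/include/petscversion.h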

@tmbgreaves : the suggested versions in the manual look a little out-of-date indeed and might not be what we're testing against at the moment. If so, would you mind having a look at updating these?

LeiZ2 commented 3 years ago

@Patol75 Many thanks for explaining. I have managed to build openmpi/4.1.0, petsc/3.14 and zoltan/3.83, but I got stuck compiling vtk/8.2.0 with gcc on a cluster that runs Red Hat 4.8.5-39. The log of the make error is attached: out_make_error.txt

I wonder which modules are needed to compile vtk/8.2.0 in step 4), and which options you toggled on in step 6)? For reference, my attempt was:

vtk/8.2.0_gcc (5/Aug/2021)

1) cd /home/$PROJECT_NAME/install
2) Download VTK-8.2.0.tar.gz
3) Untar, cd to the untarred directory.
4) module purge
   module load cmake/3.14.3
   module load compiler/gnu/7/3.0
   module load python-numpy/1.14.5   (system python/2.7)
   module load use.own               (to load openmpi/4.1.0_gcc therein)
   module load openmpi/4.1.0_gcc
5) mkdir build
   cd build
   mkdir -p /home/$PROJECT_NAME/modules/vtk/8.2.0_gcc/
6) ccmake -DCMAKE_INSTALL_PREFIX=/home/$PROJECT_NAME/modules/vtk/8.2.0_gcc .. -Wno-dev
   c to configure
     VTK_Group_MPI = ON
     VTK_WRAP_PYTHON = ON
   t to toggle advanced mode
     Module_vtkFiltersParallelMPI = ON
     Module_vtkParallelMPI = ON
   c to process configuration files (twice)
   g to generate build files
7) make -j4 | tee out_make_error.txt    # stopped making at [25%], error 2
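For reference, the interactive ccmake session in step 6) can be driven non-interactively with the same options, which makes the configuration reproducible and keeps a log of any failure (an untested sketch using the paths above):

    cd build
    cmake .. -Wno-dev \
        -DCMAKE_INSTALL_PREFIX=/home/$PROJECT_NAME/modules/vtk/8.2.0_gcc \
        -DVTK_Group_MPI=ON \
        -DVTK_WRAP_PYTHON=ON \
        -DModule_vtkFiltersParallelMPI=ON \
        -DModule_vtkParallelMPI=ON
    make -j4 2>&1 | tee out_make_error.txt    # capture stderr too, so the [25%] error is logged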

tmbgreaves commented 3 years ago

@stephankramer - agreed, good catch; they're massively out of date and need fixing, along with a general overhaul of the manual's supporting-software section. Time has not been on my side for Fluidity work over the past month or two, but I'll put that on my list, likely getting to it in September now.

tmbgreaves commented 3 years ago

@LeiZ2 - Apologies for coming to this late. I'd suggest having a look at the buildscripts repo for a recent build method on this - the deploy script for the UCL Young cluster might be a good one:

https://github.com/FluidityProject/buildscripts/blob/main/uk/ac/ucl/rc/young/build.sh

The Young cluster already provides compilers/MPI/Python plus some common supporting packages, but the script builds most of the main supporting libraries itself.

Note that, all being well, I'll update the script in the next day or two to switch out the local HDF5 build for a system module, so referencing the script as it stands before that commit may be a more complete option, namely:

https://github.com/FluidityProject/buildscripts/blob/670bf7f0d8e5b894b6523bbf6af4dfbf4e8429ee/uk/ac/ucl/rc/young/build.sh
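For completeness, that pinned revision of the script can be fetched with plain git (nothing Fluidity-specific here):

    git clone https://github.com/FluidityProject/buildscripts
    cd buildscripts
    # materialise the script as it was at the referenced commit
    git checkout 670bf7f0d8e5b894b6523bbf6af4dfbf4e8429ee -- uk/ac/ucl/rc/young/build.sh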

Patol75 commented 3 years ago

@LeiZ2 I am not too sure, to be honest. The cluster I am using provides VTK, so there is no need to compile it. Nonetheless, I remember that, the last time I compiled VTK, I followed this guide. I hope that can help you as well.

LeiZ2 commented 3 years ago

@tmbgreaves Many thanks for the suggestion. It helps a lot! I wonder if 'module unload default-modules apr gcc-libs' is equivalent to 'module purge' on Young, i.e. are there still any other necessary modules loaded after this command?

tmbgreaves commented 3 years ago

Thanks @LeiZ2 ! Yes, alas, I think there are, though I'll go back and check. It's a fairly delicate operation to leave some of the infrastructure modules in place but switch out the compiler.
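One generic way to answer that empirically on a modules-based system (not Young-specific, just standard module usage):

    module unload default-modules apr gcc-libs
    module list    # anything still listed here survived the unload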

stephankramer commented 1 year ago

Closing due to inactivity