FluidityProject / fluidity

Fluidity
http://fluidity-project.org

Scripts for building fluidity from scratch #272

Closed ritzvik closed 4 years ago

ritzvik commented 4 years ago

I've noticed that fluidity uses CI for testing, but I can't really find the commands that install the dependencies in the Jenkinsfile. So I guess the dependencies are already there in the docker image fluidity/baseimages:bionic-python3.

It would be really helpful if the steps to build the docker image (i.e. the Dockerfile) could be provided.

I can't simply install a pre-built fluidity, as we are planning to tweak it as part of our college project.

Thanks, Ritwik

stephankramer commented 4 years ago

The Dockerfile for fluidity/baseimages:bionic is available as docker/Dockerfile.bionic in the fluidity source. IIRC (but @tmbgreaves may confirm), the -python3 version is only a small tweak where python is rewired to point to python3. This is no longer necessary, as fluidity's configure will now always use python3 (it tries python first, but if that's still python 2, it uses python3 instead).
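
So if you want to rebuild the base image yourself, something along these lines should do it (the image tag here is just an example):

    # build the bionic base image from the Dockerfile shipped in the fluidity source
    git clone https://github.com/FluidityProject/fluidity.git
    cd fluidity
    docker build -t fluidity-baseimage:bionic -f docker/Dockerfile.bionic .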

The dependencies are basically pulled in by installing the fluidity-dev package, which is just an empty .deb package that depends on the build requirements:

$ apt show -a fluidity-dev
Package: fluidity-dev
Version: 1.0.88bionic1
Status: deinstall ok config-files
Priority: extra
Section: contrib/devel
Maintainer: Tim Greaves <tim.greaves@imperial.ac.uk>
Installed-Size: 10.2 kB
Depends: autoconf, automake, cmake, g++ (>= 4:4.6), gfortran (>= 4:4.6), ghostscript, git, gmsh, google-perftools, libarpack2-dev, libblas-dev, libboost-python1.65.1, libexodusii-dev, libglib2.0-dev, libgmp10, liblapack-dev, libmpfr6, libnetcdf-dev, libnetcdff-dev, libnetpbm10-dev, libpetsc3.8.3-dev, libqhull-dev, libspud-dev, libsupermesh-dev, libtool, libudunits2-dev, libvtk7-dev, libxml2-utils, libzoltan (>= 3.82), libzoltan-dev (>= 3.82), m4, openmpi-bin, python3-dev, python3-future, python3-h5py, python3-junit.xml, python3-lxml, python3-matplotlib, python3-meshio, python3-numpy, python3-scipy, python3-sympy, python3-uncertainties, python3-vtk7, spud-diamond, tcsh, time, trang, transfig, triangle-bin, valgrind
Download-Size: unknown
APT-Sources: /var/lib/dpkg/status
Description: This is an empty package that depends on and thus provides all
  packages needed to build fluidity and run all the tests in the main fluidity
  repository. Some required packages are taken from the local repository,
  some from the central Debian/Ubuntu repository.

There is a description of the build requirements, and of which versions are supported, in the manual as well.
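
The meta-package itself comes from the project's package repository; from memory, installing it looks roughly like this (PPA name quoted from memory, so do check the manual for the authoritative instructions):

    # add the fluidity package repository and pull in all build dependencies
    sudo add-apt-repository ppa:fluidity-core/ppa
    sudo apt-get update
    sudo apt-get install fluidity-dev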

ritzvik commented 4 years ago

@stephankramer Thanks for the reply. Actually, I need to make some changes to PETSc and run fluidity on top of that. So I guess I'll install fluidity-dev and try to replace its PETSc with my custom PETSc.

Thanks again for the clarification. I'll make some noise here if I get stuck. Closing for now.

ritzvik commented 4 years ago

Hi @stephankramer

Sorry to be opening this again. I actually intend to make fluidity work with a GPU-enabled PETSc. I thought that I could maybe set up my custom PETSc, pull the rest of the libraries in from fluidity-dev, and just change the environment variable $PETSC_DIR to point at my PETSc, but it turned out to be more complex than that.
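
Roughly, what I had in mind was something like the following (the paths and arch name are just placeholders for my local build):

    # point fluidity's build at a locally-built PETSc (example paths)
    export PETSC_DIR=$HOME/petsc
    export PETSC_ARCH=arch-cuda-opt
    ./configure --enable-2d-adaptivity
    make -j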

As the manual is quite outdated, I have the following questions.

Thanks for any help, Ritwik

gnikit commented 4 years ago

@ritzvik This might be of some use: there is a PR (#267) that is ready to merge, which also includes some comments on up to which PETSc commit Fluidity will compile against.

The following is what I used to configure PETSc, mostly based on what the PPA uses (the OPT flags are redundant):

./configure \
  --enable-debug=0 --with-debugging=0 --with-petsc-arch=arch-cxx-opt \
  --COPTFLAGS="-O3 -g" --CXXOPTFLAGS="-O3 -g" --FOPTFLAGS="-O3 -g" \
  --with-shared-libraries \
  --with-clanguage=C++ --with-fortran-bindings=1 \
  --download-fblaslapack --download-scalapack \
  --download-metis --download-parmetis --download-ptscotch --download-zoltan \
  --download-mumps --download-netcdf --download-zlib --download-hdf5 \
  --download-hypre --download-bison --download-ctetgen \
  --download-sowing --download-suitesparse

However, I don't know how well GPU configured PETSc plays with the other external dependencies.
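
If you do go the GPU route, PETSc's configure has a CUDA switch that can be added on top of the options above; I haven't tested that combination myself, so treat it purely as a pointer:

    # extra option to append to the configure line above for a CUDA-enabled build (untested here)
    --with-cuda=1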

ritzvik commented 4 years ago

@gnikit The information is immensely helpful. Thanks!!

ritzvik commented 4 years ago

Hi @gnikit @stephankramer ,

I was able to build fluidity on my custom PETSc, but some of the test cases give a weird warning which I'm unable to comprehend, and then get stuck. In the end, I have to press ^C to terminate. It would be helpful to know what's behind it.

Singularity fluidity_b1.simg:~/MTP/Hybrid-PDE-solver/even_sem/fluidity_gpu/scratch_fluidity/tests/petsc_readnsolve> make 
rm -f standing_wave.stat
rm -f fluidity.err* fluidity.log*
rm -f *.msh *.halo
rm -rf velocity_logs/
rm -f matrixdump.info
rm -f petsc_readnsolve.log* petsc_readnsolve.err*
rm -f velocity_matrixdump pressure_matrixdump
rm -rf standing_wave_[0-9]*
rm -rf *flredecomp*
gmsh -3 -bin src/standing_wave.geo -o standing_wave.msh
Info    : Running 'gmsh -3 -bin src/standing_wave.geo -o standing_wave.msh' [Gmsh 3.0.6, 1 node, max. 1 thread]
Info    : Started on Sun Jun 21 14:48:40 2020
Info    : Reading 'src/standing_wave.geo'...
Info    : Done reading 'src/standing_wave.geo'
Info    : Finalized high order topology of periodic connections
Info    : Meshing 1D...
Info    : Meshing curve 1 (extruded)
Info    : Meshing curve 2 (extruded)
Info    : Meshing curve 3 (extruded)
Info    : Meshing curve 4 (extruded)
Info    : Meshing curve 7 (extruded)
Info    : Meshing curve 8 (extruded)
Info    : Meshing curve 9 (extruded)
Info    : Meshing curve 10 (extruded)
Info    : Meshing curve 12 (extruded)
Info    : Meshing curve 13 (extruded)
Info    : Meshing curve 17 (extruded)
Info    : Meshing curve 21 (extruded)
Info    : Done meshing 1D (0.000159 s)
Info    : Meshing 2D...
Info    : Meshing surface 5 (extruded)
Info    : Meshing surface 14 (extruded)
Info    : Meshing surface 18 (extruded)
Info    : Meshing surface 22 (extruded)
Info    : Meshing surface 26 (extruded)
Info    : Meshing surface 27 (extruded)
Info    : Done meshing 2D (0.000425 s)
Info    : Meshing 3D...
Info    : Meshing volume 1 (extruded)
Info    : Subdividing extruded mesh
Info    : Swapping 0
Info    : Remeshing surface 5
Info    : Meshing surface 5 (extruded)
Info    : Remeshing surface 14
Info    : Meshing surface 14 (extruded)
Info    : Remeshing surface 18
Info    : Meshing surface 18 (extruded)
Info    : Remeshing surface 22
Info    : Meshing surface 22 (extruded)
Info    : Remeshing surface 26
Info    : Meshing surface 26 (extruded)
Info    : Done meshing 3D (0.000864 s)
Info    : Optimizing 3D mesh...
Info    : Done optimizing 3D mesh (9e-06 s)
Info    : 44 vertices 200 elements
Info    : Writing 'standing_wave.msh'...
Info    : Done writing 'standing_wave.msh'
Info    : Stopped on Sun Jun 21 14:48:40 2020

Singularity fluidity_b1.simg:~/MTP/Hybrid-PDE-solver/even_sem/fluidity_gpu/scratch_fluidity/tests/petsc_readnsolve> ls -l
total 60
-rw-r--r-- 1 rivu rivu   668 Jun 21 05:24 Makefile
-rw-r--r-- 1 rivu rivu  5127 Jun 21 05:24 petsc_readnsolve.xml
drwxr-xr-x 2 rivu rivu  4096 Jun 21 05:24 src
-rw-r--r-- 1 rivu rivu 11381 Jun 21 05:24 standing_wave.flml
-rw-r--r-- 1 rivu rivu  6745 Jun 21 14:48 standing_wave.msh
-rw-r--r-- 1 rivu rivu 11378 Jun 21 05:24 standing_wave_failing_pressure.flml
-rw-r--r-- 1 rivu rivu 11379 Jun 21 05:24 standing_wave_failing_velocity.flml

Singularity fluidity_b1.simg:~/MTP/Hybrid-PDE-solver/even_sem/fluidity_gpu/scratch_fluidity/tests/petsc_readnsolve> ../../tools/testharness.py --file petsc_readnsolve.xml 
fluidity
--------------------------------------------------------------------------------
which fluidity: /home/rivu/MTP/Hybrid-PDE-solver/even_sem/fluidity_gpu/scratch_fluidity/bin/fluidity
Revision: master:c6c026d649036753807f54ba7fb6debb77f12aea
Compile date: Jun 21 2020 06:28:57
OpenMP Support          no
Adaptivity support      yes
2D adaptivity support       yes
3D MBA support          no
CGAL support            no
MPI support         yes
Double precision        yes
NetCDF support          yes
Signal handling support     yes
Stream I/O support      yes
PETSc support           yes
Hypre support           yes
ARPACK support          no
Python support          yes
Numpy support           yes
VTK support         yes
Zoltan support          yes
Memory diagnostics      no
FEMDEM support          no
Hyperlight support      no
libsupermesh support        no
 --------------------------------------------------------------------------------

rm -f standing_wave.stat
rm -f fluidity.err* fluidity.log*
rm -f *.msh *.halo
rm -rf velocity_logs/
rm -f matrixdump.info
rm -f petsc_readnsolve.log* petsc_readnsolve.err*
rm -f velocity_matrixdump pressure_matrixdump
rm -rf standing_wave_[0-9]*
rm -rf *flredecomp*
gmsh -3 -bin src/standing_wave.geo -o standing_wave.msh
Info    : Running 'gmsh -3 -bin src/standing_wave.geo -o standing_wave.msh' [Gmsh 3.0.6, 1 node, max. 1 thread]
Info    : Started on Sun Jun 21 14:50:13 2020
Info    : Reading 'src/standing_wave.geo'...
Info    : Done reading 'src/standing_wave.geo'
Info    : Finalized high order topology of periodic connections
Info    : Meshing 1D...
Info    : Meshing curve 1 (extruded)
Info    : Meshing curve 2 (extruded)
Info    : Meshing curve 3 (extruded)
Info    : Meshing curve 4 (extruded)
Info    : Meshing curve 7 (extruded)
Info    : Meshing curve 8 (extruded)
Info    : Meshing curve 9 (extruded)
Info    : Meshing curve 10 (extruded)
Info    : Meshing curve 12 (extruded)
Info    : Meshing curve 13 (extruded)
Info    : Meshing curve 17 (extruded)
Info    : Meshing curve 21 (extruded)
Info    : Done meshing 1D (0.000156 s)
Info    : Meshing 2D...
Info    : Meshing surface 5 (extruded)
Info    : Meshing surface 14 (extruded)
Info    : Meshing surface 18 (extruded)
Info    : Meshing surface 22 (extruded)
Info    : Meshing surface 26 (extruded)
Info    : Meshing surface 27 (extruded)
Info    : Done meshing 2D (0.000377 s)
Info    : Meshing 3D...
Info    : Meshing volume 1 (extruded)
Info    : Subdividing extruded mesh
Info    : Swapping 0
Info    : Remeshing surface 5
Info    : Meshing surface 5 (extruded)
Info    : Remeshing surface 14
Info    : Meshing surface 14 (extruded)
Info    : Remeshing surface 18
Info    : Meshing surface 18 (extruded)
Info    : Remeshing surface 22
Info    : Meshing surface 22 (extruded)
Info    : Remeshing surface 26
Info    : Meshing surface 26 (extruded)
Info    : Done meshing 3D (0.000629 s)
Info    : Optimizing 3D mesh...
Info    : Done optimizing 3D mesh (6e-06 s)
Info    : 44 vertices 200 elements
Info    : Writing 'standing_wave.msh'...
Info    : Done writing 'standing_wave.msh'
Info    : Stopped on Sun Jun 21 14:50:13 2020
[warn] Epoll MOD(1) on fd 14 failed.  Old events were 6; read change was 0 (none); write change was 2 (del): Bad file descriptor
[warn] Epoll MOD(4) on fd 14 failed.  Old events were 6; read change was 2 (del); write change was 0 (none): Bad file descriptor
[warn] Epoll MOD(1) on fd 14 failed.  Old events were 6; read change was 0 (none); write change was 2 (del): Bad file descriptor
[warn] Epoll MOD(4) on fd 14 failed.  Old events were 6; read change was 2 (del); write change was 0 (none): Bad file descriptor
[warn] Epoll MOD(1) on fd 14 failed.  Old events were 6; read change was 0 (none); write change was 2 (del): Bad file descriptor
[warn] Epoll MOD(4) on fd 14 failed.  Old events were 6; read change was 2 (del); write change was 0 (none): Bad file descriptor
[warn] Epoll MOD(1) on fd 14 failed.  Old events were 6; read change was 0 (none); write change was 2 (del): Bad file descriptor
[warn] Epoll MOD(4) on fd 14 failed.  Old events were 6; read change was 2 (del); write change was 0 (none): Bad file descriptor
^Cpetsc_readnsolve: Running
petsc_readnsolve: Calling 'make input':
petsc_readnsolve: mpiexec -n 4 ../../bin/flredecomp -i 1 -o 4 -v -l standing_wave_failing_velocity standing_wave_failing_velocity_flredecomp;
mpiexec -n 4 ../../bin/flredecomp -i 1 -o 4 -v -l standing_wave_failing_pressure standing_wave_failing_pressure_flredecomp;
mpiexec -n 4 ../../bin/flredecomp -i 1 -o 4 -v -l standing_wave standing_wave_flredecomp;
make mesh_rename
rm -f *failing_velocity_flredecomp_CoordinateMesh* *failing_pressure_flredecomp_CoordinateMesh*
( mpiexec -n 4 fluidity -v2 -l standing_wave_failing_velocity_flredecomp.flml 2>&1 ) > /dev/null;
mv matrixdump velocity_matrixdump;
mkdir -p velocity_logs;
mv fluidity.log* fluidity.err* velocity_logs/;
( mpiexec -n 4 fluidity -v2 -l standing_wave_failing_pressure_flredecomp.flml 2>&1 ) > /dev/null;
mv matrixdump pressure_matrixdump;
rm matrixdump.info;
mpiexec -n 4 petsc_readnsolve -v -l -prns_filename velocity_matrixdump  standing_wave_flredecomp.flml Velocity;
mv petsc_readnsolve.err* petsc_readnsolve.log* velocity_logs/;
mpiexec -n 4 petsc_readnsolve -v -l -prns_filename pressure_matrixdump  standing_wave_flredecomp.flml Pressure;
rm -rf fluidity.err-*

petsc_readnsolve: Assigning variables:
import glob
velocity_error_logs=[]
for logname in glob.glob('velocity_logs/petsc_readnsolve.err-*'):
  f=open(logname)
  velocity_error_logs.append(f.read())
  f.close()
petsc_readnsolve: Assigning velocity_error_logs = []
import glob
pressure_error_logs=[]
for logname in glob.glob('petsc_readnsolve.err-*'):
  f=open(logname)
  pressure_error_logs.append(f.read())
  f.close()
petsc_readnsolve: Assigning pressure_error_logs = []
from numpy import array
import re

f=open('velocity_logs/fluidity.log-0')
log=f.read()
f.close()

reason_lines=re.findall('.*reason.*', log, re.MULTILINE)
reason_fields=array([i.split(' ')[0] for i in reason_lines])
reason_numbers=array([eval(i.split(':')[1]) for i in reason_lines])
velocity_solve_failed = reason_fields.compress(reason_numbers<=0)[0]=='DeltaU'
Variable computation raised an exception
--------------------------------------------------------------------------------
  1  from numpy import array
  2  import re
  3  
  4  f=open('velocity_logs/fluidity.log-0')
  5  log=f.read()
  6  f.close()
  7  
  8  reason_lines=re.findall('.*reason.*', log, re.MULTILINE)
  9  reason_fields=array([i.split(' ')[0] for i in reason_lines])
 10  reason_numbers=array([eval(i.split(':')[1]) for i in reason_lines])
 11  velocity_solve_failed = reason_fields.compress(reason_numbers<=0)[0]=='DeltaU'
--------------------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/rivu/MTP/Hybrid-PDE-solver/even_sem/fluidity_gpu/scratch_fluidity/python/fluidity/regressiontest.py", line 304, in run_python
    exec(self.code, varsdict)
  File "<string>", line 4, in <module>
FileNotFoundError: [Errno 2] No such file or directory: 'velocity_logs/fluidity.log-0'
--------------------------------------------------------------------------------
petsc_readnsolve: failure.

Summary of test problems with failures or warnings:
petsc_readnsolve.xml: F

Passes:   0
Failures: 1
Warnings: 0

Please note that I have to press ^C after the warnings, as it was taking too much time.

Thanks a lot, Ritwik

ritzvik commented 4 years ago

If it can be of any use, the Singularity scripts that I have used to build the fluidity environment can be found here: https://github.com/ritzvik/Hybrid-PDE-solver/tree/bac28983efa7e42d1d21bcc261ab274621712649/even_sem/fluidity_gpu

The script is written in two parts, where b1 builds on top of b0.
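
Building the two images from those definitions goes roughly like this (the exact .def/.simg file names follow my repo layout and may differ):

    # build the two Singularity images in order; b1 is built on top of b0
    sudo singularity build fluidity_b0.simg b0.def
    sudo singularity build fluidity_b1.simg b1.def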

stephankramer commented 4 years ago

Thanks for sharing that. Maybe we should put these singularity scripts somewhere more prominent in our documentation if singularity becomes more widespread (@angus-g: were we discussing this recently?). We should also indeed update and clean up our description of the build dependencies - also targeting the new Ubuntu LTS release. To come back to your questions:

angus-g commented 4 years ago

I was looking at Singularity for our HPC, but it didn't seem like the facility really wanted to support it. That said, I think it's a good idea, given how containerisation has taken off, and especially so if it gains further adoption. It certainly makes it much easier to deploy and distribute in a variety of environments.

jrper commented 4 years ago

I won't go as far as saying version 4.X of Parmetis is supported (you can never assume that all future versions of software will work), but 4.0 to 4.0.3 should all behave themselves.

ritzvik commented 4 years ago

Thanks @angus-g @stephankramer @jrbull for all your responses!

ritzvik commented 4 years ago

As it turns out, the rogue warning while running test/petsc_readnsolve only appears when PETSc is configured with openmpi and works fine when configured with mpich. That's sorted out.

Another thing I noticed is that the bin folder and binaries only get created when I run make THREADS=8 test. Is there any other way to create the binaries, so that I do not need to run the full test suite each time? The commands I use to build fluidity, as defined in the Singularity file, are:

    ## build and test fluidity
    ./configure --enable-2d-adaptivity
    make -j
    make -j fltools
    make makefiles

    ## make test is important, as it creates the folder "bin".
    ## the command can be stopped after it completes all the compilation to avoid testing.
    make THREADS=8 test || echo "****make test incomplete/failed/stopped****"

Thanks

gnikit commented 4 years ago

> Another thing I noticed is that the bin folder and binaries only get created when I run make THREADS=8 test. Is there any other way to create the binaries, so that I do not need to run the full test suite each time?

Calling make makefiles implicitly calls clean-light, which deletes the lib and bin directories. So I would just switch the order: make makefiles && make all -j && make test THREADS=8
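
In your Singularity recipe that would look something like this (same targets as yours, just reordered):

    ## run make makefiles first -- it calls clean-light, so it must come before the build
    ./configure --enable-2d-adaptivity
    make makefiles
    ## now build the binaries and tools; bin/ and lib/ survive since nothing cleans them afterwards
    make all -j
    make -j fltools
    ## run the test suite only when you actually want to test
    make test THREADS=8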

stephankramer commented 4 years ago

Ah, good catch @gnikit! Just to add: you don't normally have to run make makefiles; it generates include files for the makefiles, named Makefile.dependencies, which are included in the source. They only change if any of the use statements in the Fortran files change, at which point the developer needs to run make makefiles and commit the changes to Makefile.dependencies, so these include files should always be up-to-date in master. The only reason we run it as part of the test suite is to check that this generation process isn't broken, and to automatically check that the include files are indeed up-to-date.
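
So for a developer the workflow is only ever something like this, and only after changing use statements:

    ## only needed after adding/removing Fortran "use" statements
    make makefiles
    ## commit the regenerated dependency files so they stay up-to-date in master
    git add '*Makefile.dependencies'
    git commit -m "Regenerate Makefile.dependencies"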

ritzvik commented 4 years ago

Thanks @gnikit @stephankramer

I've been able to set up fluidity on GPU-enabled PETSc and have made some changes in tools/petsc_readnsolve.F90, and the compiled binary is able to make calls to the GPU. Do you have any suggestions on which files I should target first to try on the GPU?

Also, it becomes tedious to run make -j fltools after every change. Is there any way to compile only the changed files?

ritzvik commented 4 years ago

Okay, I'm closing this now. Thanks for all your help.