boutproject / BOUT-dev

BOUT++: Plasma fluid finite-difference simulation code in curvilinear coordinate systems
http://boutproject.github.io/
GNU Lesser General Public License v3.0

Problems with building PETSc as described in user_manual #90

Closed loeiten closed 9 years ago

loeiten commented 9 years ago

I'm not sure if it is a general problem or if it is just me, but I can't seem to build PETSc the way it is described in the user_manual, neither on my laptop nor on a cluster.

I did not manage to build PETSc by*

./configure --with-fortran=0 --with-c++-support=1 --with-mpi=1 --with-sundials=1 --with-sundials-dir=$HOME/local/

nor by

./configure --with-fortran=0 --with-c++-support=1 --with-mpi=1

as suggested in the manual, but I did manage to build it in the following way (suggested by @johnomotani):

./configure --download-mumps --download-scalapack --download-blacs --download-f-blas-lapack=1 --download-parmetis --download-ptscotch --download-metis --with-clanguage=cxx --with-c++-support=1 --with-mpi=1 --download-sundials
make all test

Also, the user_manual suggests using

hg clone http://petsc.cs.iit.edu/petsc/petsc-dev

(it looks like this is a dead link), or to use the instructions found on http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html, which tell the user to use

git clone https://bitbucket.org/petsc/petsc

This would give the user PETSc 3.6, which BOUT++ currently has no interface for.

If there are no plans to make wrappers for PETSc 3.4**, 3.5, or 3.6, maybe it would be a good idea to explicitly state that PETSc 3.3 is needed. I suggest then writing the following in the manual:

In order to use PETSc (...)
wget http://ftp.mcs.anl.gov/pub/petsc/release-snapshots/petsc-3.3-p7.tar.gz
(...) and build with
./configure --download-mumps --download-scalapack --download-blacs --download-f-blas-lapack=1 --download-parmetis --download-ptscotch --download-metis --with-clanguage=cxx --with-c++-support=1 --with-mpi=1 --download-sundials
make all test

or something like that.

* I found it rather hard to build sundials on its own and then link it to PETSc as the user_manual currently suggests; maybe it would be OK to tell the user to install sundials through PETSc if both are to be used.

** PETSc 3.4 complains about SNESDefaultComputeJacobian (which has changed name in newer versions) when building the imex solvers. Newer versions have also moved things around in the folder structure.

d7919 commented 9 years ago

When you say you can't get PETSc to build, which version are you trying to build and what goes wrong?

One issue is that sundials may not be configured in a way that's compatible with the PETSc setup, which is why --download-sundials is a nice option for PETSc. Unfortunately, this means you can't necessarily control the sundials configuration; for instance, I'm not sure if it will use the most recent version (unlikely), which would mean you wouldn't have access to the new arkode solver. As sundials uses a cmake build approach, it's hard for me to show you a configuration that I have used successfully with PETSc, but as far as I can tell I didn't do anything special, except perhaps setting the precision to double and enabling both C and C++ (though this may be the default).

With PETSc 3.4 I've recently used the following configuration options successfully with a self built sundials:

./configure --with-sundials-dir=/path/to/sundials_2.6.1 --with-clanguage=cxx --with-mpi=yes --with-precision=double --with-scalar-type=real --with-shared-libraries=0 --with-debugging=yes --with-make-np=64

Note, this is using a sundials build which has precision set to double and has both C and C++ enabled (maybe this is the default). A very similar configure line worked for PETSc 3.3.7 as well.

The most recent commit adds a PETSc 3.4 solver (and an experimental 3.5 solver), so you may want to try using

wget http://ftp.mcs.anl.gov/pub/petsc/release-snapshots/petsc-3.4.5.tar.gz

The manual could probably do with being updated to reflect some of the changes that have been made to the PETSc build process etc.

The second most recent commit to master shows the fix needed for v3.4, which should also be applied to the imex solver (Ben mentioned he has fixed this in the laplace_n0 branch).
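To tell at a glance which interface a given source tree needs, the release can be read off the PETSC_VERSION_* macros in include/petscversion.h. A small sketch (the header here is mocked with a temporary directory purely so the demo is self-contained; in practice, point PETSC_DIR at a real unpacked tarball):

```shell
# Sketch: read the PETSc release from the version macros in petscversion.h.
# The mock header below stands in for a real PETSc source tree.
PETSC_DIR=$(mktemp -d)
mkdir -p "$PETSC_DIR/include"
cat > "$PETSC_DIR/include/petscversion.h" <<'EOF'
#define PETSC_VERSION_MAJOR      3
#define PETSC_VERSION_MINOR      4
#define PETSC_VERSION_SUBMINOR   5
EOF

# Pull the numbers out of the #define lines.
major=$(awk '/PETSC_VERSION_MAJOR/ {print $3}' "$PETSC_DIR/include/petscversion.h")
minor=$(awk '/PETSC_VERSION_MINOR/ {print $3}' "$PETSC_DIR/include/petscversion.h")
echo "PETSc $major.$minor detected"
```

With a real tree this prints the release the BOUT++ interface must match (3.3 or 3.4 at the time of writing).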

bendudson commented 9 years ago

Unfortunately, the PETSc material in the manual is quite out of date. I guess it was written around the time Sean Farley was working on it, around the 3.1/3.2 era.


loeiten commented 9 years ago

Thank you for your answers :). I only saw the new commit after raising the issue.

So basically I tried different versions and ran into all sorts of problems, ranging from linking problems to problems finding mpicc etc. Unfortunately I didn't write them down, and I do not remember the specific details.

I'm currently trying to build with the approach suggested by @d7919, and it seems to go well (although it has not finished yet). It looks like the part of the manual on how to build sundials also needs to be freshened up a bit. So, if it all goes well, I suggest that I write the steps down and add them to the manual.

Are there any plans to incorporate PETSc 3.6 as well?

bendudson commented 9 years ago

My understanding is that the 3.5 interface is still experimental, and quite a bit changed from 3.4. The SLEPc interface is currently 3.4, so that's the version that has been used most recently. Once 3.5 is working and tested we should start on 3.6.


d7919 commented 9 years ago

Yes, the 3.5 part is highly experimental, as in I'm not sure it would even run (the original commit message [from myself a year ago] says "this should be treated as incomplete/untested"). I can have a look into making 3.5 more robust if there's demand (I guess maybe I should do this soonish, before the BOUT-3.0 release?). My main approach was to simply create a copy of the 3.4 solver and then go through the PETSc 3.5 change log to see if any of the changed routines were used directly by the solver. I seem to recall the change log was quite long for 3.5.

bendudson commented 9 years ago

Sorry, you did advise leaving out the 3.5 bit for now, but I went ahead anyway. I think it's OK to leave it as-is, maybe with a health warning in the manual. When a 3.0 release repository is made (say, Monday next week), we could delete the things in there which are not working yet.


d7919 commented 9 years ago

It's probably good to have the 3.5 solver in master, as it makes it more likely that I (or someone else) will get it into working order. It looks like I put a warning message into the output of the 3.5 solver, so even if someone doesn't realise that 3.5 is risky, they should at least be aware of it when they look at their output.

loeiten commented 9 years ago

So... it turns out that I have problems building with the method suggested by @d7919. I'm encountering the good old

             Configuring PETSc to compile on your system
===============================================================================
TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:145)
*******************************************************************************
         UNABLE to CONFIGURE with GIVEN OPTIONS    (see configure.log for details):
-------------------------------------------------------------------------------
--with-sundials-dir=/home/mmag/sundials_2.6.1 did not work
*******************************************************************************

The same has happened to me previously (it also happened when I tried sundials-2.6.2). The exact steps used:

cd
mkdir sundials_2.6.1
cd sundials_2.6.1
mkdir examples
cd ..
mkdir install
cd install
mkdir sundials-install
cd sundials-install
# Downloaded sundials-2.6.2.tar.gz to sundials-install
tar xzf sundials-2.6.2.tar.gz
mkdir build
cd build
cmake -DCMAKE_INSTALL_PREFIX=$HOME/sundials_2.6.1 \
-DEXAMPLES_INSTALL_PATH=$HOME/sundials_2.6.1/examples \
../sundials-2.6.2
make
make install

This worked like a charm. I then downloaded petsc-3.4.5.tar.gz and untarred it into $HOME:

cd
cd petsc-3.4.5
./configure --with-sundials-dir=$HOME/sundials_2.6.1 --with-clanguage=cxx --with-mpi=yes --with-precision=double --with-scalar-type=real --with-shared-libraries=0 --with-debugging=yes --with-make-np=4

Usually Google (or in the worst case configure.log) is my friend, but in this case I'm having some trouble seeing how to fix the problem. From the log file it looks like

/usr/bin/ld: cannot find -lsundials_nvecserial
/usr/bin/ld: cannot find -lsundials_nvecparallel

is causing the trouble.

Any suggestions?

d7919 commented 9 years ago

What is in ${HOME}/sundials_2.6.1/lib? Does it contain those libraries (the static versions [.a])?

If so have you tried doing

export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:${HOME}/sundials_2.6.1/lib

before building PETSc?

If the libraries are not present there, then you may need to expand on the defaults for the sundials build. I'd recommend using

ccmake ../sundials-2.6.1

instead of cmake in order to allow you to explore the available options more easily.
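A quick pre-flight check along these lines can confirm whether the static libraries the PETSc linker asks for actually landed under the install prefix before configuring PETSc (the prefix here is faked with mktemp purely so the sketch runs standalone; substitute your real --with-sundials-dir and drop the touch lines):

```shell
# Sketch: check for the libraries behind "cannot find -lsundials_nvecserial".
# Fake an install prefix so the demo is self-contained; in practice point
# SUNDIALS_DIR at the real install and remove the mkdir/touch lines.
SUNDIALS_DIR=$(mktemp -d)/sundials_2.6.1
mkdir -p "$SUNDIALS_DIR/lib"
touch "$SUNDIALS_DIR/lib/libsundials_nvecserial.a"     # pretend 'make install' put these here
touch "$SUNDIALS_DIR/lib/libsundials_nvecparallel.a"

missing=0
for lib in sundials_nvecserial sundials_nvecparallel; do
  if [ ! -f "$SUNDIALS_DIR/lib/lib$lib.a" ]; then
    echo "missing: lib$lib.a"
    missing=1
  fi
done
[ "$missing" -eq 0 ] && echo "all requested sundials libraries found"
```

If anything is reported missing, the sundials build itself needs fixing before PETSc's configure will ever succeed.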

loeiten commented 9 years ago

Thanks for the help, @d7919. It was indeed a problem with the libraries, but

export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:${HOME}/sundials_2.6.1/lib

didn't do the trick. Instead, I had to set the cmake flag pointing to where the libraries should be placed.

I succeeded in building sundials-2.6.2 with

cmake \
-DCMAKE_INSTALL_PREFIX=$HOME/local \
-DEXAMPLES_INSTALL_PATH=$HOME/local/examples \
-DCMAKE_LINKER=$HOME/local/lib \
-DLAPACK_ENABLE=ON \
-DOPENMP_ENABLE=ON \
-DMPI_ENABLE=ON \
../sundials-2.6.2

(the -DOPENMP_ENABLE=ON is probably not necessary, but I included it just in case). Building PETSc-3.4.5 succeeded with

./configure --with-sundials-dir=$HOME/local --with-clanguage=cxx --with-mpi=yes --with-precision=double --with-scalar-type=real --with-shared-libraries=0 --with-debugging=yes

I couldn't find the documentation for --with-make-np= anywhere; could this flag be dropped?

Configuring BOUT++ succeeded (?) with

./configure --with-petsc=$HOME/petsc-3.4.5 --with-sundials

however, the configure summary tells me that ARKode is still missing (although it compiles successfully, and even when I use --with-arkode=path/to/arkode), and that PETSc does not have SUNDIALS support:

  FACETS support: no
  PETSc support: yes (version 3.4, release = 1)
  PETSc has SUNDIALS support: no
  SLEPc support: no
  IDA support: yes
  CVODE support: yes
  ARKODE support: no
  NetCDF support: yes
  Parallel-NetCDF support: no
  HDF5 support: no (parallel: no)
  PDB support: no
  Hypre support: no
  MUMPS support: no

Also, when running test_petsc_laplace, make throws the following

test_petsc_laplace.cxx:191:20: error: ‘INVERT_AC_IN_GRAD’ was not declared in this scope
   invert->setFlags(INVERT_AC_IN_GRAD+INVERT_AC_OUT_GRAD);
                    ^
test_petsc_laplace.cxx:191:38: error: ‘INVERT_AC_OUT_GRAD’ was not declared in this scope
   invert->setFlags(INVERT_AC_IN_GRAD+INVERT_AC_OUT_GRAD);

However, running blob2d with boussinesq=false works nicely.

d7919 commented 9 years ago

Excellent.

Yes, --with-make-np= is not essential; it just allows you to use more processors when compiling PETSc, in order to speed up the build process.

It's possible you need to tell sundials explicitly to build arkode as well; otherwise it may not include it. It should also be possible to just set --with-arkode, i.e. without the path.
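If arkode does turn out to be disabled by default, the cmake invocation from earlier in the thread could be extended along these lines (note: BUILD_ARKODE is my assumption for the option name; verify it via ccmake before relying on it):

```shell
# Hedged sketch only: explicitly ask the sundials cmake build for arkode.
# The BUILD_ARKODE option name is an assumption -- check the available
# options with `ccmake ../sundials-2.6.2` first.
cmake -DCMAKE_INSTALL_PREFIX=$HOME/local \
      -DEXAMPLES_INSTALL_PATH=$HOME/local/examples \
      -DMPI_ENABLE=ON \
      -DBUILD_ARKODE=ON \
      ../sundials-2.6.2
```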

loeiten commented 9 years ago

Indeed :)

It seems like --with-arkode didn't work either, but I guess the easiest fix would be to update configure.ac so that it searches for ARKode in the same folders as IDA and CVODE.

So, if there are no objections, I'll update the manual once I get time and close this thread with a commit.

johnomotani commented 9 years ago

Apparently test_petsc_laplace has not been updated since the Laplacian flags refactoring. It needs either INVERT_AC_IN_GRAD and INVERT_AC_OUT_GRAD replaced by the numerical values of the old flags, or invert->setFlags replaced with invert->setInnerBoundaryFlags and invert->setOuterBoundaryFlags (the flag then being INVERT_AC_GRAD for both).
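The second option could be applied mechanically with something like the sed sketch below. It operates on a throwaway copy of the offending line so the example is self-contained; the setter names are taken from the comment above and should be checked against the refactored Laplacian API before committing anything.

```shell
# Demo: rewrite the obsolete setFlags call into the per-boundary setters.
# A throwaway copy of the offending line keeps this self-contained; in
# practice run the sed against tests/.../test_petsc_laplace.cxx.
# (GNU sed syntax; BSD sed needs `sed -i ''`.)
cat > /tmp/test_petsc_laplace_snippet.cxx <<'EOF'
  invert->setFlags(INVERT_AC_IN_GRAD+INVERT_AC_OUT_GRAD);
EOF
sed -i \
  's/invert->setFlags(INVERT_AC_IN_GRAD+INVERT_AC_OUT_GRAD);/invert->setInnerBoundaryFlags(INVERT_AC_GRAD);\n  invert->setOuterBoundaryFlags(INVERT_AC_GRAD);/' \
  /tmp/test_petsc_laplace_snippet.cxx
cat /tmp/test_petsc_laplace_snippet.cxx
```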

Does PETSc actually need SUNDIALS support, the way we use it in BOUT++? It seems unlikely that we want to call SUNDIALS through PETSc, since we have a direct interface. I have forgotten whether there ever was a reason I configured PETSc with SUNDIALS (does something in configure assume it should be, if you link both PETSc and SUNDIALS?), but if it is not useful, then maybe we should avoid all these linking problems by recommending that PETSc be built without external libraries.

d7919 commented 9 years ago

Whilst it's probably not essential to compile PETSc with SUNDIALS support, the PETSc-based solvers in BOUT++ do have some code specifically designed for use with the PETSc-based SUNDIALS solver (which presumably allows a useful test, and possibly extends what can be done with SUNDIALS).

It is important to note that if you do configure PETSc with SUNDIALS, then BOUT++ must use/link the same version of SUNDIALS, or problems can occur (I think it should be possible to do something like --with-sundials=${PETSC_DIR}/${PETSC_ARCH}/lib or similar, if PETSc has been configured using --download-sundials).
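A hedged sketch of what that might look like when configuring BOUT++ (the arch name and exact directory layout are illustrative assumptions that vary between PETSc versions and installs; this is not a tested recipe):

```shell
# Illustrative only: link BOUT++ against the sundials copy that
# --download-sundials placed inside the PETSc arch directory, so both
# link the same build. PETSC_ARCH below is an assumed name.
export PETSC_DIR=$HOME/petsc-3.4.5
export PETSC_ARCH=arch-linux2-cxx-debug   # check your actual arch directory
./configure --with-petsc=$PETSC_DIR \
            --with-sundials=$PETSC_DIR/$PETSC_ARCH/lib
```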

loeiten commented 9 years ago

Commit ff348fad72cf22ada00b30d38fdf777f2f1b1584 closes this issue.