ARudik / feelpp

Automatically exported from code.google.com/p/feelpp

MPI errors with nlSolve function #10

Closed: GoogleCodeExporter closed this issue 9 years ago

GoogleCodeExporter commented 9 years ago
Issue using nlSolve (Newton algorithm with Jacobian and residual parameters):
M_backend->nlSolve(_jacobian=J, _solution=U_vec, _residual=R);

Even though the code is not compiled with the MPI flag, execution gives the
following error:

[euler:15691] *** An error occurred in MPI_Attr_get
[euler:15691] *** on a NULL communicator
[euler:15691] *** Unknown error
[euler:15691] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)

Original issue reported on code.google.com by daversin...@gmail.com on 21 Mar 2012 at 9:50
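
For context, a minimal sketch of a setup that avoids this kind of error, assuming Feel++'s Environment class from that era (the exact headers, the BACKEND_PETSC enum value, and the build() signature may differ between versions):

#include <feel/feelcore/environment.hpp>
#include <feel/feelalg/backend.hpp>

int main( int argc, char** argv )
{
    using namespace Feel;

    // Construct the Environment before any backend, matrix, or vector:
    // it initializes MPI and PETSc, and finalizes them when it goes out
    // of scope, after the objects created below have been destroyed.
    Environment env( argc, argv );

    auto M_backend = Backend<double>::build( BACKEND_PETSC );
    // ... assemble J (Jacobian), R (residual), and U_vec (solution) ...
    // M_backend->nlSolve( _jacobian=J, _solution=U_vec, _residual=R );

    return 0;
}

If MPI is never initialized, or if an MPI-dependent object outlives the environment, PETSc ends up calling MPI_Attr_get on a NULL or already-finalized communicator, which matches both error messages in this thread.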

GoogleCodeExporter commented 9 years ago
I also got a similar issue:

*** The MPI_Attr_get() function was called after MPI_FINALIZE was invoked.
*** This is disallowed by the MPI standard.
*** Your MPI job will now abort.
[sd-25154:20123] Abort after MPI_FINALIZE completed successfully; not able to guarantee that all other processes were killed!

Original comment by diamal...@gmail.com on 27 Mar 2012 at 12:06

GoogleCodeExporter commented 9 years ago
This issue was updated by revision 6953bfd2228c.

Register the singleton object that manipulates objects dealing with MPI so that
they are destroyed when the environment is deleted. We then get rid of the
messages from MPI complaining that MPI_Finalize() has already been called when
the objects are themselves being destroyed.

git-svn-id: svn+ssh://forge.imag.fr/var/lib/gforge/chroot/scmrepos/svn/life/trunk/life/trunk@8899 18f2cc81-8059-4896-b63e-f2d89ec8fd72

Original comment by christop...@feelpp.org on 2 May 2012 at 3:33
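
In other words, the fix controls destruction order. An illustrative sketch of that pattern in generic C++ (not the actual Feel++ code; every name here is hypothetical):

#include <functional>
#include <vector>
#include <mpi.h>

class Environment
{
public:
    Environment( int& argc, char**& argv ) { MPI_Init( &argc, &argv ); }

    ~Environment()
    {
        // Destroy the registered MPI-dependent singletons first...
        for ( auto& cleanup : M_cleanups )
            cleanup();
        M_cleanups.clear();
        // ...so that MPI is still alive while they release communicators.
        MPI_Finalize();
    }

    // Singletons that hold MPI state register their teardown here instead
    // of relying on static destruction order, which runs after main() and
    // therefore after MPI_Finalize().
    static void registerCleanup( std::function<void()> f )
    {
        M_cleanups.push_back( std::move( f ) );
    }

private:
    static std::vector<std::function<void()>> M_cleanups;
};

std::vector<std::function<void()>> Environment::M_cleanups;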

GoogleCodeExporter commented 9 years ago
This issue was closed by revision 65a76c199bfd.

Original comment by christop...@feelpp.org on 2 May 2012 at 3:33

GoogleCodeExporter commented 9 years ago
This issue was updated by revision f0fb749d2f84.

Register the singleton object that manipulates objects dealing with MPI so that
they are destroyed when the environment is deleted. We then get rid of the
messages from MPI complaining that MPI_Finalize() has already been called when
the objects are themselves being destroyed.

Original comment by christop...@feelpp.org on 3 May 2012 at 7:42

GoogleCodeExporter commented 9 years ago
This issue was closed by revision 8b473fdc2bd8.

Original comment by christop...@feelpp.org on 3 May 2012 at 7:42

GoogleCodeExporter commented 9 years ago
This issue was updated by revision 8af7a2332852.

clear() was triggered after PetscFinalize() had been called;
first check that PETSc has been initialized

Original comment by christop...@feelpp.org on 22 May 2012 at 10:47
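
A sketch of that guard using the PETSc C API (PetscInitialized() and VecDestroy() are real PETSc calls; the wrapper class here is a simplified, hypothetical stand-in):

#include <petscsys.h>
#include <petscvec.h>

// Simplified stand-in for a PETSc vector wrapper; only the guard matters.
struct VectorPetsc
{
    Vec M_vec = nullptr;

    void clear()
    {
        PetscBool initialized = PETSC_FALSE;
        PetscInitialized( &initialized );
        // After PetscFinalize(), calling back into PETSc is what produces
        // the "MPI_Attr_get after MPI_FINALIZE" abort quoted above, so the
        // destroy call is skipped in that case.
        if ( initialized && M_vec )
        {
            VecDestroy( &M_vec );
            M_vec = nullptr;
        }
    }

    ~VectorPetsc() { clear(); }
};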

GoogleCodeExporter commented 9 years ago
This issue was updated by revision 407e6a297209.

clear() was triggered after PetscFinalize() had been called;
first check that PETSc has been initialized

git-svn-id: svn+ssh://forge.imag.fr/var/lib/gforge/chroot/scmrepos/svn/life/trunk/life/trunk@9037 18f2cc81-8059-4896-b63e-f2d89ec8fd72

Original comment by christop...@feelpp.org on 22 May 2012 at 11:06