In Feel++, you have two sizes: nLocalDofWithoutGhost and nLocalDofWithGhost. For a
PETSc matrix or vector in parallel, you must give nLocalDofWithoutGhost() as the
localSize, together with a map that takes the ghost dofs into account. Also, in
parallel, it is very important to use newVector(Xh) and newMatrix(Xh,Xh); the other
constructors don't work in parallel yet.
If you are already using the right constructor, please send me a simple example.
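For illustration, a minimal sketch of the allocation that works in parallel. The
environment, mesh, and space setup below are assumptions for the example, and the
exact helper names and constructor signatures vary across Feel++ versions:

    #include <feel/feel.hpp>   // assumed umbrella header

    int main( int argc, char** argv )
    {
        using namespace Feel;
        Environment env( argc, argv );

        // hypothetical setup: P1 Lagrange space on a unit square
        auto mesh = unitSquare();
        auto Xh   = Pch<1>( mesh );

        // the constructors that work in parallel: they hand PETSc
        // nLocalDofWithoutGhost() as the local size, plus the ghost-dof map
        auto backend = Backend<double>::build( BACKEND_PETSC );
        auto D = backend->newMatrix( Xh, Xh );
        auto F = backend->newVector( Xh );
        return 0;
    }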
Vincent.
Original comment by vincent....@gmail.com
on 7 Aug 2012 at 10:04
I ran feel_doc_laplacian on two procs.
The total number of dofs is 65.
In the boost parameter function newMatrix from backend.hpp, I have:
trial->nLocalDofWithoutGhosts(): 36 and test->nLocalDofWithoutGhosts(): 36 (proc 0)
trial->nLocalDofWithoutGhosts(): 29 and test->nLocalDofWithoutGhosts(): 29 (proc 1)
So I have 65 dofs in total and everything is OK, unlike what I said yesterday (sorry).
But concerning the element_type u (the solution), I have u.localSize(): 36 on each
proc, whereas I expected 29 on proc 1 (isn't it?).
And so, if you add the following line after the problem is solved:
D->energy( u, u );
then there is a problem when MatMult is called inside the energy function, and I
get the following error message:
[1]PETSC ERROR: --------------------- Error Message ------------------------------------
[1]PETSC ERROR: Nonconforming object sizes!
[1]PETSC ERROR: Mat mat,Vec y: local dim 29 36!
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Petsc Release Version 3.2.0, Patch 5, Sat Oct 29 13:45:54 CDT 2011
By the way, I noticed that it's not a good idea to compile feel_doc_laplacian with
clang: the compilation succeeds but the program then crashes, whereas everything is
fine with gcc-4.5 or gcc-4.6.
Original comment by stephane...@gmail.com
on 8 Aug 2012 at 7:48
The energy() function in matrixpetsc.hpp is not parallel:
you are trying to use MatMult with a parallel matrix and a sequential vector.
To fix that, you should implement the energy function in MatrixPetscMPI
and use VectorPetscMPI! (not VectorPetsc, which works only in sequential;
likewise for MatrixPetsc versus MatrixPetscMPI).
To build a VectorPetscMPI properly, you should use the constructor
VectorPetscMPI( DataMap const& dm )
with dm = this->mapRow() (where "this" is a MatrixPetscMPI).
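A minimal sketch of the conforming MatMult/VecDot sequence in raw PETSc calls. The
free function parallelEnergy is hypothetical; in Feel++ the same steps would live in
MatrixPetscMPI::energy, using a VectorPetscMPI built from this->mapRow() as above:

    #include <petscmat.h>
    #include <petscvec.h>

    /* hypothetical helper: e = u . (A v) with conforming parallel layouts */
    PetscScalar parallelEnergy( Mat A, Vec u, Vec v )
    {
        Vec w;
        MatGetVecs( A, PETSC_NULL, &w ); /* w matches A's row layout
                                            (MatCreateVecs in recent PETSc) */
        MatMult( A, v, w );              /* w = A v : local dims now conform */
        PetscScalar e;
        VecDot( w, u, &e );              /* e = u . (A v) */
        VecDestroy( &w );
        return e;
    }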
Original comment by vincent....@gmail.com
on 8 Aug 2012 at 8:32
Thanks Vincent for your advice.
I added a parallel implementation of the energy function, but unfortunately I don't
get the same result as in sequential, so I still have a bug to find ...
Original comment by stephane...@gmail.com
on 9 Aug 2012 at 7:00
Should this bug be closed now?
Original comment by christop...@feelpp.org
on 10 Aug 2012 at 11:10
yes
Original comment by stephane...@gmail.com
on 10 Aug 2012 at 5:01
Original issue reported on code.google.com by stephane...@gmail.com
on 7 Aug 2012 at 8:32