thegooglecodearchive / sfepy

Automatically exported from code.google.com/p/sfepy

petsc (parallel) solvers #68

Status: Open. GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
Lisandro Dalcin has sent us a nice example of solving a sparse system stored as a
scipy.sparse matrix via petsc4py - let's interface the PETSc solvers.
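[Editor's note: the attached example is not reproduced in this export. Below is a minimal editorial sketch of the general approach - wrapping a scipy.sparse CSR matrix in a PETSc matrix and solving with a KSP solver via petsc4py. The test matrix and all parameter values are illustrative assumptions, not the original example.]

import numpy as np
import scipy.sparse as sp
from petsc4py import PETSc

# Build a small SPD test system (1D Laplacian) in CSR format.
n = 100
A_sp = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format='csr')
b_np = np.ones(n)

# Wrap the CSR arrays in a PETSc matrix and create the vectors.
A = PETSc.Mat().createAIJ(size=A_sp.shape,
                          csr=(A_sp.indptr, A_sp.indices, A_sp.data))
b = PETSc.Vec().createWithArray(b_np)
x = b.duplicate()

# Solve by conjugate gradients with incomplete Cholesky preconditioning.
ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType('cg')
ksp.getPC().setType('icc')
ksp.setTolerances(rtol=1e-12)
ksp.solve(b, x)
print('iterations:', ksp.getIterationNumber())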

Original issue reported on code.google.com by robert.c...@gmail.com on 8 Dec 2008 at 1:31

GoogleCodeExporter commented 9 years ago
Could you please point me to that example?

Original comment by ondrej.c...@gmail.com on 8 Dec 2008 at 3:57

GoogleCodeExporter commented 9 years ago
The example is attached now.

Original comment by robert.c...@gmail.com on 9 Dec 2008 at 8:41

GoogleCodeExporter commented 9 years ago
Finally, I have interfaced the PETSc Krylov solvers thanks to petsc4py (git commit
650b562). Check out the solution times below, obtained by
tests/test_linear_solvers.py. The PETSc conjugate gradient solver + incomplete
Cholesky preconditioner rock for the Laplacian!

It remains to try it on larger/more complex problems (to be done as the problems
arrive), and to make sfepy parallel. That is going to be another issue, though, so
let's close this one.

equations = {
    'Temperature' : """dw_laplace.i1.Omega( coef.val, s, t ) = 0"""
}
...   matrix size: (9511, 9511)
...           nnz: 132239
...   solution times (residual norms):
...   0.17 [s] (2.428e-12) : ls120 ls.petsc cg icc
...   0.35 [s] (3.301e-12) : ls110 ls.scipy_iterative cg
...   0.51 [s] (6.537e-12) : ls111 ls.scipy_iterative bicgstab
...   1.01 [s] (3.571e-12) : ls112 ls.scipy_iterative qmr
...   3.77 [s] (2.718e-12) : ls101 ls.pyamg ruge_stuben_solver
...   4.36 [s] (1.760e-12) : ls102 ls.pyamg smoothed_aggregation_solver
...   6.00 [s] (4.870e-15) : dls100 ls.umfpack
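
[Editor's note: for context, a hedged sketch of how such a solver might be selected in a SfePy problem description file, in the same style as the equations block above; the option names ('method', 'precond', 'eps_r', 'i_max') are assumptions inferred from the listing, not taken from the original issue.]

solvers = {
    'ls' : ('ls.petsc', {
        'method' : 'cg',    # PETSc KSP type (assumed option name)
        'precond' : 'icc',  # PETSc PC type: incomplete Cholesky (assumed)
        'eps_r' : 1e-12,    # relative tolerance (assumed)
        'i_max' : 1000,     # iteration limit (assumed)
    }),
}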

Original comment by robert.c...@gmail.com on 15 Jan 2009 at 10:39

GoogleCodeExporter commented 9 years ago
Great job!

So petsc is 2x faster than scipy? So why does umfpack totally suck on this
particular problem?

Original comment by ondrej.c...@gmail.com on 15 Jan 2009 at 9:06

GoogleCodeExporter commented 9 years ago
IMHO it's a very simple problem - for the Laplacian, conjugate gradients cannot be
beaten by a direct solver. Anyway, scipy's CG is not preconditioned, while the
PETSc CG is (by incomplete Cholesky); otherwise I would say they would perform
similarly for such a small and simple problem.

To conclude - do not use umfpack for Poisson-like problems on simple geometries, as
CG is the winner in such cases. For other problems we now have quite a bunch of
solvers to try out and see which one performs best - I would bet that PETSc Krylov
solvers with proper preconditioning might outperform umfpack for problems other
than the Poisson one, too.
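
[Editor's note: to illustrate the preconditioning point, a minimal editorial sketch of unpreconditioned vs. preconditioned CG in SciPy. scipy.sparse.linalg has no incomplete Cholesky, so an incomplete LU factorization (spilu) stands in for it here; rtol is the tolerance keyword of recent SciPy versions.]

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# 1D Laplacian as a simple SPD test matrix (CSC, as spilu requires).
n = 10000
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format='csc')
b = np.ones(n)

# Plain CG, as in ls.scipy_iterative above.
x0, info0 = spla.cg(A, b, rtol=1e-12)

# CG preconditioned by incomplete LU (a stand-in for incomplete Cholesky).
ilu = spla.spilu(A, drop_tol=1e-4)
M = spla.LinearOperator(A.shape, ilu.solve)
x1, info1 = spla.cg(A, b, rtol=1e-12, M=M)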

Original comment by robert.c...@gmail.com on 16 Jan 2009 at 11:26

GoogleCodeExporter commented 9 years ago
For large Poisson-like problems, say more than 50k/500k unknowns, you could try the
algebraic multigrid methods from Trilinos/ML and HYPRE BoomerAMG. This requires
PETSc to be built with those packages (basically, pass --with-ml=1 --download-ml=1
and/or --with-hypre=1 --download-hypre=1).

In my (really short) experience with Trilinos/ML and HYPRE BoomerAMG, the first one
(ML) is faster to set up but slower at solving, while the second (BoomerAMG) is the
other way around, i.e., slower to set up but faster at solving. So the choice could
be made depending on whether the problems are transient or not (provided that the
code reuses the matrix and solver structures).
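
[Editor's note: a minimal petsc4py sketch of selecting these preconditioners, assuming PETSc was configured with the flags above; the solver and preconditioner names are standard PETSc type strings. This is an editorial illustration, not part of the original comment.]

from petsc4py import PETSc

ksp = PETSc.KSP().create()
ksp.setType('cg')
pc = ksp.getPC()

# HYPRE BoomerAMG (requires --with-hypre=1 --download-hypre=1):
pc.setType('hypre')
pc.setHYPREType('boomeramg')

# Alternatively, Trilinos/ML (requires --with-ml=1 --download-ml=1):
# pc.setType('ml')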

Original comment by dalcinl on 19 Jan 2009 at 12:11

GoogleCodeExporter commented 9 years ago
Hi everyone,
I have a school project in which I'm required to use the hypre library. I went
through the hypre user manual and reference, but couldn't get much out of them. Is
there anyone here who has used Hypre? I'm dealing with the solution of linear
problems of the form Ax = b, where A and b are sparse matrices. I have a few
questions and will be very grateful for answers.
1 - What is the best way to read in the data files?
2 - What other matrix formats can Hypre easily support?
3 - I want to use Hypre PCG; what would be an optimal number of iterations?
I'm looking forward to any kind of help.
Best Regards,
Nesh

Original comment by naveedat...@gmail.com on 2 Jun 2009 at 11:13

GoogleCodeExporter commented 9 years ago
Hi, please send your email to the list, not to the issues.

Original comment by ondrej.c...@gmail.com on 3 Jun 2009 at 1:23

GoogleCodeExporter commented 9 years ago
Also, we do not use Hypre in SfePy yet; it is a good library to support, though.
BTW, to join/post to our mailing list, go to the main page
http://code.google.com/p/sfepy/ and click on sfepy-devel. You need a Google account
to log in.

Original comment by robert.c...@gmail.com on 3 Jun 2009 at 8:05

GoogleCodeExporter commented 9 years ago
Migrated to http://github.com/sfepy/sfepy/issues/72

Original comment by robert.c...@gmail.com on 30 Jan 2012 at 10:25