usnistgov / fipy

FiPy is a Finite Volume PDE solver written in Python
http://pages.nist.gov/fipy/en/latest

PETSc can't solve 6th-order coupled PDE #840

Open guyer opened 2 years ago

guyer commented 2 years ago

The system of equations

n_eq = (fp.TransientTerm(coeff=1, var=n) == fp.DiffusionTerm(coeff=1, var=mu))
mu_eq = (fp.ImplicitSourceTerm(coeff=1, var=mu)
         == fp.ImplicitSourceTerm(coeff=1 + n**2, var=n)
         + fp.DiffusionTerm(coeff=1, var=xi))
xi_eq = (fp.ImplicitSourceTerm(coeff=1, var=xi)
         == fp.ImplicitSourceTerm(coeff=2, var=n)
         + fp.DiffusionTerm(coeff=1, var=n))

eq = n_eq & mu_eq & xi_eq

raises an exception with the PETSc solver suite:

Traceback (most recent call last):
  File "/Users/guyer/Documents/research/FiPy/phase_field_crystal/BlockNonDiagonal.py", line 117, in <module>
    eq.solve(dt=1)
  File "/Users/guyer/Documents/research/FiPy/fipy/fipy/terms/term.py", line 178, in solve
    solver._solve()
  File "/Users/guyer/Documents/research/FiPy/fipy/fipy/solvers/petsc/petscSolver.py", line 59, in _solve
    self._solve_(globalMatrix.matrix, 
  File "/Users/guyer/Documents/research/FiPy/fipy/fipy/solvers/petsc/petscKrylovSolver.py", line 64, in _solve_
    ksp.solve(b, x)
  File "PETSc/KSP.pyx", line 397, in petsc4py.PETSc.KSP.solve
petsc4py.PETSc.Error: error code 73
[0] KSPSolve() at /Users/runner/miniforge3/conda-bld/petsc_1637321992139/work/src/ksp/ksp/interface/itfunc.c:1086
[0] KSPSolve_Private() at /Users/runner/miniforge3/conda-bld/petsc_1637321992139/work/src/ksp/ksp/interface/itfunc.c:852
[0] KSPSetUp() at /Users/runner/miniforge3/conda-bld/petsc_1637321992139/work/src/ksp/ksp/interface/itfunc.c:408
[0] PCSetUp() at /Users/runner/miniforge3/conda-bld/petsc_1637321992139/work/src/ksp/pc/interface/precon.c:1016
[0] PCSetUp_ILU() at /Users/runner/miniforge3/conda-bld/petsc_1637321992139/work/src/ksp/pc/impls/factor/ilu/ilu.c:144
[0] MatILUFactorSymbolic() at /Users/runner/miniforge3/conda-bld/petsc_1637321992139/work/src/mat/interface/matrix.c:6943
[0] MatILUFactorSymbolic_SeqAIJ() at /Users/runner/miniforge3/conda-bld/petsc_1637321992139/work/src/mat/impls/aij/seq/aijfact.c:1697
[0] Object is in wrong state
[0] Matrix is missing diagonal entry 6
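For reference, a minimal, self-contained sketch of the setup the snippet above assumes; the Grid1D size, variable names, and zero initial values are guesses for a small reproducer, not the contents of BlockNonDiagonal.py:

import fipy as fp

# small mesh so the assembled matrix can be printed and inspected
mesh = fp.Grid1D(nx=3)

n = fp.CellVariable(mesh=mesh, name="n", value=0.)
mu = fp.CellVariable(mesh=mesh, name="mu", value=0.)
xi = fp.CellVariable(mesh=mesh, name="xi", value=0.)

# ... define n_eq, mu_eq, xi_eq as above ...
# eq = n_eq & mu_eq & xi_eq
# eq.solve(dt=1.)   # raises PETSc error code 73 when the PETSc suite is selected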

The issue is that the block structure of the assembled matrix doesn't put each equation's own variable on the block diagonal, so some main-diagonal entries are structurally absent (hence "Matrix is missing diagonal entry 6") and PETSc's ILU preconditioner can't factor the matrix. For a 3-element Grid1D:

 1.000000      ---        ---        ---        ---        ---     1.000000  -1.000000      ---    
    ---     1.000000      ---        ---        ---        ---    -1.000000   2.000000  -1.000000  
    ---        ---     1.000000      ---        ---        ---        ---    -1.000000   1.000000  
    ---        ---        ---     1.000000  -1.000000      ---     1.000000      ---        ---    
    ---        ---        ---    -1.000000   2.000000  -1.000000      ---     1.000000      ---    
    ---        ---        ---        ---    -1.000000   1.000000      ---        ---     1.000000  
 1.000000  -1.000000      ---     1.000000      ---        ---        ---        ---        ---    
-1.000000   2.000000  -1.000000      ---     1.000000      ---        ---        ---        ---    
    ---    -1.000000   1.000000      ---        ---     1.000000      ---        ---        ---    

Trilinos GMRES has no problem. SciPy and PySparse don't either, but they invoke an LU solver, which I think copes with the missing diagonal entries automatically through pivoting.
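For anyone who wants to regenerate the printout above, a sketch using FiPy's matrix-caching hooks; cacheMatrix() and the matrix property are part of the Term interface, and this assumes the matrix is cached during assembly, before the failing KSP call:

# ask FiPy to keep the assembled matrix around so it can be printed
eq.cacheMatrix()
try:
    eq.solve(dt=1.)
except Exception:
    pass                # the PETSc KSP solve is what raises error code 73
print(eq.matrix)        # prints the "---" / numeric layout shown above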

guyer commented 2 years ago

See example notebook

guyer commented 2 years ago

Need to run PETSc with -help to see if there's an option that accommodates this.
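As a sketch of what such an experiment might look like, options can be pushed into PETSc's global options database through petsc4py before the solve; whether FiPy's KSP objects pick them up depends on the solver calling setFromOptions, so treat this as something to try rather than a confirmed fix:

from petsc4py import PETSc

opts = PETSc.Options()
# avoid the ILU factorization that trips over the missing diagonal entries
opts["pc_type"] = "none"
# or ask the factorization to permute for a nonzero diagonal
# opts["pc_factor_nonzeros_along_diagonal"] = 1e-10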

guyer commented 2 years ago

Why isn't FiPy generating a matrix with populated diagonal blocks, though?

guyer commented 2 years ago

Do we precondition differently between Trilinos and PETSc?
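One way to compare the two suites on the same script is FiPy's FIPY_SOLVERS switch, set before fipy is imported; this doesn't by itself reveal which preconditioner each suite installs, but it makes flipping between them easy while digging:

import os
os.environ["FIPY_SOLVERS"] = "trilinos"   # or "petsc", "scipy", "pysparse"
import fipy as fp                         # the suite is chosen at import time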

codepath1 commented 4 months ago

Hi guyer! This seems to be due to the lack of preallocated space for the diagonal entries when the sparse matrix is stored. Try:

A.setOption(PETSc.Mat.Option.NEW_NONZERO_ALLOCATION_ERR, False)
A.assemblyBegin()
for n in range(6, 8):
    A[n, n] = 0
A.assemblyEnd()

guyer commented 4 months ago

@codepath1 Thank you for the suggestion. I'll have to look into it, although I'm surprised it's necessary. Generally speaking, in order to do this, we'd need some way to know that the diagonal hasn't been set. Also, I don't think FiPy should be generating this block structure in the first place; rather, I'd expect:

XX
 XX
X X
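To confirm which diagonal entries are actually absent, one possibility is to pull the cached matrix into NumPy; numpyArray densifies the matrix, so this only shows which diagonal values are zero, not whether PETSc allocated them, but for the pattern printed above the two coincide:

import numpy as np

M = eq.matrix.numpyArray                  # dense copy of the cached FiPy matrix
missing = np.flatnonzero(np.diag(M) == 0)
print("zero diagonal entries at rows:", missing)   # [6 7 8] for the 3-cell case above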