guyer opened this issue 2 years ago
The system of equations (see example notebook) raises an `Exception` with PETSc. The issue is that the matrix block structure is not block-diagonal for a 3-element `Grid1D`.

Trilinos GMRES has no problem. Neither do SciPy nor PySparse, but they invoke an LU solver, which diagonalizes automatically, I think.

Need to run PETSc with `-help` to see if there's an option that accommodates this.

Why isn't FiPy generating a block-diagonal matrix, though? Do we precondition differently between Trilinos and PETSc?
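Regarding running PETSc with `-help`: options can be forwarded from Python through `petsc4py.init()`, as long as it runs before `PETSc` is first imported. The sketch below is plain petsc4py usage and assumes nothing about how FiPy's solver layer exposes these options.

```python
import sys
import petsc4py

# Options must be registered before the first `from petsc4py import PETSc`.
petsc4py.init(sys.argv + ['-help'])
from petsc4py import PETSc

# Options can also be set programmatically through the options database.
opts = PETSc.Options()
opts['pc_type'] = 'lu'   # e.g. select a direct-factorization preconditioner
```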
codepath1 commented:

Hi guyer! This seems to be due to the lack of preallocated space when storing sparse matrices. Try:

```python
from petsc4py import PETSc  # here A is the already-assembled PETSc matrix

A.setOption(PETSc.Mat.Option.NEW_NONZERO_ALLOCATION_ERR, False)
A.assemblyBegin()
for n in range(6, 8):  # insert explicit zeros on the rows missing a diagonal
    A[n, n] = 0
A.assemblyEnd()
```
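To make that concrete, here is a self-contained sketch of the workaround; the size and sparsity pattern are invented for illustration and are not FiPy's actual matrix:

```python
from petsc4py import PETSc

A = PETSc.Mat().createAIJ([8, 8], nnz=2)   # modest preallocation per row
for i in range(6):
    A[i, i] = 1.0                          # rows 6 and 7 get no diagonal entry
A.assemble()

# Inserting at a location that was never allocated would normally abort with
# "new nonzero caused a malloc"; clearing this option permits the insertion.
A.setOption(PETSc.Mat.Option.NEW_NONZERO_ALLOCATION_ERR, False)
A.assemblyBegin()
for i in range(6, 8):
    A[i, i] = 0.0                          # explicit zeros complete the diagonal
A.assemblyEnd()
```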
@codepath1 Thank you for the suggestion. I'll have to look into it, although I'm surprised it's necessary. Generally speaking, in order to do this, we'd need some way to know that the diagonal hasn't been set. Also, I don't think FiPy should be generating this block structure in the first place; rather, I'd expect:
```
XX
XX
X X
```
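As a rough illustration of the bookkeeping that "knowing the diagonal hasn't been set" would involve, here is a petsc4py sketch; `rows_missing_diagonal` is our own hypothetical helper, not anything in FiPy or PETSc:

```python
from petsc4py import PETSc

def rows_missing_diagonal(A):
    """Local row indices of assembled matrix A with no stored diagonal entry."""
    start, end = A.getOwnershipRange()       # rows owned by this process
    return [i for i in range(start, end)
            if i not in A.getRow(i)[0]]      # getRow -> (column indices, values)

# Toy usage: rows 2 and 3 never receive a diagonal entry.
A = PETSc.Mat().createAIJ([4, 4], nnz=2)
A[0, 0] = 1.0
A[1, 1] = 1.0
A.assemble()
print(rows_missing_diagonal(A))              # -> [2, 3]
```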