# GridapPETSc

GridapPETSc is a plugin of GridapDistributed.jl that provides the full set of scalable linear and nonlinear solvers in the PETSc library. It also provides serial solvers for Gridap.jl.
Take a look at this tutorial to learn how to use GridapPETSc in distributed-memory simulations of PDEs. It can also be used in the serial case, as shown in this test.
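As a minimal serial sketch (assuming Gridap's standard Poisson API; the PETSc option string and the problem setup are illustrative, not prescribed by the package), a PETSc Krylov solver can be plugged into a Gridap FE problem as follows:

```julia
using Gridap
using GridapPETSc

# PETSc options passed in command-line style (illustrative choices)
options = "-ksp_type cg -pc_type gamg -ksp_monitor"

GridapPETSc.with(args=split(options)) do
  # Simple Poisson problem on the unit square
  model = CartesianDiscreteModel((0,1,0,1),(10,10))
  reffe = ReferenceFE(lagrangian,Float64,1)
  V = TestFESpace(model,reffe,dirichlet_tags="boundary")
  U = TrialFESpace(V,0.0)
  Ω = Triangulation(model)
  dΩ = Measure(Ω,2)
  a(u,v) = ∫( ∇(v)⋅∇(u) )*dΩ
  l(v) = ∫( v )*dΩ
  op = AffineFEOperator(a,l,U,V)

  # Solve with a PETSc linear solver configured by the options above
  solver = LinearFESolver(PETScLinearSolver())
  uh = solve(solver,op)
end
```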
The GridapPETSc Julia package requires the PETSc library (Portable, Extensible Toolkit for Scientific Computation) and MPI to work correctly. You have two main options to install these dependencies:
- Do nothing [recommended in most cases]. Use the default precompiled MPI installation provided by MPI.jl and the precompiled PETSc library provided by PETSc_jll. This happens under the hood when you install GridapPETSc. You can also force the installation of these default dependencies by setting the environment variable `JULIA_PETSC_LIBRARY` to an empty value (see the sketch after this list).
- Choose a specific installation of MPI and PETSc available on the system [recommended in HPC clusters]:
  - First, select an MPI installation. See the documentation of MPI.jl for further details.
  - Second, select a PETSc installation. To this end, create an environment variable `JULIA_PETSC_LIBRARY` containing the path to the dynamic library object of the PETSc installation (i.e., the `.so` file on Linux systems). Very important: the chosen PETSc library needs to be configured with the MPI installation selected in the previous step. A configuration sketch is given after this list.
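Both settings can be applied from Julia before rebuilding the package. A minimal sketch follows, assuming the variable is read when the package is (re)built; the PETSc path is a hypothetical placeholder:

```julia
# Option 1: force the default precompiled MPI/PETSc dependencies.
ENV["JULIA_PETSC_LIBRARY"] = ""

# Option 2: point to a system PETSc dynamic library instead
# (hypothetical path; the build must be configured with the chosen MPI).
# ENV["JULIA_PETSC_LIBRARY"] = "/opt/petsc/lib/libpetsc.so"

using Pkg
Pkg.build("GridapPETSc")  # rebuild so that the new setting takes effect
```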
GridapPETSc's default sparse matrix format is 0-based compressed sparse row (CSR). This storage format is described by the `SparseMatrixCSR{0,PetscReal,PetscInt}` and `SymSparseMatrixCSR{0,PetscReal,PetscInt}` Julia types, as implemented in the SparseMatricesCSR Julia package.
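For illustration, a 0-based CSR matrix of this kind can be assembled from COO data with SparseMatricesCSR; the entries below are made up:

```julia
using SparseMatricesCSR

# COO data of a small sparse matrix (illustrative values)
rows = [1, 2, 3]
cols = [1, 2, 3]
vals = [4.0, 5.0, 6.0]

# Build a 0-based CSR matrix, analogous to the storage used by GridapPETSc
A = sparsecsr(SparseMatrixCSR{0}, rows, cols, vals)
```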
When running in MPI-parallel mode (i.e., with a communicator other than MPI.COMM_SELF), GridapPETSc implements a sort of limited garbage collector in order to automatically deallocate PETSc objects. This garbage collector can be triggered manually by calling the function `GridapPETSc.gridap_petsc_gc()`. GridapPETSc calls this function automatically at different strategic points, which is sufficient for most applications. However, for applications that allocate PETSc objects very frequently, it might be necessary to call this function from application code. This need is signaled by PETSc via the following internal error message:

```
PETSC ERROR: No more room in array, limit 256
recompile src/sys/objects/destroy.c with larger value for MAXREGDESOBJS
```
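As a hypothetical sketch (the loop body is a placeholder), an application that allocates PETSc objects in a tight loop could trigger the collector explicitly:

```julia
using GridapPETSc

GridapPETSc.with() do
  for step in 1:10_000
    # ... allocate and solve a PETSc-backed problem here (placeholder) ...
    GridapPETSc.gridap_petsc_gc()  # deallocate unreferenced PETSc objects
  end
end
```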