Closed hermanmakhm closed 2 years ago

FEniCSx: Last pull from main on 2022-03-15 via spack.

I attempted to assemble a rectangular matrix, which worked in serial (mpirun -np 1 python *.py) but didn't work when I attempted to run the same code in parallel (mpirun -np 2 python *.py). The sample code to reproduce the problem and the error message from running it in parallel are included below.
Could you update the example so that it doesn't require multiphenicsx? I can't test because I don't have multiphenicsx installed, but from inspection I think you need to change
V = dolfinx.fem.FunctionSpace(mesh, ("Lagrange", 2))
u = ufl.TrialFunction(V)
v = ufl.TestFunction(V)
a = [[ufl.inner(u, v)*ufl.dx],
     [ufl.inner(u, v)*ufl.dx]]
to
V0 = dolfinx.fem.FunctionSpace(mesh, ("Lagrange", 2))
V1 = V0.clone()
u = ufl.TrialFunction(V0)
v0, v1 = ufl.TestFunction(V0), ufl.TestFunction(V1)
a = [[ufl.inner(u, v0)*ufl.dx],
     [ufl.inner(u, v1)*ufl.dx]]
I have updated the example (with the mesh from this tutorial) and also implemented your suggestions as follows
import mpi4py.MPI as MPI
import ufl
import dolfinx.fem
import dolfinx.fem.petsc
import dolfinx.mesh
mesh = dolfinx.mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, dolfinx.mesh.CellType.quadrilateral)
# Define two function spaces: V1 is a clone of V0 so that each block row has its own test space
V0 = dolfinx.fem.FunctionSpace(mesh, ("Lagrange", 2))
V1 = V0.clone()
u = ufl.TrialFunction(V0)
v0, v1 = ufl.TestFunction(V0), ufl.TestFunction(V1)
# 2x1 block form: a rectangular (tall) matrix with a single trial space
a = [[ufl.inner(u, v0) * ufl.dx],
     [ufl.inner(u, v1) * ufl.dx]]
a_cpp = dolfinx.fem.form(a)
# Assemble the rectangular block matrix
A = dolfinx.fem.petsc.assemble_matrix_block(a_cpp, bcs=[])
A.assemble()
print("Finished")
It again runs to completion with mpirun -np 1 and gives the following error with mpirun -np 2:
Traceback (most recent call last):
File "/stck/hmmak/fenicsx-scripts/10deg_naca12/test_rectangular_assembly.py", line 18, in <module>
A = dolfinx.fem.petsc.assemble_matrix_block(a_cpp, bcs=[])
File "/stck/hmmak/tools/spack/var/spack/environments/2022-03-15-real-fenicsx/.spack-env/._view/c7ohj445pwtcbtlnhace2txzvkz7qrsg/lib/python3.9/functools.py", line 888, in wrapper
return dispatch(args[0].__class__)(*args, **kw)
File "/stck/hmmak/tools/spack/var/spack/environments/2022-03-15-real-fenicsx/.spack-env/view/lib/python3.9/site-packages/dolfinx/fem/petsc.py", line 383, in assemble_matrix_block
return assemble_matrix_block(A, a, bcs, diagonal, constants, coeffs)
File "/stck/hmmak/tools/spack/var/spack/environments/2022-03-15-real-fenicsx/.spack-env/._view/c7ohj445pwtcbtlnhace2txzvkz7qrsg/lib/python3.9/functools.py", line 888, in wrapper
return dispatch(args[0].__class__)(*args, **kw)
File "/stck/hmmak/tools/spack/var/spack/environments/2022-03-15-real-fenicsx/.spack-env/view/lib/python3.9/site-packages/dolfinx/fem/petsc.py", line 410, in _
A.assemble(PETSc.Mat.AssemblyType.FLUSH)
File "PETSc/Mat.pyx", line 1118, in petsc4py.PETSc.Mat.assemble
petsc4py.PETSc.Error: error code 63
[1] MatAssemblyEnd() at /tmp/hmmak/spack-stage/spack-stage-petsc-3.16.4-nhmtjugsebgrng3cgzdd5u5hiqdj7u2j/spack-src/src/mat/interface/matrix.c:5671
[1] MatAssemblyEnd_MPIAIJ() at /tmp/hmmak/spack-stage/spack-stage-petsc-3.16.4-nhmtjugsebgrng3cgzdd5u5hiqdj7u2j/spack-src/src/mat/impls/aij/mpi/mpiaij.c:707
[1] MatSetValues_MPIAIJ() at /tmp/hmmak/spack-stage/spack-stage-petsc-3.16.4-nhmtjugsebgrng3cgzdd5u5hiqdj7u2j/spack-src/src/mat/impls/aij/mpi/mpiaij.c:482
[1] Argument out of range
[1] Column too large: col 353 max 288
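For reference, here is a minimal sketch (assuming the same DOLFINx build and the mesh/space from the script above) that prints the per-rank and global dof counts of V0, which can be compared with the index limit in the message. For quadratic Lagrange on this 8x8 quadrilateral grid, V0 should have 17*17 = 289 global dofs, which seems to line up with the "max 288" (0-based) above.
import mpi4py.MPI as MPI
import dolfinx.fem
import dolfinx.mesh
mesh = dolfinx.mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, dolfinx.mesh.CellType.quadrilateral)
V0 = dolfinx.fem.FunctionSpace(mesh, ("Lagrange", 2))
# Owned, ghost and global dof counts of the trial (column) space on each rank
im = V0.dofmap.index_map
print(f"rank {MPI.COMM_WORLD.rank}: owned {im.size_local}, ghosts {im.num_ghosts}, global {im.size_global}")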
@hermanmakhm thanks for reporting - should be fixed now.
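In case it helps anyone verifying the change, below is a small sketch (assuming the same reproducer and DOLFINx API as above; getSize is the standard petsc4py call) that checks the assembled 2x1 block matrix has twice as many global rows as columns, independent of the number of ranks:
import mpi4py.MPI as MPI
import ufl
import dolfinx.fem
import dolfinx.fem.petsc
import dolfinx.mesh
mesh = dolfinx.mesh.create_unit_square(MPI.COMM_WORLD, 8, 8, dolfinx.mesh.CellType.quadrilateral)
V0 = dolfinx.fem.FunctionSpace(mesh, ("Lagrange", 2))
V1 = V0.clone()
u = ufl.TrialFunction(V0)
v0, v1 = ufl.TestFunction(V0), ufl.TestFunction(V1)
a_cpp = dolfinx.fem.form([[ufl.inner(u, v0) * ufl.dx],
                          [ufl.inner(u, v1) * ufl.dx]])
A = dolfinx.fem.petsc.assemble_matrix_block(a_cpp, bcs=[])
A.assemble()
# The 2x1 block layout should give a global matrix of size (2*N) x N,
# where N is the number of global dofs in V0
nrows, ncols = A.getSize()
assert ncols == V0.dofmap.index_map.size_global
assert nrows == 2 * ncols
print(f"rank {MPI.COMM_WORLD.rank}: global size {nrows} x {ncols}")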