jorgensd / mesh_converter

A mesh converter from EXODUS 2 to XDMF
MIT License

PETSc segfault when loading facet meshtags #4

Closed: drtloudon closed this issue 5 months ago

drtloudon commented 5 months ago

I've created a simple Exodus tetrahedral mesh of the domain (-1, +1)^3 with cell blocks and facet blocks. I'm able to successfully convert the mesh to .xdmf format using mesh_converter. However, when I try to read the facet meshtags into dolfinx, I get a PETSc segfault. The following lines of code reproduce the error:

from dolfinx import io
from mpi4py import MPI
xfile = io.XDMFFile(MPI.COMM_WORLD, 'unit_cube_tet.xdmf', 'r')
mesh = xfile.read_mesh(name='Mesh')
block_data = xfile.read_meshtags(mesh, name='Mesh')
fmesh = xfile.read_mesh(name='Facet_Mesh')
bdry_data = xfile.read_meshtags(fmesh, name='Facet_Mesh')  # Problem: this line triggers the segfault

The above runs fine with the last line commented out. The error message is:

[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is causing the crash.
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

The .xdmf file looks perfect in ParaView and correctly shows the cell blocks and facet blocks. Any help is appreciated. My goal is to use the bdry_data meshtags to apply boundary conditions in dolfinx.

jorgensd commented 5 months ago

I would need the xdmf and h5 file to diagnose the issue.

drtloudon commented 5 months ago

I'm able to recreate the error with the mesh I emailed you yesterday, but I'll also send you this particular file. Thanks.

jorgensd commented 5 months ago

You are reading the facet tags in the wrong way: facet tags should be read onto the parent mesh (after creating the facet-to-cell connectivity), not onto a separately read facet mesh. Here is an MWE:

from dolfinx import io
from mpi4py import MPI

xfile = io.XDMFFile(MPI.COMM_WORLD, 'quarter_shell_tet.xdmf', 'r')
mesh = xfile.read_mesh(name='Mesh')

block_data = xfile.read_meshtags(mesh, name='Mesh')
# Facet tags must be read onto the parent mesh, which requires the
# facet-to-cell connectivity to exist first
mesh.topology.create_connectivity(mesh.topology.dim - 1, mesh.topology.dim)
bdry_data = xfile.read_meshtags(mesh, name='Facet_Mesh')
xfile.close()
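
Since your end goal is boundary conditions: once the facet tags live on the parent mesh, you can use them to locate dofs. A rough sketch follows; the tag value 1, the P1 Lagrange space and the zero boundary value are placeholders for your setup, and the constructor is spelled fem.functionspace (newer dolfinx) or fem.FunctionSpace (older versions).

from dolfinx import fem
from petsc4py import PETSc

# Placeholder P1 space; replace with whatever space your problem uses
V = fem.functionspace(mesh, ("Lagrange", 1))

# Facets carrying the (assumed) tag value 1 in bdry_data
facets = bdry_data.find(1)

# Locate the dofs on those facets and build a homogeneous Dirichlet BC
dofs = fem.locate_dofs_topological(V, mesh.topology.dim - 1, facets)
bc = fem.dirichletbc(PETSc.ScalarType(0.0), dofs, V)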