gridap / GridapP4est.jl


Tests failing with p4est compiled in debug mode #29

Status: Open · opened by amartinhuertas 1 year ago

amartinhuertas commented 1 year ago

When running the tests on my local machine at commit cdcda1e35eb85a21f86c229a24f12faf8f7019a6, with p4est compiled in debug mode, I obtain the error below, caused by a failing internal assertion. Note that this assertion is compiled out in release mode, which may explain why we do not see the error when p4est is compiled in release mode, as in, e.g., GitHub Actions.
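For context, the assertion that fires is inside p4est_inflate (src/p4est_io.c), which, if I read the p4est sources correctly, expects the quadrants sc_array to contain raw p4est_qcoord_t values, (P4EST_DIM + 1) of them per quadrant (x, y, [z], level), rather than full p4est_quadrant_t structs. The debug-only check is precisely `quadrants->elem_size == sizeof (p4est_qcoord_t)`. Below is a minimal C sketch of that contract, not the package code: the helper name and the memcpy-from-packed-coordinates setup are illustrative assumptions, and the idea that our wrapper passes an sc_array with the wrong elem_size is just a hypothesis to verify.

```c
/* Sketch of the contract p4est_inflate appears to expect (assumption based on
 * reading src/p4est_io.c): the "quadrants" array holds raw p4est_qcoord_t
 * entries, (P4EST_DIM + 1) per quadrant (x, y, [z], level), not
 * p4est_quadrant_t structs. The debug-mode assertion checks elem_size. */
#include <string.h>
#include <p4est.h>
#include <p4est_io.h>

static p4est_t *
rebuild_from_packed_coords (sc_MPI_Comm mpicomm, p4est_connectivity_t * conn,
                            const p4est_gloidx_t * global_first_quadrant,
                            const p4est_gloidx_t * pertree,
                            p4est_locidx_t num_local_quadrants,
                            const p4est_qcoord_t * packed_coords)
{
  /* elem_size must be sizeof (p4est_qcoord_t); passing an array with
   * elem_size == sizeof (p4est_quadrant_t) would trip the assertion in the
   * log above when p4est is built with --enable-debug. */
  sc_array_t *quadrants =
    sc_array_new_count (sizeof (p4est_qcoord_t),
                        (size_t) num_local_quadrants * (P4EST_DIM + 1));
  memcpy (quadrants->array, packed_coords,
          quadrants->elem_size * quadrants->elem_count);

  p4est_t *p4est =
    p4est_inflate (mpicomm, conn, global_first_quadrant, pertree,
                   quadrants, NULL /* no per-quadrant user data */, NULL);

  /* Assuming p4est_inflate copies the coordinates into the new forest,
   * the temporary array can be released here. */
  sc_array_destroy (quadrants);
  return p4est;
}
```

If this reading is right, the thing to check on our side is the elem_size of the sc_array we hand to p4est_inflate from the Julia wrapper; reproducing requires a p4est build configured with --enable-debug, since release builds skip the assertion entirely.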

     Testing Running tests...
WARNING: Method definition num_parts(MPI.Comm) in module PartitionedArrays at /home/amartin/.julia/packages/PartitionedArrays/g9b74/src/MPIBackend.jl:3 overwritten in module GridapP4est at /home/amartin/git-repos/GridapP4est.jl/src/PartitionedArraysExtensions.jl:1.
  ** incremental compilation may be fatally broken for this module **

WARNING: Method definition get_part_id(MPI.Comm) in module PartitionedArrays at /home/amartin/.julia/packages/PartitionedArrays/g9b74/src/MPIBackend.jl:2 overwritten in module GridapP4est at /home/amartin/git-repos/GridapP4est.jl/src/PartitionedArraysExtensions.jl:10.
  ** incremental compilation may be fatally broken for this module **

cmd = `/home/amartin/software_installers/openmpi-3.1.2-install-gnu9/bin/mpiexec -n 6 --allow-run-as-root --oversubscribe /home/amartin/software_installers/julia-1.7.2/bin/julia -Cnative -J/home/amartin/software_installers/julia-1.7.2/lib/julia/sys.so --depwarn=yes --check-bounds=yes -g1 --color=yes --startup-file=no --project=. /home/amartin/git-repos/GridapP4est.jl/test/OctreeDistributedDiscreteModelsTests.jl`
[libsc] This is libsc 2.2
[libsc] CPP                      mpicc -E
[libsc] CPPFLAGS                 
[libsc] CC                       mpicc
[libsc] CFLAGS                   -g -O2
[libsc] LDFLAGS                  
[libsc] LIBS                       -lz -lm   
[libsc] Shared memory node communicator size: 4
[p4est] This is p4est 2.2
[p4est] CPP                      mpicc -E
[p4est] CPPFLAGS                 
[p4est] CC                       mpicc
[p4est] CFLAGS                   -g -O2
[p4est] LDFLAGS                  
[p4est] LIBS                       -lz -lm   
[p4est 1]  first tree 2 first quadrant 0 global quadrant 8
[p4est 1]  last tree 3 last quadrant 3 global quadrant 15
[p4est 1]  total local quadrants 8
[p4est] Into p4est_new with min quadrants 0 level 1 uniform 1
[p4est]  New p4est with 4 trees on 2 processors
[p4est]  Initial level 1 potential global quadrants 16 per tree 4
[p4est 0]  first tree 0 first quadrant 0 global quadrant 0
[p4est 0]  last tree 1 last quadrant 3 global quadrant 7
[p4est 0]  total local quadrants 8
[p4est] Done p4est_new with 16 total quadrants
[p4est] Into p4est_ghost_new CORNER
[p4est 1]  ghost layer post count receive from 0
[p4est 1]  ghost layer post count send 4 to 0
[p4est 0]  ghost layer post count receive from 1
[p4est 0]  ghost layer post count send 4 to 1
[p4est 1]  Total quadrants skipped 2 ghosts to receive 4
[p4est 1]  ghost layer post ghost receive 4 quadrants from 0
[p4est 1]  ghost layer post ghost send 4 quadrants to 0
[p4est 0]  Total quadrants skipped 2 ghosts to receive 4
[p4est 0]  ghost layer post ghost receive 4 quadrants from 1
[p4est 0]  ghost layer post ghost send 4 quadrants to 1
[p4est] Done p4est_ghost_new
[p4est] Into p4est_lnodes_new, degree 1
[p4est 0]  Total of 20 bytes sent to 1 processes
[p4est 0]  Total of 0 bytes received from 0 processes
[p4est 1]  Total of 0 bytes sent to 0 processes
[p4est 1]  Total of 20 bytes received from 1 processes
[p4est 0]  Processor 0 shares 5 nodes with processor 1
[p4est 0]  Processor 0 owns 5 nodes used by processor 1
[p4est 1]  Processor 1 shares 5 nodes with processor 0
[p4est 1]  Processor 1 owns 0 nodes used by processor 0
[p4est 1]  Processor 1 borrows 5 nodes from processor 0
[p4est 0]  Processor 0 borrows 0 nodes from processor 1
[p4est]  Statistics for   Nodes per processor
[p4est]     Global number of values:       2
[p4est]     Mean value (std. dev.):           12.5 (2.5 = 20%)
[p4est]     Minimum attained at rank       1: 10
[p4est]     Maximum attained at rank       0: 15
[p4est] Done p4est_lnodes_new with 25 global nodes
[p4est 1]  Into refine tree 2 with 4
[p4est 1]  Done refine tree 2 now 16
[p4est 1]  Into refine tree 3 with 4
[p4est 1]  Done refine tree 3 now 16
[p4est] Into p4est_refine with 16 total quadrants, allowed level 29
[p4est 0]  Into refine tree 0 with 4
[p4est 0]  Done refine tree 0 now 16
[p4est 0]  Into refine tree 1 with 4
[p4est 0]  Done refine tree 1 now 16
[p4est] Done p4est_refine with 64 total quadrants
[p4est] Into p4est_inflate
[libsc 2] Abort: Assertion 'quadrants->elem_size == sizeof (p4est_qcoord_t)'
[libsc 2] Abort: src/p4est_io.c:118
[libsc 3] Abort: Assertion 'quadrants->elem_size == sizeof (p4est_qcoord_t)'
[libsc 3] Abort: src/p4est_io.c:118
[libsc 3] Abort
[libsc 2] Abort
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 3 in communicator MPI COMMUNICATOR 4 SPLIT FROM 3
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

signal (15): Terminated
in expression starting at /home/amartin/git-repos/GridapP4est.jl/test/OctreeDistributedDiscreteModelsTests.jl:78

signal (15): Terminated
in expression starting at /home/amartin/git-repos/GridapP4est.jl/test/OctreeDistributedDiscreteModelsTests.jl:78

signal (15): Terminated
in expression starting at /home/amartin/git-repos/GridapP4est.jl/test/OctreeDistributedDiscreteModelsTests.jl:79

signal (15): Terminated
in expression starting at /home/amartin/git-repos/GridapP4est.jl/test/OctreeDistributedDiscreteModelsTests.jl:79
[sistemas-ThinkPad-X1-Carbon-6th:12710] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[sistemas-ThinkPad-X1-Carbon-6th:12710] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
OctreeDistributedDiscreteModelsTests.jl: Error During Test at /home/amartin/git-repos/GridapP4est.jl/test/runtests.jl:32
  Got exception outside of a @test
  failed process: Process(`/home/amartin/software_installers/openmpi-3.1.2-install-gnu9/bin/mpiexec -n 6 --allow-run-as-root --oversubscribe /home/amartin/software_installers/julia-1.7.2/bin/julia -Cnative -J/home/amartin/software_installers/julia-1.7.2/lib/julia/sys.so --depwarn=yes --check-bounds=yes -g1 --color=yes --startup-file=no --project=. /home/amartin/git-repos/GridapP4est.jl/test/OctreeDistributedDiscreteModelsTests.jl`, ProcessExited(1)) [1]

  Stacktrace:
    [1] pipeline_error
      @ ./process.jl:531 [inlined]
    [2] run(::Cmd; wait::Bool)
      @ Base ./process.jl:446
    [3] run
      @ ./process.jl:444 [inlined]
    [4] (::Main.GridapP4estTests.var"#1#3"{String, Int64, Bool, String, String})(cmd::Cmd)
      @ Main.GridapP4estTests ~/git-repos/GridapP4est.jl/test/runtests.jl:50
    [5] (::MPI.var"#28#29"{Main.GridapP4estTests.var"#1#3"{String, Int64, Bool, String, String}})(cmd::Cmd)
      @ MPI ~/.julia/packages/MPI/08SPr/src/environment.jl:25
    [6] _mpiexec
      @ ~/.julia/packages/MPI/08SPr/deps/deps.jl:6 [inlined]
    [7] mpiexec(fn::Main.GridapP4estTests.var"#1#3"{String, Int64, Bool, String, String})
      @ MPI ~/.julia/packages/MPI/08SPr/src/environment.jl:25
    [8] macro expansion
      @ ~/git-repos/GridapP4est.jl/test/runtests.jl:33 [inlined]
    [9] macro expansion
      @ ~/software_installers/julia-1.7.2/share/julia/stdlib/v1.7/Test/src/Test.jl:1359 [inlined]
   [10] macro expansion
      @ ./timing.jl:220 [inlined]
   [11] run_tests(testdir::String)
      @ Main.GridapP4estTests ~/git-repos/GridapP4est.jl/test/runtests.jl:32
   [12] top-level scope
      @ ~/git-repos/GridapP4est.jl/test/runtests.jl:56
   [13] include(fname::String)
      @ Base.MainInclude ./client.jl:451
   [14] top-level scope
      @ none:6
   [15] eval
      @ ./boot.jl:373 [inlined]
   [16] exec_options(opts::Base.JLOptions)
      @ Base ./client.jl:268
   [17] _start()
      @ Base ./client.jl:495
Test Summary:                           | Error  Total
OctreeDistributedDiscreteModelsTests.jl |     1      1
Test Summary:                           | Error  Total
OctreeDistributedDiscreteModelsTests.jl |     1      1
ERROR: LoadError: Some tests did not pass: 0 passed, 0 failed, 1 errored, 0 broken.
in expression starting at /home/amartin/git-repos/GridapP4est.jl/test/runtests.jl:1

caused by: Some tests did not pass: 0 passed, 0 failed, 1 errored, 0 broken.
ERROR: Package GridapP4est errored during testing