Closed — kks32 closed this pull request 3 years ago
Merging #721 (2531ed9) into develop (087bd86) will decrease coverage by 0.02%. The diff coverage is 66.67%.
@@            Coverage Diff             @@
##           develop     #721      +/- ##
===========================================
- Coverage    96.77%   96.75%    -0.02%
===========================================
  Files          130      130
  Lines        25920    25932       +12
===========================================
+ Hits         25083    25090        +7
- Misses         837      842        +5
Impacted Files | Coverage Δ |
---|---|
include/mesh.tcc | 82.77% <50.00%> (-0.22%) :arrow_down: |
include/solvers/mpm_explicit.tcc | 93.24% <77.78%> (-2.34%) :arrow_down: |
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 087bd86...2531ed9. Read the comment docs.
Describe the PR

When resuming from step 0 with manually written particle HDF5 files, the solver typically calls `mesh_->resume_domain_cell_ranks()`, which assigns the current MPI rank to a cell if that cell contains particles. This works on the assumption that a given cell has particles only on the current rank, which is true for particles generated by our MPM code but not for particles generated manually and written to an HDF5 file. When particles in the same cell are distributed across different MPI ranks, the MPI_Reduce produces a rank larger than MPI_SIZE (bad!). Even assigning the cell to a different rank within MPI_SIZE is bad for efficiency; this is resolved in the subsequent repartitioning step. This PR adds a boolean flag called
`repartition: true` in the `resume` section of the input JSON to enable repartitioning when resuming from an HDF5 file. This flag is optional and defaults to false.

Additional context

Input file change:
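As a rough illustration of the change described above, the `resume` block might look like the following. Only the `repartition` key is the flag added by this PR; the surrounding keys (`resume`, `step`, `uuid`) and their values are shown as plausible context and may differ from the actual input schema:

```json
{
  "analysis": {
    "resume": {
      "resume": true,
      "step": 0,
      "uuid": "mpm-example",
      "repartition": true
    }
  }
}
```

Omitting `repartition`, or setting it to `false`, preserves the existing behavior of assigning cell ranks directly from the particle data.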