mangerij opened this issue 8 years ago
Description of the enhancement or error report
No error message is thrown when using a certain quasi-periodic mesh with tetrahedra: the simulation just hangs in -opt mode with no console output, and in -dbg mode it hangs after the mesh load information. Usually we get a segfault (signal 11) or an error message about no boundary being found from the translation vector, but this just hangs with no errors. If I use `scheme sweep` with `hex8` elements, the input file runs without a problem.

Rationale for the enhancement or information for reproducing the error
Using the following mesh recipe in Cubit:
```
reset
create brick x 90 y 150 z 60
volume 1 move 45 0 0
create cylinder radius 50 height 60
volume 2 move 0 -12 0
subtract volume 2 from volume 1
create brick x 90 y 150 z 60
create cylinder radius 50 height 60
volume 4 move 0 -12 0
volume 3 move -45 0 0
subtract volume 4 from volume 3
create cylinder radius 50 height 60
volume 5 move 0 -12 0
create brick x 90 y 180 z 60
volume 6 move 45 0 0
subtract volume 6 from volume 5
create cylinder radius 50 height 60
volume 7 move 0 -12 0
create brick x 90 y 180 z 60
volume 8 move -45 0 0
subtract volume 8 from volume 7
create brick x 180 y 65 z 60
volume 9 move 0 -90 0
subtract volume 9 from volume 1 3 5 7

compress all
merge surface all with surface all

volume 3 4 size 3.0
volume 1 2 size 3.0
surface 8 11 size 3.0
surface 2 size 3.0
surface 6 14 size 28.0

volume 1 2 3 4 scheme tetmesh
autosmooth target off
mesh volume 1 2 3 4

block 1 volume 3 4
block 2 volume 1 2

sideset 1 surface 8 11
sideset 2 surface 16 24
sideset 3 surface 19 22
sideset 4 surface 5 10 15 20
sideset 5 surface 7 12
sideset 6 surface 9 13
sideset 7 surface 1 3
sideset 8 surface 15 20

block all element type tetra4
set large exodus file off
export Genesis "./exodus_wire.e" dimension 3 block all overwrite
```
and the following minimal boundary condition:

```
[./Periodic]
  [./u_pbc]
    variable = u
    primary = '5'
    secondary = '6'
    translation = '0 0 60'
  [../]
[../]
```
along with any kernel (I think) will reproduce the problem. The mesh on either side of the periodic boundary seems to be acceptable. Note that `scheme sweep` works with the same input file, so this is definitely an issue with `tetmesh`.

Identified impact
Seems to be a bug that end users will see when using a common type of mesh with a fairly simple boundary condition -- having no error message is inconvenient to say the least.
Hangs? In serial or parallel? It could also be the case that it's ultra slow. How long did you let it sit? In either case, thanks for the test case.
I only checked in parallel. I let it run/hang for at least 20 minutes. The parallel `hex8` problem runs almost immediately.
OK, thanks for the information. BTW, we have a script checked in that can be used to find diverging processes (usually the reason for a hang like yours). If you're interested, I can share more info on how to use it.
Any chance you can make it a little smaller? If not, that's OK, but this is a fairly large mesh for debugging.
Well, if I make the mesh coarser just by changing the size, it's clear that the meshes on the periodic surfaces don't match. I imagine this is why the problem is hanging.
It should fail with a message either way, not hang.
I ran this problem in debug mode and it fails immediately with tons of these errors:
```
[0] src/fe/fe_map.C, line 1656, compiled nodate at notime
WARNING: inverse_map of physical point (x,y,z)=( 3.09212, 43.431, 30) is not on element.

Elem Information
 id()=175028, processor_id()=0
 type()=TET4
 dim()=3
 n_nodes()=4
  0 Node id()=33775, processor_id()=0, Point=(x,y,z)=( 2.75098, 43.5945, 30)
   DoFs=(0/0/33775)
  1 Node id()=33778, processor_id()=0, Point=(x,y,z)=( 0, 44.1667, 30)
   DoFs=(0/0/33778)
  2 Node id()=34290, processor_id()=0, Point=(x,y,z)=( 2.73002, 47.2348, 30)
   DoFs=(0/0/34290)
  3 Node id()=33781, processor_id()=0, Point=(x,y,z)=( 0, 45.7017, 27.3364)
   DoFs=(0/0/33781)
 n_sides()=4
 neighbor(0)=NULL
 neighbor(1)=173871
 neighbor(2)=174950
 neighbor(3)=174764
 hmin()=2.80986, hmax()=4.37071
 volume()=4.44051
 active()=1, ancestor()=0, subactive()=0, has_children()=0
 parent()=NULL
 level()=0, p_level()=0
 refinement_flag()=DO_NOTHING
 p_refinement_flag()=DO_NOTHING
 DoFs=
```
I talked to @jwpeterson and he said your mesh approach just won't work with `tetmesh`, even with autosmooth turned off. There's no guarantee from the meshing algorithm that it will place nodes on the opposite side in exactly the same places.
This tends to bite people from time to time, so I think you've motivated us to write a mesh sanity checker that will tell people about the problem before they experience a nasty hang or crash like this. You might try meshing one surface, copying it, and then translating it to the opposite side.
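A minimal sketch of what such a checker could do, for anyone who wants to test their own mesh in the meantime. This is plain C++ with stand-in data structures, not actual libMesh/MOOSE API; a real implementation would pull the node lists from the mesh's boundary info and use a spatial search rather than the brute-force loop:

```cpp
#include <array>
#include <cmath>
#include <cstdio>
#include <vector>

using Point = std::array<double, 3>;

// Returns true if every node on the "primary" periodic boundary, shifted by
// the translation vector, lands within `tol` of some node on the "secondary"
// boundary. If any node is unmatched, the mesh is only quasi-periodic and a
// node-identification periodic BC cannot work.
bool periodic_nodes_match(const std::vector<Point> & primary,
                          const std::vector<Point> & secondary,
                          const Point & translation,
                          double tol)
{
  for (const Point & p : primary)
  {
    bool found = false;
    for (const Point & s : secondary)
    {
      // Squared distance between the translated primary node and s
      double dist2 = 0.0;
      for (int i = 0; i < 3; ++i)
      {
        const double d = p[i] + translation[i] - s[i];
        dist2 += d * d;
      }
      if (std::sqrt(dist2) <= tol)
      {
        found = true;
        break;
      }
    }
    if (!found)
    {
      std::printf("unmatched periodic node at (%g, %g, %g)\n", p[0], p[1], p[2]);
      return false;
    }
  }
  return true;
}
```

For this mesh you would feed it the nodes of sidesets 5 and 6 with a translation of (0, 0, 60); on the `tetmesh` version it should start reporting unmatched nodes immediately, which is exactly the error message this issue is asking for.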
Yeah, I realized it wasn't the right approach right after I made this issue, but I just thought it was odd that I wasn't getting any error printout on the screen, so that's why I reported it.
I really just want to do the same approach but with a `hex8` sweep, where I bias surfaces 8 and 11 (the surface of the cylinder embedded in that mesh) so that the element size stays constant in the cylinder volumes but gets coarser radially away from them, while still respecting the periodic constraint along the axis.
Thanks for looking at this. Sorry the file was large. Is there any reason why libMesh has a problem with mapping nodes from one mesh to another "not exact" place? Why can't there be an "approximately" periodic boundary condition?
There is a tolerance in there somewhere, but it's fairly tight. Anything that's not within that tolerance would be considered not equal and would thus require some sort of projection to be handled correctly.
> Is there any reason why libMesh has a problem with mapping nodes from one mesh to another "not exact" place? Why can't there be an "approximately" periodic boundary condition?
Setting up the DOF constraint equations for this in the general case would probably be tricky, and as far as I know, no one has attempted it yet. I don't know of a reason why it wouldn't work in principle, but perhaps @roystgnr knows more about that?
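To make that concrete: in the general case, the constraint for each DOF on the secondary side would presumably look like interpolating the primary side at the translated point (generic FE notation, nothing libMesh-specific):

$$ u_s = \sum_i \phi_i(x_s - t)\, u_i $$

where $x_s$ is the secondary node's location, $t$ is the translation vector, and the sum runs over the shape functions $\phi_i$ (and DOFs $u_i$) of the primary-side element containing $x_s - t$. With exactly matching meshes this collapses to one-to-one node identification; with non-matching meshes every secondary DOF gets tied to several primary DOFs, which is where the bookkeeping gets tricky.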
The trouble in principle is pinning, if what you want is an actual C0 conforming mesh. Imagine piecewise bilinear elements, with non-matching nodes except at the end points of the periodic edges. If email->GitHub works okay then take a look at this diagram:
```
A---B-------C-------D---E---F-------G
H--I-----J--K--L-----M-----N-----O--P
```
For continuity at AB, the solution slope there has to match both HI and IJ. Continuity along the rest of IJ means that the slope at BC has to be the same still, and that constrains the slope at JK.
Continuing on, you find that where you intended to have 6 or 8 elements along the side, you really effectively only have 2: AC=HK and CG=KP. This kills your convergence.
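In other words: a continuous piecewise-linear trace can only change slope at one of its own nodes, so requiring the two traces to agree pointwise,

$$ u\big|_{\mathrm{top}}(x) = u\big|_{\mathrm{bot}}(x) \quad \text{for all } x \text{ along the periodic edge}, $$

means the slope can only change where the two node sets coincide, which in the diagram above is only at A=H, C=K, and G=P. The constrained trace is piecewise linear on just $[A, C]$ and $[C, G]$, no matter how many elements either side has.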
The alternative is to not constrain your mesh to be C0 conforming; you could arbitrarily choose one of those sides to just interpolate the other. But in this case, either you've got some kind of DG handling of the jump terms, in which case you can rely on that rather than on interpolation for approximate conformity, or you don't have a formulation handling the jump terms, in which case you may not even be converging to the correct solution.
What about using AMR at some initial refinement stage to match the nodes prior to the solve commencing?
> What about using AMR at some initial refinement stage to match the nodes prior to the solve commencing?
We don't support anisotropic refinement, so if the grids start off mismatched, refining them won't make them match up...
Ah, yeah, I was thinking that refining could approximately eliminate the problem described above of the solutions on HI and IJ not matching AB: if AB and HI/IJ don't match, you could use that mismatch as a marker to refine both sides of the grid.
Reclassifying this issue as a "task" now. We know why it fails; now we just have to decide whether we should handle it better in opt mode.