MakisH closed this issue 5 years ago.
And here is the domain decomposition (upper: Fluid, lower: Solid):
Hard to tell. `computeState` is a lengthy method. The line number (or code block) where it crashes would probably help a lot.
Also, could you please switch on Debug and Trace output in `ReceivedPartition`?
I already have `<sink type="stream" output="stdout" filter="%Severity% >= trace" enabled="true" />` in the config and I have built with `RelWithDebInfo`. What else could I enable?
> What else could I enable?
I recently had similar issues. Are you sure that you link against the correct `libprecice`? (The logging output above certainly does not contain debug output.)
I refactored this behemoth of a function in #369. Maybe this helps to track down the problem.
We now see that it goes down to Eigen, more specifically to `normalized`, which is used a few times in `computeState`. Do you use different versions of Eigen on your two systems?
My best current guess is that we might have defective edges or triangles, e.g. edges with length 0, but so far it's only a guess. I need to have a closer look.
To check that, you could also take the system that works and plot the mesh connectivity, e.g. via VTK output.
Indeed, it could originate from a wrong description of the triangles, although it works in serial and the problem does not seem to be at the parallel boundary.
Serial:
Parallel:
Both pictures are from the working system, at `dt99`. I only export from the Solid participant, which does not use the `Fluid-Centers` mesh in the parallel variant.
@DavidSCN do you get a similar picture in parallel?
Unfortunately yes. But exporting the `Fluid-Mesh-Nodes` on the `Fluid` participant shows no erroneous behavior:
Could this originate from the mapping? What exactly is the difference between two participants exporting a certain mesh?!
> Could this originate from the mapping? What exactly is the difference between two participants exporting a certain mesh?!
The receiving participant filters out mesh parts that it does not need (for the mapping). So, these meshes can be perfectly normal. To understand this better, have a look at the Solid mesh of the participant Solid.
Could you please upload the VTK files?
Which Eigen versions do you use?
Here are the VTK files I used. On the working system, I use Eigen 3.3.4 (from the Ubuntu 18.04 repositories). On the system where it crashes, I use Eigen 3.3~beta1-2 (from the Ubuntu 16.04 repositories).
OK, this makes sense, since the solid mesh is coarser than the fluid mesh. Still, some nodes/triangles are skipped:
I attached the exports from the `Fluid` and `Solid` participants:
mesh_files.zip
I use the libeigen3-dev package from 18.04, which should be the same Eigen version as mentioned above.
Anyway, this might be a problem, but is not the reason for this issue.
The incorrect filtering is to some extent an old friend, as we have seen similar behavior in FEAP. I opened another issue: #371. It could have a similar cause as this issue, since I suspect a problem with the normal vectors of triangles, which are also computed in `computeState()`. I will first look into #371.
In the OpenFOAM adapter, we call the preCICE function `setMeshTriangleWithEdges(meshID, v1, v2, v3)`. I assume that the order of the vertices v1, v2, v3 in the triangle definition doesn't matter for preCICE.
I am trying to reproduce this error with the latest develop and the same OpenFOAM adapter/tutorial and I get the following error in both participants:
I assume this means that we are trying to create triangles with two or three identical points. Strange that we get this now, although before at least one of the two meshes (both, I think) was triangulated successfully.
@DavidSCN did you get the same error?
edit: I get this both with OpenFOAM v1906 and with OpenFOAM 5.
Yes, I will have a look at the implementation again.
> Strange that we now get this although before we got at least one of the two meshes (both, I think) successfully triangulated.
This safety check was only recently introduced.
Yes, I meant that it should not complain even with the check, since before this was not an issue.
But good that, in the end, it was simply the check that was wrong!
Still, after merging #503, I get the same error in the OpenFOAM adapter. I assume this means that there are some wrong and some correct triangles either way, and this is then an adapter issue. @KyleDavisSA how is nearest-projection in CalculiX affected by this?
The situation has not changed after merging #503 from the adapter side: `setMeshTriangleWithEdges` already fails the first time it is called, although all three IDs are different.
Can you reproduce this behavior with a test?
The checks are now fine, I still need to check if all the triangles are ok.
Just checked the adapter side and there are no error messages anymore.
So! This looks good on my side. No errors, and all the triangles look good. The parallel VTKs look a bit weird, but the results look good, so this is a separate (visualization-related) issue.
Here are all the complete results: buoyantPimpleFoam-laplacianFoam_nearest-projection_vtk.tar.gz
I stumbled upon a crash while trying to test Nearest-Projection mapping with OpenFOAM (see https://github.com/precice/openfoam-adapter/pull/46) in parallel. Since we don't use Nearest-Projection so widely, maybe this is an old problem, originating from preCICE. In any case, no error message is given.
Some details:
`ReceivedPartition::compute()`: https://github.com/precice/precice/blob/143b6e5043c5bad984cc8578e7c95d2dd3590b77/src/partition/ReceivedPartition.cpp#L198-L207
`void Mesh::computeState()`: https://github.com/precice/precice/blob/143b6e5043c5bad984cc8578e7c95d2dd3590b77/src/mesh/Mesh.cpp#L279