precice / fenics-adapter

preCICE-adapter for the open source computing platform FEniCS
GNU Lesser General Public License v3.0

Adding connectivity data for volume coupling #152

Closed boris-martin closed 2 years ago

boris-martin commented 2 years ago

Added configuration of connectivity data for volume coupling: triangles from the coupling domain are added to the preCICE mesh. It (seems to) work in serial & parallel; however, the parallel version has "holes", as triangles made of owned+unowned vertices are discarded:


This is however consistent with the current implementation, which does the same in surface coupling with edges.
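The filtering that causes these holes can be sketched roughly as follows. This is a hypothetical illustration, not the adapter's actual code: the function and variable names (`owned_triangles`, `global_to_precice_id`, etc.) are assumptions, but the logic mirrors the described behavior of keeping only cells whose vertices are all owned by the current rank.

```python
import numpy as np

def owned_triangles(cells, owned_global_ids, global_to_precice_id):
    """Keep only cells whose three vertices are all owned by this rank.

    cells: (n_cells, 3) array of global vertex indices (hypothetical input)
    owned_global_ids: set of global vertex indices owned by this rank
    global_to_precice_id: maps a global vertex index to a preCICE vertex ID
    """
    triangles = []
    for tri in cells:
        # A triangle touching a non-owned vertex is dropped entirely,
        # which produces the "holes" along partition boundaries.
        if all(int(v) in owned_global_ids for v in tri):
            triangles.append([global_to_precice_id[int(v)] for v in tri])
    return np.array(triangles, dtype=int)

# Toy example: two cells sharing an edge; vertex 3 belongs to another rank,
# so the second triangle is discarded.
cells = np.array([[0, 1, 2], [1, 2, 3]])
owned = {0, 1, 2}
id_map = {0: 10, 1: 11, 2: 12, 3: 13}
print(owned_triangles(cells, owned, id_map))  # -> [[10 11 12]]
```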

Questions:

- Currently this is always done. Should it be a parameter, or should we rely on preCICE to ignore useless connectivity? There are no checks for edges and I went for consistency first.
- There is an assertion that the mesh is made of triangles. Not ideal, but I'm open to suggestions for alternatives. How would it work with quads or in 3D?
Code for testing:

fenics-adapter-test.tar.gz

BenjaminRodenberg commented 2 years ago

Nice to see that there is some improvement in the volume coupling direction :)

What do you mean by "it seems to work in parallel" when at the same time there are holes? If the holes don't influence the result and are only a visualization artifact, I wouldn't worry too much here.

If there is a general problem with the parallel version, there are two questions I would try to answer before investing too much here:

1. Do you really need to support parallel runs here? You could also just not support parallel for this case (we also don't support parallel runs together with PointSources, for example).
2. If you need parallel runs, would it be possible to move to FEniCS-X instead? Moving your example to FEniCS-X might be a big and risky step, though, because our FEniCS-X adapter is not stable yet.

The reason I'm asking: (legacy) FEniCS is at its end of life and I've heard that FEniCS-X comes with some improvements w.r.t. parallelization. Investing a lot of time into (legacy) FEniCS for a feature that you might not even need might be a bad idea.

boris-martin commented 2 years ago

"It seems to work" was just to highlight that it worked on my test case, but maybe there are edge cases my test didn't cover; in my case it works 100% as expected. Parallelization does, however, lead to different results, as elements are missing. An analogy in 1D: if you have a split domain with a middle vertex as the interface, data mapping would do linear interpolation between vertices, except on the edge connecting to the boundary, where it would fall back on nearest-neighbor because the boundary vertex isn't part of the preCICE mesh.
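The 1D analogy can be made concrete with a toy calculation. Everything here is an assumed setup for illustration (the point positions and values are made up), not adapter code: it just contrasts linear interpolation along a present edge with the nearest-neighbor fallback used when that edge is missing from the mesh.

```python
def linear_interp(x, x0, y0, x1, y1):
    # Standard linear interpolation between two vertices of an edge.
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

def nearest_neighbour(x, points):
    # Fallback when no connectivity is available: take the value of the
    # closest registered vertex.
    return min(points, key=lambda p: abs(p[0] - x))[1]

# Two vertices at x=0 and x=1 carrying values 0 and 2.
points = [(0.0, 0.0), (1.0, 2.0)]
x = 0.75

with_edge = linear_interp(x, 0.0, 0.0, 1.0, 2.0)   # edge present
without_edge = nearest_neighbour(x, points)         # edge missing
print(with_edge, without_edge)  # -> 1.5 2.0
```

The same query point gets 1.5 with the edge available but 2.0 without it, which is the kind of discrepancy the missing boundary triangles introduce in parallel.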

So that's not ideal, but still better than pure NN. The problem lies in the fact that the FEniCS adapter only sets up points that are owned by the rank, not the non-owned shared ones. If we changed that, we could easily add the edges and then triangles, but this would require changes outside this PR. (See http://precice.org/couple-your-code-distributed-meshes.html#use-a-single-mesh-and-duplicate-copied-vertices )
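The "duplicate copied vertices" strategy from the linked docs could be sketched like this. This is a hypothetical sketch, not the adapter's implementation: the function names are made up, and the actual preCICE registration calls are omitted. The idea is that each rank registers its non-owned shared vertices in addition to the owned ones, so every local cell has all of its vertices available and no triangle needs to be dropped.

```python
def rank_vertices(owned_ids, shared_unowned_ids):
    # Register both the owned vertices and copies of the shared-but-unowned
    # ones; the duplicates carry the same coordinates on every rank.
    return sorted(owned_ids) + sorted(shared_unowned_ids)

def rank_triangles(cells, registered):
    reg = set(registered)
    # With the duplicated vertices registered, every local cell passes
    # the filter and partition boundaries stay hole-free.
    return [tri for tri in cells if all(v in reg for v in tri)]

# Same toy partition as before: vertex 3 is owned by another rank, but this
# time it is registered as a copy, so both triangles survive.
cells = [(0, 1, 2), (1, 2, 3)]
print(rank_triangles(cells, rank_vertices({0, 1, 2}, {3})))
```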

I don't personally need parallel runs; I mostly implemented this to keep things as general as possible.

IshaanDesai commented 2 years ago

Some clarification on the points raised in the discussion so far:

Currently this is always done. Should it be a parameter, or should we rely on preCICE to ignore useless connectivity? There are no checks for edges and I went for consistency first...

Always passing on connectivity information is fine for now. This would be rectified when https://github.com/precice/fenics-adapter/pull/138 is merged anyway.

There is an assertion that the mesh is made of triangles. Not ideal, but I'm open to suggestions for alternatives. How would it work with quads or in 3D?

Let's leave the implementation at 2D for now; the adapter is highly restrictive in 3D anyway. Let's document the restriction that linear cell interpolation mapping is only available in 2D. The same goes for parallelization: I don't think it is worth investing effort in making this work efficiently for parallel cases.