Closed: davidscn closed this 8 months ago
Thanks for the PR! While it goes in the right direction, I still get the following (different) output from flow-over-heated-plate-partitioned-flow/fluid1-openfoam:
---[preciceAdapter] preCICE was configured and initialized
---[precice] ERROR: Input/Output sizes are inconsistent attempting to read 1D data "Pressure" from mesh "Fluid1-Fluid-Mesh". You passed 41 vertices and 82 data components, but we expected 41 data components (1 x 41).
Similarly for partitioned-backwards-facing-step/fluid1-openfoam:
---[preciceAdapter] preCICE was configured and initialized
---[precice] ERROR: Input/Output sizes are inconsistent attempting to read 1D data "Pressure" from mesh "Fluid1-Mesh". You passed 32 vertices and 96 data components, but we expected 32 data components (1 x 32).
For the partitioned-pipe/fluid1-openfoam-pimplefoam, I still get an assertion failure:
---[precice] Computing "nearest-neighbor" mapping from mesh "Fluid2-Mesh" to mesh "Fluid1-Mesh" in "read" direction.
---[precice] Mapping distance min:0 max:7.17448e-15 avg: 2.20885e-15 var: 4.2847e-30 cnt: 500
---[precice] Mapping "Pressure" for t=0 from "Fluid2-Mesh" to "Fluid1-Mesh"
---[precice] Mapping "Pressure" for t=1 from "Fluid2-Mesh" to "Fluid1-Mesh"
ASSERTION FAILED
Location: void precice::impl::DataContext::mapData()
File: /home/gc/repos/precice/precice/src/precice/impl/DataContext.cpp:95
Expression: context.fromData->stamples().size() > 0
Rank: 0
Arguments: none
Had to adjust the buffer reading as well, as reading and writing share the same buffer.
> For the partitioned-pipe/fluid1-openfoam-pimplefoam, I still get an assertion failure:

I get the same, but I don't see how the changes here touch anything in the logical order of reading and writing. It seems to be related to @BenjaminRodenberg's work, but I cannot yet confirm that it solves the issue.
The exception above is only triggered for the partitioned-pipe. You could try one of the other examples. I wouldn't wait here for changes in precice/precice.
Indeed, the partitioned flow over heated plate works. But the partitioned BFS shows strange results:
Interesting. First of all, running the case with the current precice@develop and the default configuration shows no convergence within individual time steps, i.e., we perform 100 implicit iterations within each time step and then move on.
However, using <exchange ... substeps="0" /> in the configuration file for all data exchanges shows proper convergence behavior, and the result looks fine as well. To me, this seems unrelated to the changes here and should go into a dedicated issue, either here or in the tutorials repo.
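For reference, disabling substeps per exchange looks roughly like this in precice-config.xml (data and mesh names are taken from the logs above; the surrounding coupling-scheme element is illustrative, not the case's actual configuration):

```xml
<coupling-scheme:serial-implicit>
  <!-- illustrative excerpt; only the substeps attribute matters here -->
  <exchange data="Pressure" mesh="Fluid1-Mesh"
            from="Fluid2" to="Fluid1" substeps="0" />
</coupling-scheme:serial-implicit>
```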
Note that both configurations hit an assert after around 2 secs. Without substeps, I get the following:
ASSERTION FAILED
Location: void precice::cplscheme::BaseCouplingScheme::sendData(const PtrM2N&, const DataMap&, bool)
File: precice/src/cplscheme/BaseCouplingScheme.cpp:144
Expression: math::equals(stamples.back().timestamp, time::Storage::WINDOW_END)
Rank: 0
Arguments: none
For the assertions that you run into, are there already issues in preCICE? They definitely look like edge cases we have not yet covered with the waveforms. We should try to reproduce them with integration tests. Having the config in the issues could already be enough to create these tests.
> Note that both configurations hit an assert after around 2 secs. Without substeps, I get the following:
>
> ASSERTION FAILED
> Location: void precice::cplscheme::BaseCouplingScheme::sendData(const PtrM2N&, const DataMap&, bool)
> File: precice/src/cplscheme/BaseCouplingScheme.cpp:144
> Expression: math::equals(stamples.back().timestamp, time::Storage::WINDOW_END)
> Rank: 0
> Arguments: none
These are the already encountered issues https://github.com/precice/precice/issues/1776 and https://github.com/precice/precice/issues/1751.
> ASSERTION FAILED
> Location: void precice::impl::DataContext::mapData()
> File: /home/gc/repos/precice/precice/src/precice/impl/DataContext.cpp:95
> Expression: context.fromData->stamples().size() > 0
> Rank: 0
> Arguments: none
Opened an issue: https://github.com/precice/precice/issues/1803
> However, using <exchange ... substeps="0" /> in the configuration file for all data exchanges shows proper convergence behavior and the result looks fine as well ... To me, this seems unrelated to the changes here and should go into a dedicated issue either here or in the tutorials repo.
@davidscn Why not an issue in the core library? Do you suspect we are reading/writing something wrong?
> @davidscn Why not an issue in the core library? Do you suspect we are reading/writing something wrong?
I was referring to the lacking convergence, not to any assertion we hit. Once we change the substeps default back to false, everything runs fine. However, the substeps=true option doesn't work, and it could be an issue with how this case was set up or some interpolation issue in the waveform. I'm not completely sure, and the tutorials repo seems to be the proper place to discuss it.
Opened an issue for the convergence issues with substeps: https://github.com/precice/tutorials/issues/373
Be aware that quasi-Newton schemes are currently not supported if substeps="true". A corresponding error message is on the way: https://github.com/precice/precice/pull/1721.
For the FEniCS adapter, the partitioned heat equation works with underrelaxation and subcycling="true" (slow, but good for testing). See https://github.com/precice/tutorials/pull/281. Do you have an OpenFOAM case with underrelaxation that you could use for testing the new subcycling feature with your adapter?
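In case it helps, constant underrelaxation is configured in preCICE via the acceleration block of the coupling scheme; a hedged sketch (the relaxation factor 0.5 is arbitrary, and the surrounding scheme element is illustrative):

```xml
<coupling-scheme:serial-implicit>
  <!-- illustrative: constant underrelaxation instead of quasi-Newton -->
  <acceleration:constant>
    <relaxation value="0.5" />
  </acceleration:constant>
</coupling-scheme:serial-implicit>
```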
Closes #303
I now added a return type to CouplingDataUser::write() to tell writeData how many data entries are actually relevant. Adding this return type is also a nice consistency check for the CouplingDataUser::write() function, and not uncommon for such functions.

TODO list:
docs/changelog-entries/ (create directory if missing)