tumaer / lagrangebench

LagrangeBench: A Lagrangian Fluid Mechanics Benchmarking Suite
https://lagrangebench.readthedocs.io
MIT License

Inconsistent ∆t values #32

Closed iSach closed 1 month ago

iSach commented 1 month ago

Hello,

In the paper, the reported ∆t value for the 2D dam break dataset is 0.03. In the metadata.json file, we can see:

    "case": "DAM",
    "solver": "SPH",
    "density_evolution": true,
    "dim": 2,
    "dx": 0.02,
    "dt": 0.0003,
    "t_end": 12.0,

Considering that t_end is 12 and there are 401 steps, I assume the correct value is indeed 0.03. Is there a reason it is 3e-4 here? Might there be other cases of this?

Thanks in advance and best regards, Sacha

arturtoshev commented 1 month ago

Thanks for raising that point! The number of steps is not t_end/dt, but rather t_end/(dt*write_every), with write_every=100 being the temporal coarsening step, which is also included in the metadata.json files.
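
As a quick sanity check (a sketch using only the numbers quoted from the metadata.json above):

```python
# Sanity check: number of stored frames = t_end / (dt * write_every) + 1
# (+1 for the initial state). Values from the 2D dam break metadata.json.
dt = 0.0003        # SPH solver integration step
write_every = 100  # temporal coarsening factor
t_end = 12.0       # physical end time

effective_dt = dt * write_every        # ~0.03 between stored frames
n_steps = round(t_end / effective_dt)  # 400 coarsened steps
n_frames = n_steps + 1                 # 401, matching the dataset
```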

Most of the values in the metadata files are the parameters used to run the SPH solver, e.g. dt is the integration step the SPH solver did. If you want to see which parameters come from the solver and which are added during dataset generation (e.g. velocity and acceleration statistics), please have a look here: https://github.com/tumaer/lagrangebench/blob/main/data_gen/lagrangebench_data/gen_dataset.py

If you search through the repo for write_every, you will not see much (https://github.com/search?q=repo%3Atumaer%2Flagrangebench%20write_every&type=code) because from the perspective of the ML model, everything is normalized, and dt=1 (https://github.com/tumaer/lagrangebench/blob/main/lagrangebench/case_setup/case.py#L256).
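
As a concrete consequence of that normalization (just a sketch of the unit conversion, not code from the repo): if the model treats one stored frame as dt=1, then a physical acceleration maps into per-frame-displacement units via a factor of (dt * write_every)**2.

```python
# Sketch of the dt=1 unit conversion (my illustration, not repo code):
# the model sees per-frame displacements, so a physical acceleration a
# corresponds to a * dt_frame**2 in those normalized units.
dt_frame = 0.0003 * 100              # 0.03 between stored frames
g_physical = -1.0                    # gravity value from the configs
g_model = g_physical * dt_frame**2   # gravity in per-frame units
```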

I hope this helps. Let me know if you still have questions!

Best, Artur

iSach commented 1 month ago

Hello Artur,

Thanks for this detailed answer, it makes much more sense to me now; I had not noticed write_every, my bad! :)

I am currently trying to benchmark a number of models on this 2D dam dataset, including the DMCF model (https://github.com/tum-pbs/DMCF). It requires the dt value for integrating external forces such as gravity. When training a model with dt=0.03 and gravity=-1.0, as found in these config files, the model barely learns anything and does not converge at all, which I found quite surprising given its performance on other datasets. The velocities in my case have been correctly rescaled using dt when converting the dataset.
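
For context, the rescaling I mean is essentially the following (a sketch in NumPy; the function and variable names are mine, not from LagrangeBench or DMCF):

```python
import numpy as np

# Sketch of the velocity rescaling during dataset conversion (names are
# mine). positions: (T, N, dim) array of stored frames, dt between frames.
dt_frame = 0.0003 * 100  # solver dt * write_every = 0.03 between frames

def frame_velocities(positions: np.ndarray, dt: float) -> np.ndarray:
    """Finite-difference velocities between consecutive stored frames."""
    return (positions[1:] - positions[:-1]) / dt

# Toy example: one particle free-falling under gravity g = -1.0 in y.
g = -1.0
t = np.arange(5) * dt_frame
positions = np.stack([np.zeros_like(t), 0.5 * g * t**2], axis=-1)[:, None, :]
vel = frame_velocities(positions, dt_frame)  # shape (4, 1, 2)
```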

Do you perhaps know if somebody has done this before with DMCF, or maybe you have some experience with it?

Thanks again for your replies and time, Sacha

arturtoshev commented 1 month ago

Benchmarking DMCF is on my longer-term to-do list, but I haven't tried it yet. By the way, as far as I'm aware, SFBC (https://arxiv.org/abs/2403.16680) is the successor of DMCF, and on top of that, SFBC should be easier to implement (https://github.com/tum-pbs/SFBC).

Regarding proper benchmarking, there are three things off the top of my head that I could imagine going wrong:

If you want to have a Zoom call in the next days, just drop me an email at artur.toshev@tum.de.

Best, Artur.

iSach commented 1 month ago

Thanks a lot for this detailed answer and the insights, I will investigate further. I am also thinking about benchmarking SFBC, but it is less trivial to make the datasets I'm using (basically just a list of positions) work with its code.

I will contact you if I still require help, thanks a lot!!

Cheers, Sacha