@skhayrulin The issue seems to be related to the water particles... The development branch matches the ow-0.8 branch for the worm_alone_half_resolution configuration, but worm_swim_half_resolution takes about 10 times longer when run with:
time ./Release/Sibernetic -f worm_swim_half_resolution -no_g -l_to timelimit=0.001 timestep=5e-06 logstep=1000 device=ALL
@pgleeson @skhayrulin It seems like this issue starts with https://github.com/openworm/sibernetic/commit/a52006ecf5edb0ef4a2ab634d1bd730bb3ca9ef6
git checkout 5942e6924454cf04f9e7646da8973686acfb4af6
make clean && make all
time ./Release/Sibernetic -f worm -no_g -l_to timelimit=0.001 timestep=5e-06 logstep=1000 device=ALL
...
[[ Step 199 (total steps: 200, t in sim: 0.000995s) ]]
_runHashParticles: 0.093 ms
_runSort: 11.240 ms
_runSortPostPass: 0.215 ms
_runIndexx: 0.135 ms
_runIndexPostPass: 0.408 ms
_runFindNeighbors: 5.755 ms
_runPCISPH: 16.971 ms 3 iteration(s)
membraneHandling: 1.187 ms
_readBuffer: 0.443 ms
------------------------------------
_Total_step_time: 36.448 ms
------------------------------------
~ 11 sec
git checkout a52006ecf5edb0ef4a2ab634d1bd730bb3ca9ef6
make clean && make all
time ./Release/Sibernetic -f worm -no_g -l_to timelimit=0.001 timestep=5e-06 logstep=1000 device=ALL
...
[[ Step 199 (total steps: 200, t in sim: 0.000995s) ]]
_runHashParticles: 0.177 ms
_runSort: 9.620 ms
_runSortPostPass: 0.258 ms
_runIndexx: 0.152 ms
_runIndexPostPass: 0.421 ms
_runFindNeighbors: 72.064 ms
_runPCISPH: 8.100 ms 3 iteration(s)
membraneHandling: 0.413 ms
_readBuffer: 0.433 ms
------------------------------------
_Total_step_time: 91.638 ms
------------------------------------
~ 22 sec
I added the timings of the last step for both commits because they show that, while the runtime of _runFindNeighbors increased dramatically, the runtimes of _runSort and _runPCISPH actually decreased.
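In case it is useful, here is a rough sketch of how the two commits could be compared in one go; the bench helper is hypothetical, and it assumes the per-kernel timings are printed to stdout as in the logs above:

# Hypothetical helper: build the given commit and keep the final step's timing block
bench() {
    git checkout "$1" && make clean && make all
    ./Release/Sibernetic -f worm -no_g -l_to timelimit=0.001 timestep=5e-06 logstep=1000 device=ALL \
        | tail -n 15 > "timings_$1.txt"    # roughly the last [[ Step ... ]] block
}
bench 5942e6924454cf04f9e7646da8973686acfb4af6
bench a52006ecf5edb0ef4a2ab634d1bd730bb3ca9ef6
diff -y timings_5942e69*.txt timings_a52006e*.txt    # the _runFindNeighbors jump stands out side by side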
Thanks for looking into that @lungd!
@skhayrulin @a-palyanov This still seems to be an issue on the development branch. Any ideas?
Hi @pgleeson, sorry, my fault, I haven't had enough time. I'll try to look into it as soon as I can.
Thanks @skhayrulin, whenever you get a chance to look into it would be great. Everything seems fine otherwise, apart from just being slower.
> Everything seems fine otherwise, apart from just being slower.
I am not sure about that (@pgleeson, @skhayrulin). I ran some simulations using the Docker container, i.e. Sibernetic simulations (worm_crawl_half_resolution) with c302, and got the following results:
Results for each run (the attached results are not reproduced here):
- 3eb9914
- 3eb9914, oclsourcepath=src/sphFluid_crawling.cl
- development (2f2bf84)
- development (2f2bf84), oclsourcepath=src/sphFluid_crawling.cl: same output as with sphFluid.cl
The position_buffer file produced with the latest dev version is corrupted, and a manually restarted rerun of the simulation stops immediately.
$ tail position_buffer.txt
nan nan nan 1.2
nan nan nan 1.2
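For what it's worth, a quick way to see how much of the file is affected (assuming the whitespace-separated layout shown by tail above):

grep -cw nan position_buffer.txt    # number of lines containing a NaN coordinate
awk '$1=="nan"||$2=="nan"||$3=="nan"{print "first NaN at line " NR; exit}' position_buffer.txt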
That issue is probably a little bit off-topic.
@skhayrulin @a-palyanov, should sphFluid_crawling.cl only be used in combination with the config 'worm_crawling'?
Regarding performance, I guess the performance degradation comes from calculating and saving additional data (pressure)?
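One way to test that guess would be to run the same configuration with and without logging to disk and compare the wall-clock times; a sketch, assuming the simulation accepts the same parameters when -l_to is dropped:

# If the slowdown disappears without -l_to, the extra data being saved per log step is the likely cause
time ./Release/Sibernetic -f worm_crawl_half_resolution -no_g -l_to timelimit=0.001 timestep=5e-06 logstep=1000 device=ALL
time ./Release/Sibernetic -f worm_crawl_half_resolution -no_g timelimit=0.001 timestep=5e-06 logstep=1000 device=ALL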
That commit should resolve this issue and probably #146 too: 7d9abd4
There seems to be a performance degradation in the latest development version of Sibernetic. It can be tested with the Docker image or with the current development branch:
The above takes ~32 sec on my machine
This takes ~3 sec...