Closed · LukeMondy closed this issue 6 years ago
Removing the dQ2/dQ1 elements has the same result. Removing the SLCN has the same result.
Weird. I will debug this later this week... I have been having issues with parallelism since moving to Python 3. There is a Python 2 image on Docker Hub if you want to try. The error points to the callback function that is called just after the solver call, so it could be related to your "custom" solver. I would expect it not to work in serial though...
Note that the dQ2/dQ1 combination is largely untested!!!
Yeah, that's fine, I'm not using it in my main models.
I agree that the python2 version seems to work much better in parallel, both in terms of speed-up and proper output. For example, the python3 version would not output the timesteps, or any print statement in my input script, until about 80 timesteps had passed, and then it dumped them all out at once.
It looks like the python2 version hasn't been updated to the latest master version on dockerhub, since it doesn't have the slade solver, for example. Would you be able to bump it?
@LukeMondy I have seen the same problems with py3 Underworld as well: no output from print statements until the model ends and/or crashes. It's about 1.6 to 1.8 times slower than the py2 implementation of Underworld, as well as UWGeo.
I fixed the print statement yesterday. As for the slowness... I still have to look into what's going on. Have you got an example?
So @arijitlaik, you said your underworld models are slower. So it's not just UWGeo?
Might be good to flag that on the underworld repo
Yes, UW models are slow in python3, not just UWGeo. I will do that, with a uw.timing example, soon. Busy reading stuff for a while.
OK, so for the print statements: you need to explicitly flush them to screen using flush=True in the print statement itself. That's Python 3 specific. You can also run the model using python -u, which will prevent buffering of the strings.
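A minimal sketch of the two workarounds mentioned above (plain Python, nothing Underworld-specific):

```python
import sys

# Python 3 block-buffers stdout when it is piped (as it is under mpirun),
# so prints can sit in the buffer for many timesteps.
# Workaround 1: flush each print explicitly.
for step in range(3):
    print("timestep", step, "complete", flush=True)

# Workaround 2 (Python 3.7+): switch stdout to line buffering once,
# so every newline-terminated print is flushed immediately.
sys.stdout.reconfigure(line_buffering=True)
print("output is now line-buffered")
```

Alternatively, launching with `python -u` (or setting `PYTHONUNBUFFERED=1`) disables the buffering without touching the script.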
Ya, figured that out.
I am going to use the v27 and development Docker images for timing tests and put them up.
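For a quick apples-to-apples comparison between the two images, even a plain wall-clock wrapper works; here is a hedged sketch (`wall_timer` is a hypothetical helper, not part of uw.timing) that could wrap the solver call being compared:

```python
import time
from contextlib import contextmanager

@contextmanager
def wall_timer(label):
    # Minimal wall-clock timer for comparing the same model step
    # across the py2 and py3 images. Flush so the result is visible
    # immediately even under mpirun (see the buffering issue above).
    t0 = time.perf_counter()
    yield
    print("%s: %.3f s" % (label, time.perf_counter() - t0), flush=True)

# usage: wrap the call you want to time, e.g. a solve step
with wall_timer("solve"):
    sum(i * i for i in range(100000))  # stand-in for the actual solver call
```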
I was just trying to reproduce issue #65, but now I face a different issue.
I take one of these models: https://github.com/EarthByte/UW2-tests-and-benchmarks/blob/master/isostasy/Isostasy%205%20-%20weak%20centre%20with%20sediments%20-%20PressureBC.ipynb and export it to a python file.
When I run it inside the latest underworld2_geodynamics (latest or dev), with this:
python isostasy.py
it works fine. When I run it with:
mpirun -np 2 python isostasy.py # or any other number of processes
I get an error. If I change the model resolution to be higher, like 100x100 or 96x96, I see the same result.
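Given the buffering issue discussed above, it may be worth ruling that out when reproducing under MPI; a sketch, assuming the same script name as the command above:

```shell
# Disable Python's output buffering so each rank's prints appear
# immediately under mpirun, rather than all at once at exit.
mpirun -np 2 python -u isostasy.py

# Equivalent via the environment:
PYTHONUNBUFFERED=1 mpirun -np 2 python isostasy.py
```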