gandalfcode / gandalf

GANDALF (Graphical Astrophysics code for N-body Dynamics And Lagrangian Fluids)
GNU General Public License v2.0

Mpi reduced #60

Closed by giovanni-rosotti 8 years ago

giovanni-rosotti commented 8 years ago

Summary: the exporting and importing of particles were streamlined and switched to non-blocking communications. Communication of the pruned trees is now also done in a non-blocking fashion. While testing, I noticed that the hydro with MPI is actually not correct; you can see artefacts at the edge of the domain. As this also affects the current master branch, I think we should go ahead with this merge, and I'll investigate the issue separately.
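
A minimal sketch of the non-blocking exchange pattern described above, assuming fixed-size buffers; the names (`ExchangeParticles`, `sendbuf`, `recvbuf`) are illustrative, not GANDALF's actual API:

```cpp
#include <mpi.h>
#include <vector>

// Post every receive and send up-front, then wait on all of them at once,
// instead of blocking on each peer in turn.
void ExchangeParticles(int rank, int Nproc,
                       std::vector<std::vector<char>>& sendbuf,
                       std::vector<std::vector<char>>& recvbuf) {
  std::vector<MPI_Request> requests;

  for (int p = 0; p < Nproc; ++p) {
    if (p == rank) continue;
    MPI_Request req;
    MPI_Irecv(recvbuf[p].data(), static_cast<int>(recvbuf[p].size()),
              MPI_BYTE, p, 0, MPI_COMM_WORLD, &req);
    requests.push_back(req);
  }
  for (int p = 0; p < Nproc; ++p) {
    if (p == rank) continue;
    MPI_Request req;
    MPI_Isend(sendbuf[p].data(), static_cast<int>(sendbuf[p].size()),
              MPI_BYTE, p, 0, MPI_COMM_WORLD, &req);
    requests.push_back(req);
  }

  // Local work can overlap with communication here.

  MPI_Waitall(static_cast<int>(requests.size()), requests.data(),
              MPI_STATUSES_IGNORE);
}
```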

rbooth200 commented 8 years ago

Was the hydro with MPI broken before? I would be against pushing something to master that actively breaks something unless there is a good reason.

giovanni-rosotti commented 8 years ago

Yes, as I was saying, I tested with the current master and it's broken there too. That's why I think we should go ahead anyway and open a new branch to understand what is going on, as the issue is not due to the modifications made in this branch...

rbooth200 commented 8 years ago

Sounds fine to me then.

dhubber commented 8 years ago

Sure, if it's also broken in the master branch and not introducing a new bug, then merging sounds fine!


giovanni-rosotti commented 8 years ago

OK, wonderful. But hold off on merging, as I've broken the brute force while making these updates; I'll fix it now.

giovanni-rosotti commented 8 years ago

OK guys, I think it's now ready to be merged, if you don't see any problems with it.

giovanni-rosotti commented 8 years ago

David, please take a look at this last commit; there was a stupid problem with the load balancing that was leading to a crash. Do you see any problem with it? (I just turned > into >=.)
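
Purely for illustration, the kind of comparison such a one-character change affects; the names and the direction of the split here are hypothetical, not the actual GANDALF code:

```cpp
// Hypothetical illustration of the one-character fix mentioned above.
// Changing ">" to ">=" moves the boundary case (r_particle == r_divide)
// onto the other side of the split.
bool BelongsToUpperDomain(double r_particle, double r_divide) {
  return r_particle >= r_divide;   // previously: r_particle > r_divide
}
```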

giovanni-rosotti commented 8 years ago

This last commit fixes the time-dependent artificial viscosity, which was the cause of the artefacts I described earlier. While testing I also realised the brute force might crash with MPI, as FindLoadBalancingDivision always returns 0 (which might be outside the domain). The easy fix would be to just return the old value; but as Richard is working on making the brute force a tree as well, there's no point in doing it.
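
A sketch of that "easy fix" (ultimately not worth applying, since the brute force is being turned into a tree); the signature and names are assumptions, not GANDALF's actual interface:

```cpp
// Return the previous division instead of a hard-coded 0, which keeps the
// split inside the domain. Signature and names are assumptions.
double FindLoadBalancingDivision(double r_old) {
  return r_old;   // instead of "return 0;", which can lie outside the domain
}
```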

dhubber commented 8 years ago

Giovanni, as the comment by that line says, that if statement was a fix to prevent crazy particle movements from leaving the old division position outside the new bounding box. If you remove the if statement (i.e. always execute the line inside it, which computes r_divide as the average of the min and max of the bounding box), then that should remove any potential problem. And if there is still a crash, then its cause is obviously somewhere else.
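
A minimal sketch of that suggestion, with illustrative names and the surrounding context omitted:

```cpp
// Unconditionally recentre the division on the new bounding box, rather
// than guarding the update with an if statement that large particle
// movements can defeat. Names are illustrative.
double RecomputeDivision(double bbmin, double bbmax) {
  return 0.5*(bbmin + bbmax);   // average of bounding-box min and max
}
```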

rbooth200 commented 8 years ago

I'm waiting for a fix to the MPI code that allows tree levels of unknown size. Once we have that, we should merge this.

giovanni-rosotti commented 8 years ago

Good - but it might not happen before the ALMA deadline :)

giovanni-rosotti commented 8 years ago

I think I've fixed it. I artificially tested it by making GetMaxCellNumber return -1 for the KDTree.
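
For context, one standard MPI idiom for receiving a message whose size is not known in advance (such as a pruned-tree level when GetMaxCellNumber cannot give a bound) is to probe first, then size the buffer. This is a generic pattern, not necessarily the exact fix in this commit:

```cpp
#include <mpi.h>
#include <vector>

// Block until a matching message arrives, query its size in bytes, then
// allocate exactly that much and receive it.
std::vector<char> ReceiveUnknownSize(int source, int tag) {
  MPI_Status status;
  MPI_Probe(source, tag, MPI_COMM_WORLD, &status);

  int nbytes;
  MPI_Get_count(&status, MPI_BYTE, &nbytes);

  std::vector<char> buffer(nbytes);
  MPI_Recv(buffer.data(), nbytes, MPI_BYTE, source, tag,
           MPI_COMM_WORLD, MPI_STATUS_IGNORE);
  return buffer;
}
```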