DenSto / APC_524_Project

APC 524 project

MPI on a cube #2

Closed: DenSto closed this 7 years ago

DenSto commented 7 years ago

So the issue Ian raised about the limited timestep we can use got me thinking... Right now we plan on domain decomposition in just one direction. As I code up the particle boundaries, I'm realizing that if one can decompose in one direction, it's a trivial extension to do it in every dimension, at least for the particles. I haven't given much thought to the fields, though. For anybody who's worked on the grid, what do you think? Has anybody been working on the grid boundary conditions?
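For reference, on the particle side most of the 3D bookkeeping falls out of MPI's Cartesian topology support. A minimal setup sketch (the periodic flags and the process-grid shape here are illustrative assumptions, not settled choices for our code):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int nprocs;
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    /* Let MPI factor nprocs into a balanced 3D process grid;
     * a 1D decomposition is just the special case {nprocs, 1, 1}. */
    int dims[3] = {0, 0, 0};
    MPI_Dims_create(nprocs, 3, dims);

    /* Assume periodic boundaries in all three directions for now. */
    int periods[3] = {1, 1, 1};
    MPI_Comm cart;
    MPI_Cart_create(MPI_COMM_WORLD, 3, dims, periods, 1, &cart);

    /* Each rank gets its coordinates and its six face neighbors. */
    int rank, coords[3], nbr_lo[3], nbr_hi[3];
    MPI_Comm_rank(cart, &rank);
    MPI_Cart_coords(cart, rank, 3, coords);
    for (int d = 0; d < 3; d++)
        MPI_Cart_shift(cart, d, 1, &nbr_lo[d], &nbr_hi[d]);

    printf("rank %d at (%d, %d, %d)\n", rank, coords[0], coords[1], coords[2]);

    MPI_Finalize();
    return 0;
}
```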

jlestz commented 7 years ago

We considered this in one of the early meetings. I think our conclusion was that while the decomposition itself is not difficult, it opens up the possibility for many new errors and will make debugging much more difficult than a 1D decomposition. Just as an example, particles could cross multiple MPI domains in a single time step, which requires more careful handling than crossing a single domain.

My vote is to have 3D decomposition be one of the prioritized stretch goals for beta. But we should get all features working and validate some physics cases with 1D decomposition before tackling this.

DenSto commented 7 years ago

Actually, I thought about it, and particles crossing multiple processes is not a problem whatsoever; it's trivial to address (see the sketch below).
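Roughly: only ever hand a particle one hop to a face neighbor, and repeat exchange rounds until a global reduction says no rank moved anything. A particle that crossed several domains just takes several hops. A minimal sketch (the particle struct, fixed-capacity buffers, and function names are placeholders, not our actual data structures; overflow checks omitted for brevity):

```c
#include <mpi.h>

#define MAXP 4096                      /* hypothetical per-rank capacity */
typedef struct { double x[3], v[3]; } particle;

/* One exchange round: along each axis d, particles outside the local
 * box [lo[d], hi[d]) are handed one hop to the face neighbor on that
 * side.  Returns 1 if this rank sent anything.  nbr_lo/nbr_hi come
 * from MPI_Cart_shift, as in the setup sketch above. */
static int exchange_round(MPI_Comm cart, const double lo[3],
                          const double hi[3], const int nbr_lo[3],
                          const int nbr_hi[3], particle *p, int *np) {
    static particle sbuf[2][MAXP];     /* [0] = down, [1] = up */
    int sent = 0;

    MPI_Datatype ptype;                /* ship the struct as raw bytes */
    MPI_Type_contiguous(sizeof(particle), MPI_BYTE, &ptype);
    MPI_Type_commit(&ptype);

    for (int d = 0; d < 3; d++) {
        int ns[2] = {0, 0}, nk = 0;
        for (int i = 0; i < *np; i++) {
            if      (p[i].x[d] <  lo[d]) sbuf[0][ns[0]++] = p[i];
            else if (p[i].x[d] >= hi[d]) sbuf[1][ns[1]++] = p[i];
            else                         p[nk++]          = p[i];
        }
        *np = nk;
        sent |= (ns[0] + ns[1] > 0);

        const int nbr[2] = {nbr_lo[d], nbr_hi[d]};
        for (int s = 0; s < 2; s++) {  /* everyone sends down, then up */
            int nr = 0;
            MPI_Sendrecv(&ns[s], 1, MPI_INT, nbr[s], 0,
                         &nr,    1, MPI_INT, nbr[1 - s], 0,
                         cart, MPI_STATUS_IGNORE);
            MPI_Sendrecv(sbuf[s], ns[s], ptype, nbr[s], 1,
                         p + *np, MAXP - *np, ptype, nbr[1 - s], 1,
                         cart, MPI_STATUS_IGNORE);
            *np += nr;                 /* append arrivals */
        }
    }
    MPI_Type_free(&ptype);
    return sent;
}

/* Repeat rounds until no rank anywhere moved a particle. */
void migrate(MPI_Comm cart, const double lo[3], const double hi[3],
             const int nbr_lo[3], const int nbr_hi[3],
             particle *p, int *np) {
    int moved;
    do {
        int local = exchange_round(cart, lo, hi, nbr_lo, nbr_hi, p, np);
        MPI_Allreduce(&local, &moved, 1, MPI_INT, MPI_LOR, cart);
    } while (moved);
}
```

In a 1D decomposition this reduces to the usual single exchange along x, and the number of rounds is bounded by the most domains any one particle crossed, which stays small unless the timestep is enormous.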

If the grids can handle this easily, I don't see why we can't implement it this way. I think it'll be easier to do the general case now than to do the specific case first and try to generalize later.
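For completeness, on the grid side the 3D generalization of the ghost-cell exchange also goes one axis at a time. A rough sketch assuming one ghost layer per side, a single C-ordered scalar field, and the Cartesian communicator from the setup above (none of these names are our actual grid API):

```c
#include <mpi.h>

/* Minimal 3D halo exchange for a scalar field with one ghost layer,
 * stored C-ordered as f[(nx+2)*(ny+2)*(nz+2)].  nbr_lo/nbr_hi are
 * the face neighbors from MPI_Cart_shift. */
void halo_exchange(MPI_Comm cart, double *f, int nx, int ny, int nz,
                   const int nbr_lo[3], const int nbr_hi[3]) {
    int sizes[3]  = {nx + 2, ny + 2, nz + 2};
    int stride[3] = {sizes[1] * sizes[2], sizes[2], 1};

    for (int d = 0; d < 3; d++) {
        /* Datatype for one interior-sized plane perpendicular to d,
         * anchored at index 0 along d; shifting the base pointer by
         * m * stride[d] addresses plane m. */
        int subsizes[3] = {nx, ny, nz}, starts[3] = {1, 1, 1};
        subsizes[d] = 1;
        starts[d]   = 0;

        MPI_Datatype plane;
        MPI_Type_create_subarray(3, sizes, subsizes, starts,
                                 MPI_ORDER_C, MPI_DOUBLE, &plane);
        MPI_Type_commit(&plane);

        int s = stride[d], n = sizes[d];
        /* Send lowest interior plane down, receive into upper ghost. */
        MPI_Sendrecv(f + 1 * s,       1, plane, nbr_lo[d], 0,
                     f + (n - 1) * s, 1, plane, nbr_hi[d], 0,
                     cart, MPI_STATUS_IGNORE);
        /* Send highest interior plane up, receive into lower ghost. */
        MPI_Sendrecv(f + (n - 2) * s, 1, plane, nbr_hi[d], 1,
                     f + 0 * s,       1, plane, nbr_lo[d], 1,
                     cart, MPI_STATUS_IGNORE);

        MPI_Type_free(&plane);
    }
}
```

So the field side is two messages per axis regardless of dimensionality; the only real change from the 1D version is looping over three axes instead of one.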