quokka-astro / quokka

Two-moment AMR radiation hydrodynamics (with self-gravity, particles, and chemistry) on CPUs/GPUs for astrophysics
https://quokka-astro.github.io/quokka/
MIT License

add cloud-in-cell particles (+virtuals and ghosts) #332

Open BenWibking opened 1 year ago

BenWibking commented 1 year ago

Plan to add basic particle support:

"Basic particle support" means:

BenWibking commented 1 year ago

The Nyx implementation is also very similar, with virtual and ghost particles: https://github.com/AMReX-Astro/Nyx/blob/development/Source/Particle/NyxParticles.cpp

BenWibking commented 1 year ago

Andrew Myers pointed out that the Nyx paper (https://arxiv.org/pdf/1301.4498.pdf) describes the rationale for the ghost and virtual particles:

In the multilevel algorithm, the cloud-in-cell procedure becomes more complicated. During a level ℓ level solve or a multilevel solve at levels ℓ → m, a particle at level ℓ that lives near the ℓ/(ℓ − 1) interface may "lose" part of its mass across the interface. In addition, during a level solve at level ℓ, or a multi-level solve at levels ℓ → m, it is essential to include the contribution from dark matter particles at other levels even if these particles do not lie near a coarse-fine boundary. Both of these complications are addressed by the creation and use of ghost and virtual particles.

The mass from most particles at levels ℓ′ < ℓ is accounted for in the Dirichlet boundary conditions for φℓ that are supplied from φℓ−1 at the ℓ/(ℓ − 1) interface. However, level ℓ − 1 particles that live near enough to the ℓ/(ℓ − 1) interface to deposit part of their mass at level ℓ, or have actually moved from level ℓ − 1 to level ℓ during a level ℓ − 1 timestep, are not correctly accounted for in the boundary conditions. The effect of these is included via ghost particles, which are copies at level ℓ of level ℓ − 1 particles. Although these ghost particles live and move at level ℓ, they retain the size, ∆xℓ−1, of the particles of which they are copies.

The mass from particles living at levels ℓ′ > m in a level m level solve or a level ℓ → m multilevel solve is included at level m via the creation of virtual particles. These are level m representations of particles that live at levels ℓ′ > m. Within Nyx there is the option to have one-to-one correspondence of fine level particles to level m virtual particles; the option also exists to aggregate these fine level particles into fewer, more massive virtual particles at level m. The total mass of the finer level particles is conserved in either representation. These virtual particles are created at level m at the beginning of a level m timestep and moved along with the real level m particles; this is necessary because the level ℓ′ particles will not be moved until after the second Poisson solve with top level m has occurred. The mass of the virtual particles contributes to ρdm at level m following the same cloud-in-cell procedure as followed by the level m particles, except that the virtual particles have size ∆xm+1 (even if they are copies of particles at level ℓ′ with ℓ′ > m + 1).
BenWibking commented 1 year ago

@markkrumholz Do you have any thoughts on this? For sink particles, this seems more complicated than doing the naive $O(N^2)$ force calculation instead of cloud-in-cell. But then it wouldn't generalize to dark matter or star particles.

markkrumholz commented 1 year ago

Just letting you know that I am thinking about this, but I am also currently buried under multiple PhD thesis drafts that need comments, so this is not at the top of the priority list. Will probably get to it early next week.

BenWibking commented 1 year ago

@pakshingli This is the issue we are using to keep track of particle support in Quokka. Let me know what you think about the plan above.

pakshingli commented 1 year ago

@BenWibking My understanding is that cloud-in-cell (CIC) is a pretty straightforward method: the mass of a particle is distributed to the nearby grid points according to a simple algorithm. You mentioned that there may be a complication when the particle is very close to the coarse/fine grid boundary, but there is no problem of losing mass. CIC distributes mass to the 8 grid points of the cell, so even when the particle is right at the edge of a cell that sits on the coarse/fine boundary, its mass is still distributed to grid points on the fine grid.

I haven't read the Nyx paper yet, so I don't know what the virtual and ghost particles are. Anyway, when we have a particle with mass, we can simply use CIC to distribute the mass to the grid at any level ℓ. The Poisson solver will then take care of the calculation.

In ORION2, all processors contain the entire list of sink particles. If a particle is within the grids of a processor, that processor will deal with the particle. We may need a better approach that avoids keeping the entire list of sink particles on every processor. I will see if the "virtual" or "ghost" particles can do something about this. Anyway, solving the Poisson equation for particles using the CIC method shouldn't be a big problem.

markkrumholz commented 1 year ago

OK, I have now had a chance to think some about this, and also about the gravity solve. This seems like a situation where we might want to proceed in stages.

A defining characteristic of sink particles is that we guarantee that some volume around them, defined by their accretion radius (which for reasons of stability is always going to be larger than their effective CIC size), is refined to the maximum level. Thus for sink particles there is no need to implement virtual particles. We do need ghost particles, but that is going to be required regardless just to handle sink accretion, because accretion zones can also extend across grid boundaries.

As @pakshingli mentioned, in ORION we implement this just by globally synchronizing the particle list, which allows us to do direct N^2 integration of particle orbits for maximum accuracy, and is no more expensive than distributing the particles in memory when the number of particles is small (<~ 1000), because in that case the cost of data transmission is zero compared to the overhead associated with an MPI call, so you might as well just do a single all-to-all and make life simple. We probably don't want to optimize for that, and also, now with more experience, I think that there is little benefit to doing the direct N^2 rather than just doing CIC (mainly because even if you do direct N^2 you are going to be limited by the fact that your hydro is effectively softened at the grid scale). So I would suggest using the particle synchronisation stuff already built, which includes ghost particles, and then doing CIC deposition.

If we then come back and want to support star / dark matter particles later, we can always implement that, since this will never be used for sinks anyway. I'll comment on the gravity solve strategy over on the other thread.

BenWibking commented 1 year ago

A defining characteristic of sink particles is that we guarantee that some volume around them defined by their accretion radius (which for reasons of stability is always going to be larger than their effective CIC size) is refined to maximum level. Thus for sink particles there is no need to implement virtual particles.

I agree that this is true for the accretion/winds, but I don't think this is true for gravity. We still want to account for the gravity due to the sink particle on coarser grids, no?

markkrumholz commented 1 year ago

I guess with a CIC method you're right -- we need a version of the particle on the coarser level so that we can properly add its mass to the coarse grid. This isn't an issue with ORION because the particles are global, but I guess it will be here, where particles have to be associated with a particular grid.

BenWibking commented 1 year ago

@pakshingli There is a new Particle API for pure SOA (struct-of-array) particles here: https://github.com/AMReX-Codes/amrex/pull/2878. It is much more performant than the old mixed particle struct format on GPU.

There is no documentation yet, but there is an example code here: https://github.com/AMReX-Codes/amrex/blob/development/Tests/Particles/SOAParticle/main.cpp

pakshingli commented 1 year ago

Thanks Ben. I will check it out later.