ComputationalRadiationPhysics / picongpu

Performance-Portable Particle-in-Cell Simulations for the Exascale Era :sparkles:
https://picongpu.readthedocs.io

[Feature Request] Add initialization option with constant particle weighting #3213

Open BrianMarre opened 4 years ago

BrianMarre commented 4 years ago

Short version: Implementing this would be the first-order solution I require for collisional excitation/ionization in my topic-atomicPhysics pull request.

Longer version: I am currently implementing the change of atomic state due to ion-electron collisions at the super-cell level. The basic idea is to change the single atomic state stored in an ion macro-particle due to collisions with the electron velocity distribution. If the velocity distribution is not isotropic, the collision rate R of an ion depends on the relative velocity between this ion and electrons of a given energy E.

R_col = |v_rel(E_electron)| * sigma(E_electron) * n_e(E_electron)

where sigma(E_electron) is the cross section and n_e(E_electron) the electron density in the energy bin around E_electron.
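For illustration, here is a minimal standalone sketch of evaluating this rate for one electron energy bin. The cross-section model, constants, and function names are placeholders I made up, not PIConGPU's atomic-physics API:

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical cross section sigma(E) in m^2 for electron energy E in J;
// in a real implementation this would come from atomic data tables.
double crossSection(double electronEnergy)
{
    return 1.0e-20 * std::exp(-electronEnergy / 1.0e-16); // placeholder model
}

// Collision rate contribution of one electron energy bin:
// R_col = |v_rel(E)| * sigma(E) * n_e(E)
double collisionRate(double relativeSpeed, double electronEnergy, double electronDensity)
{
    return relativeSpeed * crossSection(electronEnergy) * electronDensity;
}

int main()
{
    double const eV = 1.602176634e-19;        // J
    double const electronMass = 9.1093837015e-31; // kg
    double const energy = 1.0e3 * eV;         // 1 keV electron bin

    // relative speed, here assuming the ion is at rest (non-relativistic)
    double const speed = std::sqrt(2.0 * energy / electronMass);

    std::printf("R_col = %e 1/s\n", collisionRate(speed, energy, 1.0e25));
    return 0;
}
```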

For ion acceleration, neither the electron nor the ion velocity distribution is isotropic, and therefore I would need the actual relative velocity to calculate the collision rate.

The naive approach would be to use a 3-dimensional electron velocity histogram, calculate the relative velocity of a given bin with respect to a given ion macro-particle's velocity, and repeat for all ion macro-particles. Unfortunately I need a good enough energy resolution, e.g. something like 256 bins per dimension, which results in 256^3 bins. The resulting histogram would be too large to fit into shared memory, and the simulation has too few macro-particles (only approx. 5150) to actually sample such a distribution.
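A quick back-of-the-envelope check of those numbers (the 48 KiB figure is a typical per-block shared-memory size and is my assumption, not a PIConGPU constant):

```cpp
#include <cstdint>
#include <cstdio>

int main()
{
    // 256 bins per velocity dimension, 3 dimensions
    std::uint64_t const binsPerDim = 256;
    std::uint64_t const bins = binsPerDim * binsPerDim * binsPerDim; // 16,777,216 bins
    std::uint64_t const bytesPerBin = sizeof(float);                 // one weight per bin
    std::uint64_t const histogramBytes = bins * bytesPerBin;         // 64 MiB

    std::printf("histogram size: %llu bytes (~%.0f MiB)\n",
                (unsigned long long)histogramBytes,
                histogramBytes / (1024.0 * 1024.0));
    std::printf("vs. typical shared memory: 48 KiB\n");

    // with only ~5150 macro-particles per super cell, the average occupation
    // per bin is ~3e-4, far too sparse to sample the distribution
    double const macroParticles = 5150.0;
    std::printf("average macro-particles per bin: %e\n", macroParticles / bins);
    return 0;
}
```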

I want to avoid this by using a random binary pairing of electron and ion macro-particles. To conserve energy, the kinetic energy of the colliding electron must be reduced or increased by the energy consumed/released by the atomic state change, if a collision happens. If both particles have the same weighting, this is simply a matter of rescaling the electron macro-particle's velocity. If the weightings are not equal, I could either (a) rescale the velocity as if the weights were equal, which breaks energy conservation because the energy taken from the electron macro-particle is higher or lower than the atomic state energy, or (b) rescale the macro-particle's velocity differently depending on the weighting ratio of ion and electron macro-particle, which sacrifices the correct change of velocity, and thereby the PIC dynamics, by averaging over two different electron populations.
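For the equal-weighting case, here is a non-relativistic sketch of how the electron speed could be rescaled so that its kinetic energy changes by exactly the transition energy. The names and the purely classical treatment are my assumptions, not an existing PIConGPU routine:

```cpp
#include <cmath>
#include <cstdio>
#include <optional>

// Rescale an electron speed so that its kinetic energy changes by deltaE
// (deltaE < 0: energy spent on excitation/ionization, deltaE > 0: released).
// Returns no value if the electron cannot pay the energy cost.
std::optional<double> rescaleSpeed(double speed, double mass, double deltaE)
{
    double const eKinOld = 0.5 * mass * speed * speed;
    double const eKinNew = eKinOld + deltaE;
    if (eKinNew < 0.0)
        return std::nullopt; // collision energetically forbidden
    return std::sqrt(2.0 * eKinNew / mass);
}

int main()
{
    double const eV = 1.602176634e-19;    // J
    double const me = 9.1093837015e-31;   // kg
    double const v0 = std::sqrt(2.0 * 100.0 * eV / me); // 100 eV electron

    // excitation costing 10 eV; this only conserves physical energy if the
    // electron and ion macro-particle weightings are equal
    if (auto v1 = rescaleSpeed(v0, me, -10.0 * eV))
        std::printf("new electron speed: %e m/s\n", *v1);
    return 0;
}
```

The direction of the velocity would still have to be handled separately (scattering angle); the sketch only covers the magnitude.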

Other possible solutions might be:

BrianMarre commented 4 years ago

@n01r and @ax3l, do you have any further ideas how this might be solved? @psychocoderHPC, are there any further considerations I should be aware of?

n01r commented 4 years ago

First offline discussion result: start with the implementation for an isotropic velocity distribution, since the ion velocity will at first be negligible compared to the electron velocity. That is the widely accepted method against which you can compare later, when you move on to the next, more complex treatment.

HighIander commented 4 years ago

A solution to the problem of "the simulation having too few macro-particles to actually sample the distribution enough, only approx. 5150 macro-particles" would be to use a dynamic lookup table from an index i to an energy bin j and only use those j that are sufficiently occupied. Your histogram then only runs over i, whose number is much smaller than the number of all j = 256^3 in your example. In most situations that would also solve the problem of "the resulting histogram being too large to fit into shared memory", but one would have to be careful in those cases where it doesn't.
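Ignoring the GPU/shared-memory specifics, the dynamic-lookup idea could be sketched on the host side as a sparse histogram that only keeps occupied bins. All names and the occupation threshold below are illustrative assumptions:

```cpp
#include <array>
#include <cstdint>
#include <cstdio>
#include <unordered_map>
#include <utility>
#include <vector>

// Map a 3D velocity into a flattened bin index j in [0, 256^3).
std::uint32_t binIndex(std::array<double, 3> const& v, double vMax, std::uint32_t binsPerDim = 256)
{
    std::uint32_t j = 0;
    for (double c : v)
    {
        double norm = (c + vMax) / (2.0 * vMax); // map [-vMax, vMax] -> [0, 1]
        if (norm < 0.0) norm = 0.0;
        if (norm > 1.0) norm = 1.0;
        auto b = static_cast<std::uint32_t>(norm * binsPerDim);
        if (b >= binsPerDim) b = binsPerDim - 1;
        j = j * binsPerDim + b;
    }
    return j;
}

int main()
{
    std::vector<std::array<double, 3>> electronVelocities = {
        {1.0e7, 0.0, 0.0}, {1.1e7, 0.0, 0.0}, {-2.0e7, 5.0e6, 0.0}};

    // dynamic lookup: only bins j that actually receive weight are stored
    std::unordered_map<std::uint32_t, double> weightPerBin;
    for (auto const& v : electronVelocities)
        weightPerBin[binIndex(v, 3.0e7)] += 1.0; // weight 1 per macro-particle here

    // keep only sufficiently occupied bins for the rate calculation
    double const minWeight = 1.0;
    std::vector<std::pair<std::uint32_t, double>> occupied;
    for (auto const& [j, w] : weightPerBin)
        if (w >= minWeight)
            occupied.push_back({j, w});

    std::printf("occupied bins: %zu of %u possible\n", occupied.size(), 256u * 256u * 256u);
    return 0;
}
```

The number of occupied bins is bounded by the number of macro-particles per super cell (a few thousand), so the dense loop over i stays small regardless of the nominal 256^3 bin count.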