SmileiPIC / Smilei

Particle-in-cell code for plasma simulation
https://smileipic.github.io/Smilei

maximum in bin #485

Closed ccaizergues closed 2 years ago

ccaizergues commented 2 years ago

Hi Smilei team, I would like to evaluate the maximum value within each bin of a ParticleBinning diag (x,y mesh). I tried the following user-defined functions, which I hoped would return identical values:

def p2max_A(p):
    pp = p.px**2 + p.py**2 + p.pz**2
    val = np.max(pp,axis=0)
    return np.full_like(p.weight, val/len(p.weight))
def p2max_B(p):
    pp = p.px**2 + p.py**2 + p.pz**2
    tmp = np.array(0)
    for yi in range(len(p.weight)):
        if tmp < pp[yi]:
            tmp = pp[yi]
    return np.full_like(p.weight, tmp/len(p.weight))

Only p2max_B seems to return a correct value. The p2max_A function leads to the error message reported below. Have you already successfully used the np.max or np.amax functions for ParticleBinning diags? (Other numpy functions seem OK.)

[irene4894:239824:0:239824] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x10)

/usr/include/c++/4.8.5/bits/stl_vector.h: [ std::vector<tagPyArrayObject*, std::allocator<tagPyArrayObject*> >::_M_erase_at_end() ]
      ...

     1350       {
     1351   std::_Destroy(__pos, this->_M_impl._M_finish, _M_get_Tp_allocator());
     1352   this->_M_impl._M_finish = __pos;
==>  1353       }
     1354 
     1355 #if __cplusplus >= 201103L
     1356     private:
mccoys commented 2 years ago

The error is not sufficient to understand the issue. Are you sure the returned value is always an array? What happens if there are 0 particles in a patch?

ccaizergues commented 2 years ago

That was the point: the max functions do not handle an empty array. With a test condition, it is now OK.
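For reference, such a guard can look like the following — a minimal sketch assuming the same particle structure `p` as in the functions above; returning zeros for an empty patch is one arbitrary choice for the empty case:

```python
import numpy as np

def p2max_A_guarded(p):
    # squared momentum of each particle in the patch
    pp = p.px**2 + p.py**2 + p.pz**2
    # np.max raises on an empty array (the likely crash above):
    # return an empty/zero contribution when the patch has no particles
    if len(p.weight) == 0:
        return np.zeros_like(p.weight)
    val = np.max(pp)
    return np.full_like(p.weight, val / len(p.weight))
```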

I am trying to compare this p2max in a bin to the average energy per particle in a bin (same x,y mesh for all diags, and only using ParticleBinning diags). For a given species A, the two compared maps are:

1) 'weight_ekin' / 'weight'
2) 'p2max' * m_A / 2 / diag._bsize
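In numpy terms, the comparison amounts to the following — a hedged sketch where `weight_ekin`, `weight` and `p2max` stand in for the arrays returned by the three ParticleBinning diags (hypothetical names), and `m_A`, `bsize` are scalars:

```python
import numpy as np

def energy_maps(weight_ekin, weight, p2max, m_A, bsize):
    avg_map = weight_ekin / weight         # 1) average kinetic energy per particle
    max_map = p2max * m_A / 2.0 / bsize    # 2) p2max converted to a kinetic energy
    return avg_map, max_map
```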

The p2max here is in fact the max of gamma²·v² (this is specified in the documentation, but only in the TrackParticles diagnostics section). I would always expect 2) >= 1) for non-relativistic particles, yet where there are sharp density gradients I get some points with 1) >> 2) (for protons with E ~ 10 MeV).

Is this expected behavior, or is this comparison invalid?

mccoys commented 2 years ago

Maybe because px, py and pz are not exactly momenta in this situation, but gamma × v?

ccaizergues commented 2 years ago

I do not see where the problem with gamma × v could be, as simply multiplying by the scalar mass gives back p. One possible problem comes from the calculation of the kinetic energy from p. Yet the protons here are non-relativistic, so p²/2m is already a very good approximation (and the next contribution in the Taylor expansion is negative, so it would not help).
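A quick numerical check of that claim (normalized units c = m = 1, hypothetical momentum values): the exact kinetic energy gamma - 1 sits slightly below p²/2, since the next Taylor term, -p⁴/8, is negative:

```python
import numpy as np

# momenta p/mc for mildly relativistic protons (~10 MeV is p/mc ~ 0.15)
p = np.array([0.05, 0.10, 0.15])
gamma = np.sqrt(1.0 + p**2)
exact = gamma - 1.0          # exact kinetic energy
approx = p**2 / 2.0          # non-relativistic approximation
rel_err = (approx - exact) / exact
```

rel_err stays positive and below one percent at these momenta, so the p²/2m approximation cannot explain points where 1) exceeds 2).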

Here is what I get along x for two different values of y in the laser-plasma interaction region (the issue is that the average kinetic energy 1) exceeds the max value 2)):

[attached plot: profiles along x at two values of y]

mccoys commented 2 years ago

I understand. Why do you divide by bsize? If I remember correctly, happi divides by bsize, so you should multiply to cancel that operation.

ccaizergues commented 2 years ago

If I am correct, I previously noted that the bsize parameter is not the value used for normalization but its inverse (hence I divide instead). In this case bsize = 0.037995 (rounded to 0.038 in the graphs).

mccoys commented 2 years ago

Ah yes this is correct.

Maybe the problem is that the result of the computation is summed over all patches? You must make sure that the quantity evaluated in deposited_quantity is additive. That does not seem to be the case here.

ccaizergues commented 2 years ago

Now I think I understand the problem.

I really thought that the particles structure contained all the particles belonging to the bin being processed (but indeed that is not what is written in the doc). So here the max is computed within each patch, and then the partial results are summed over the patches belonging to a bin, which cannot yield the maximum in the bin.

Even for additive functions, I do not clearly understand how this is performed, since a patch can contribute to several bins (especially for non-spatial bin meshes).

max is not additive, so I do not see how to make this work, except by choosing bins and patches of the same size (increasing the patch size usually slows down my simulations).

How complicated/inefficient would it be to group the particles structures by bin?

mccoys commented 2 years ago

The particles are never grouped by bin. Instead, we go through the particles one by one: for each particle, we find the corresponding bin and add its contribution to it.

We chose addition in all situations because it seemed the most appropriate reduction for most applications (reduction function = SUM). You are basically asking for a reduction function = MAX. This is not impossible, but it would require some work.
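The scheme described above can be sketched as follows — a hypothetical 1D illustration of SUM deposition, not Smilei's actual implementation:

```python
import numpy as np

def deposit_sum(x, quantity, edges):
    """Per-particle deposition with reduction function = SUM."""
    bins = np.digitize(x, edges) - 1      # bin index of each particle
    out = np.zeros(len(edges) - 1)
    np.add.at(out, bins, quantity)        # each particle ADDS its contribution
    return out

# A MAX reduction would instead need something like np.maximum.at on an
# array initialized to -inf; summing per-patch maxima is not equivalent.
```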

Instead, could you work with TrackParticles?

ccaizergues commented 2 years ago

OK, I should try TrackParticles, yet it is a more costly solution and not as quickly usable as binning diags.

Also, I will test whether the bin = patch solution can do it.

ccaizergues commented 2 years ago

I tried the bin = patch solution, and it seems to work well. Comparing the maximal energy in a patch (E2) with the average energy (E1 = 'weight_ekin' / 'weight'), the two maps look globally similar, and the relative difference (E2-E1)/E2 (see below) confirms that we always have E2 > E1 (at all times), as expected, with more difference inside the solid target than outside (heated target vs accelerated particles), which is also the expected behavior.

[attached map: relative difference (E2-E1)/E2]