henry2004y / Vlasiator.jl

Data processor for Vlasiator
https://henry2004y.github.io/Vlasiator.jl/stable/
MIT License

KL-divergence for non-Maxwellianity #132

Closed. henry2004y closed this issue 1 year ago.

henry2004y commented 1 year ago

The KL-divergence seems to be a perfect fit for estimating how far a distribution departs from a Maxwellian:

$$ D_{\mathrm{KL}}(p \parallel q) = \sum_x p(x) \ln \frac{p(x)}{q(x)}, $$

where $p(x)$ is the target distribution and $q(x)$ is the reference distribution (in this case, the Maxwellian).
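
For concreteness, here is a minimal Julia sketch of this estimator on a discretized 1D velocity distribution. The function name `kl_divergence` and the toy distributions below are illustrative assumptions, not part of the package's API; the convention that cells with $p(x) = 0$ contribute zero follows the standard definition of the discrete KL-divergence.

```julia
"Discrete KL-divergence ∑ₓ p(x) ln(p(x)/q(x)); cells where p == 0 contribute 0."
function kl_divergence(p::AbstractArray, q::AbstractArray)
   s = zero(promote_type(eltype(p), eltype(q)))
   for (pᵢ, qᵢ) in zip(p, q)
      pᵢ > 0 && (s += pᵢ * log(pᵢ / qᵢ))
   end
   s
end

# Hypothetical usage: compare a slightly distorted 1D VDF against a reference
# Maxwellian on the same normalized velocity grid.
v = range(-5, 5; length=201)                  # normalized velocity grid
q = @. exp(-v^2 / 2)                          # unit Maxwellian, unnormalized
q ./= sum(q)                                  # normalize to a probability mass
p = @. exp(-(v - 0.5)^2 / 2) * (1 + 0.1v^2)   # a slightly non-Maxwellian VDF
p ./= sum(p)

@show kl_divergence(p, q)   # 0 iff p ≡ q; grows with departure from Maxwellian
```

In practice both arrays would come from the same velocity mesh, with $q$ built from the moments (density, bulk velocity, temperature) of $p$, so that the divergence measures only the shape difference.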

Surprisingly or not, taking a log here matches my earlier suspicion that the current non-Maxwellianity formula yields a biased range of values.