uncscode / particula

a simple, fast, and powerful particle simulator
https://uncscode.github.io/particula
MIT License

Time-step of a particle distribution with coagulation #135

Closed Gorkowski closed 2 years ago

Gorkowski commented 2 years ago

Adding coagulation with time steps.

Given a log-normal particle distribution: number concentration 10^9 particles/cc, mode 100 nm, geometric standard deviation 1.5.

  1. Calculate the production and loss of particles at each radius for a given delta time.
  2. Update the distribution with the given production/loss.
  3. Repeat until the specified time ends.

Checks:

  1. the total mass concentration should be conserved.
  2. smaller particles are lost to larger ones.
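Both checks can be exercised on a toy problem first. Below is a minimal sketch (not particula's implementation, and on a discrete monomer-mass grid rather than a radius grid): forward-Euler stepping of the Smoluchowski coagulation equation with an assumed constant kernel. Total mass should stay fixed while monomers are lost to larger clusters.

```python
import numpy as np

# Bin k (1-indexed) holds clusters of mass k in monomer units,
# so total mass is sum_k k * n_k and should be conserved by a step.

def coagulation_step(n, kernel, dt):
    """One Euler step: production minus loss for every mass bin."""
    # loss: any cluster colliding with any other cluster
    dndt = -kernel * n * n.sum()
    # production: pairs (i, j) merging into mass i + j;
    # np.convolve(n, n)[m] sums pairs whose total mass is m + 2
    conv = np.convolve(n, n)
    dndt[1:] += 0.5 * kernel * conv[: n.size - 1]
    return n + dt * dndt

n = np.zeros(200)
n[0] = 1.0                                    # start monodisperse: all monomers
mass0 = (np.arange(1, n.size + 1) * n).sum()
for _ in range(100):
    n = coagulation_step(n, kernel=1.0, dt=0.01)
mass = (np.arange(1, n.size + 1) * n).sum()
```

With 200 bins and this short run, the mass leaking past the largest bin is negligible, so the first check holds essentially to machine precision, and the monomer bin and total number both decrease as expected.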
ngam commented 2 years ago

Just putting my scratch notes here, lest I lose them, for conveniently constructing the lognormal dists. We may need to add scipy as a dependency, or we can just roll with numpy alone...

```python
import numpy as np
from scipy.stats import lognorm
from matplotlib import pyplot as plt

dp = np.logspace(1, 3, 100)  # diameters, 10 to 1000 nm

# pdf in dp space (dN/dDp): scipy vs. the explicit formula
plt.semilogx(dp, lognorm.pdf(x=dp, s=np.log(1.5), scale=100), '--')
plt.semilogx(dp, (1 / (np.sqrt(2 * np.pi) * np.log(1.5) * dp))
             * np.exp(-(np.log(dp) - np.log(100))**2 / (2 * np.log(1.5)**2)))

# pdf in ln(dp) space (dN/dlnDp): explicit formula vs. dp-scaled scipy pdf
plt.semilogx(dp, (1 / (np.sqrt(2 * np.pi) * np.log(1.5)))
             * np.exp(-(np.log(dp) - np.log(100))**2 / (2 * np.log(1.5)**2)))
plt.semilogx(dp, dp * lognorm.pdf(x=dp, s=np.log(1.5), scale=100), '--')
```

The above defines the lognormal distribution (pdf only) from a mode (100) and a geometric standard deviation (1.5; equivalent to a standard deviation of log(1.5) in log space). Very unfortunately, the "wrong" representation is the more common one in our field. That is, lognorm.pdf(x=dp, s=np.log(1.5), scale=100) is the probability density function in dp (i.e. dN/dDp), but you will often see the form dp * lognorm.pdf(x=dp, s=np.log(1.5), scale=100), which is the density in ln(dp) (i.e. dN/dlnDp). This has a few annoying consequences to keep in mind. The most important is to avoid switching back and forth between the two, since the integral equations will be totally messed up: the dp-space density integrates to one against dp, while the dp-scaled form integrates to one against ln(dp). dp is the main variable in all these calculations, so dividing or multiplying by it has huge consequences that should be carefully managed throughout.
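A quick numerical sketch of that normalization point: each representation integrates to one only against its own variable.

```python
import numpy as np
from scipy.stats import lognorm
from scipy.integrate import trapezoid

dp = np.logspace(0, 4, 2000)                          # wide grid, 1 nm to 10 um
pdf_dp = lognorm.pdf(x=dp, s=np.log(1.5), scale=100)  # density in dp: dN/dDp
pdf_lndp = dp * pdf_dp                                # density in ln(dp): dN/dlnDp

area_dp = trapezoid(pdf_dp, dp)             # integrate dN/dDp against dp
area_lndp = trapezoid(pdf_lndp, np.log(dp)) # integrate dN/dlnDp against ln(dp)
```

Both areas come out to one; mixing them up (integrating the dp-scaled form against dp, say) would not.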

ngam commented 2 years ago

Rough plan

  1. define class to construct a "distribution" of particles:
    1. define util to construct pure dist (pdf)
    2. define util for cutoffs
    3. define util for interps
  2. define new class for coagulation stepping
    1. feed attrs from dist class (radius, etc.) into coag util call
    2. store and tidy up for next step
    3. repeat
  3. define util/class for the time step itself (this will be the pure "solver" or whatever we wanna call it)

Many of the above are already implemented elsewhere. For example, we could just use scipy.stats.lognorm.pdf for our dist, or we could equally well define it ourselves. More interesting/challenging questions about in-house vs. ready-made usage will surface when dealing with highly customized solvers for point 3 above. For now, we can just make a minimal solver, but we should explore what others have done and borrow as needed...
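The plan could be sketched roughly like this (class and method names are hypothetical placeholders, not particula's API): a distribution class holding the radius grid and pdf with cutoff and interp utilities, and a stepper shell for the solver to drive.

```python
import numpy as np
from scipy.stats import lognorm
from scipy.integrate import trapezoid

class LognormalDistribution:
    """Plan item 1: pure dist (pdf) plus cutoff and interp utils."""

    def __init__(self, mode, gsigma, cutoff=(1e1, 1e3), points=200):
        # cutoff implements the truncation util: radii outside are dropped
        self.radius = np.logspace(
            np.log10(cutoff[0]), np.log10(cutoff[1]), points
        )
        self.pdf = lognorm.pdf(self.radius, s=np.log(gsigma), scale=mode)

    def interp(self, r):
        """Interpolate the pdf onto an arbitrary radius."""
        return np.interp(r, self.radius, self.pdf)

class CoagulationStepper:
    """Plan item 2: feed dist attrs into a kernel call, step, repeat."""

    def __init__(self, dist, kernel):
        self.dist = dist
        self.kernel = kernel  # placeholder: a callable K(r_i, r_j)

    def step(self, dt):
        """One time step; production/loss terms would go here."""
        raise NotImplementedError
```

The solver in plan item 3 would then just loop `step(dt)` until the end time, which is where the in-house vs. ready-made question comes in.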