Chelsea486MHz opened this issue 5 years ago
The first thing that comes to mind is Poisson disc sampling. https://www.jasondavies.com/poisson-disc/
The second thing that comes to mind is splitting the space into a number of boxes that roughly corresponds to the number of particles. This would make a lattice, and most of the points on the lattice would be assigned a particle. Then, maybe random offsets are applied, kind of like how Minecraft offsets its flowers.
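A rough sketch of the lattice-plus-jitter idea (the `vec3` type, the `place_on_jittered_lattice` name and its parameters are all made up for illustration, not part of SENPAI):

```c
#include <math.h>
#include <stdlib.h>

/* Hypothetical 3D vector type, for illustration only. */
typedef struct { double x, y, z; } vec3;

/* Uniform random double in [0, 1). */
static double rand01(void)
{
    return (double)rand() / ((double)RAND_MAX + 1.0);
}

/*
 * Lattice placement with jitter: split a cubic universe of side
 * `universe_size` into enough cells to hold `copies` substrates,
 * then place one copy per cell at a randomly jittered position
 * inside that cell.
 */
void place_on_jittered_lattice(vec3 *centres, size_t copies, double universe_size)
{
    size_t cells_per_axis = (size_t)ceil(cbrt((double)copies));
    double cell = universe_size / (double)cells_per_axis;
    size_t placed = 0;

    for (size_t i = 0; i < cells_per_axis && placed < copies; ++i)
        for (size_t j = 0; j < cells_per_axis && placed < copies; ++j)
            for (size_t k = 0; k < cells_per_axis && placed < copies; ++k)
            {
                centres[placed].x = ((double)i + rand01()) * cell;
                centres[placed].y = ((double)j + rand01()) * cell;
                centres[placed].z = ((double)k + rand01()) * cell;
                ++placed;
            }
}
```

Each copy stays in its own cell, which gives roughly uniform coverage, though copies in adjacent cells can still land close to each other, so a light potential-reduction pass would likely still be needed.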
I could probably get to it over my next break, just a few weeks from now.
I actually started implementing the maths behind a possible solution.
The current potential reduction mechanism moves a single particle by a random vector.
My proposed solution would be in two steps:
1. Before inserting the substrate into the simulation universe, create a buffer universe and insert a single copy of the substrate in it. Run the current 2-stage potential reduction algorithm on it until it is sufficiently stable. Then insert the required number of substrate copies from this buffer universe into the simulation universe.
2. For each substrate copy in the simulation universe, move it to a buffer universe whose origin is the substrate's center of mass, apply a random matrix transform (either a rotation matrix with randomized axis and angle, or a translation matrix), then move it back to its previous location in the simulation universe. This allows selectively applying matrix transforms to entire substrates in the simulation universe and saves time during potential reduction by moving atoms in bulk instead of one by one.
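A minimal sketch of step 2, assuming a hypothetical `vec3` position array rather than SENPAI's actual atom structures; it collapses the "move to a buffer universe centered on the center of mass, rotate, move back" sequence into a single in-place pass using Rodrigues' rotation formula:

```c
#include <math.h>
#include <stdlib.h>

typedef struct { double x, y, z; } vec3; /* hypothetical, for illustration */

/* Uniform random double in [0, 1). */
static double rand01(void)
{
    return (double)rand() / ((double)RAND_MAX + 1.0);
}

/*
 * Rotate `atom_nb` positions in place about their center of mass,
 * around a random axis by a random angle (Rodrigues' rotation formula).
 * Unit masses are assumed when computing the center of mass.
 */
void substrate_random_rotate(vec3 *pos, size_t atom_nb)
{
    const double two_pi = 6.283185307179586;

    /* Center of mass. */
    vec3 com = {0.0, 0.0, 0.0};
    for (size_t i = 0; i < atom_nb; ++i)
    {
        com.x += pos[i].x;
        com.y += pos[i].y;
        com.z += pos[i].z;
    }
    com.x /= (double)atom_nb;
    com.y /= (double)atom_nb;
    com.z /= (double)atom_nb;

    /* Random rotation axis (normalised) and random angle. */
    vec3 k = {rand01() - 0.5, rand01() - 0.5, rand01() - 0.5};
    double norm = sqrt(k.x * k.x + k.y * k.y + k.z * k.z);
    if (norm == 0.0) { k.x = 1.0; norm = 1.0; } /* degenerate draw */
    k.x /= norm;
    k.y /= norm;
    k.z /= norm;
    double theta = rand01() * two_pi;
    double c = cos(theta);
    double s = sin(theta);

    for (size_t i = 0; i < atom_nb; ++i)
    {
        /* Shift so the center of mass is the origin. */
        vec3 v = {pos[i].x - com.x, pos[i].y - com.y, pos[i].z - com.z};

        /* v' = v cos(t) + (k x v) sin(t) + k (k . v)(1 - cos(t)) */
        double dot = k.x * v.x + k.y * v.y + k.z * v.z;
        vec3 cross = {k.y * v.z - k.z * v.y,
                      k.z * v.x - k.x * v.z,
                      k.x * v.y - k.y * v.x};

        /* Shift back to the original location afterwards. */
        pos[i].x = v.x * c + cross.x * s + k.x * dot * (1.0 - c) + com.x;
        pos[i].y = v.y * c + cross.y * s + k.y * dot * (1.0 - c) + com.y;
        pos[i].z = v.z * c + cross.z * s + k.z * dot * (1.0 - c) + com.z;
    }
}
```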
I'm about to start implementing this feature in a local testing build.
Some STP water for you thirsty gamer boys
Let's consider the following simulation, 100 water molecules at STP:
```
senpai --in examples/water.mol --out render.xyz --copy 100
```
The `--copy` flag tells SENPAI to insert 100 copies of the substrate described in `examples/water.mol` into the universe. But what is the process behind this?

**SENPAI's way of loading the simulation**

`universe_init` is responsible for initialising the universe before the simulation. It itself calls `universe_load`, which loads atoms from the input file into a special array, `universe->ref_atom`. This array has `universe->ref_atom_nb` elements. It contains each atom from the input file, along with its information. `universe_init` will then call `universe_populate`, a function that places copies of `universe->ref_atom` at random locations within the universe.

The universe, as a set of matter, is nothing but an array, `universe->atom`, of size `(args->copies)*(universe->ref_atom_nb)`. The `universe->atom` array is therefore a concatenation of several modified `universe->ref_atom` arrays. The modification I was referencing is the introduction of an offset, identical for all atoms within a given copy of the `universe->ref_atom` array.

**TL;DR:** SENPAI places copies of the loaded substrate at random locations within space.
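As a rough illustration of that populate step (not the actual `universe_populate` source; `atom_t`, `vec3` and the function signature are made up for this sketch):

```c
#include <stdlib.h>

typedef struct { double x, y, z; } vec3;                        /* hypothetical */
typedef struct { vec3 pos; /* element, charge, ... */ } atom_t; /* hypothetical */

/* Uniform random double in [0, 1). */
static double rand01(void)
{
    return (double)rand() / ((double)RAND_MAX + 1.0);
}

/*
 * Concatenate `copies` copies of the reference atoms into `atoms`
 * (an array of size copies * ref_atom_nb), shifting each copy as a
 * whole by a random offset inside a cubic universe of side
 * `universe_size`.
 */
void populate(atom_t *atoms, const atom_t *ref_atom, size_t ref_atom_nb,
              size_t copies, double universe_size)
{
    for (size_t c = 0; c < copies; ++c)
    {
        /* One random offset per copy, shared by all of its atoms. */
        vec3 offset = {rand01() * universe_size,
                       rand01() * universe_size,
                       rand01() * universe_size};

        for (size_t i = 0; i < ref_atom_nb; ++i)
        {
            atom_t *dst = &atoms[c * ref_atom_nb + i];
            *dst = ref_atom[i];
            dst->pos.x += offset.x;
            dst->pos.y += offset.y;
            dst->pos.z += offset.z;
        }
    }
}
```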
The issue here is "at random". SENPAI doesn't care if atoms are located at the same physical coordinates. As such, another step is required before starting the simulation: potential reduction.
SENPAI will apply algebraic transformations to the coordinates of each atom within space so as to lower the potential energy. This process can be tuned through `headers/config.h`.

Finding the global minimum of the potential function is not computationally feasible. As such, SENPAI has to rely on a randomization stage AND a gradient descent stage to properly reduce the potential.
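A minimal sketch of what one iteration of the gradient descent stage amounts to, assuming the caller has already computed the gradient of the total potential with respect to each atom's coordinates (this is not SENPAI's actual solver):

```c
#include <stddef.h>

typedef struct { double x, y, z; } vec3; /* hypothetical */

/*
 * One gradient descent step: move every atom a small distance against
 * the gradient of the total potential energy. `grad` must hold
 * dU/dx, dU/dy, dU/dz for each atom, computed by the caller.
 */
void potential_descent_step(vec3 *pos, const vec3 *grad, size_t atom_nb, double step)
{
    for (size_t i = 0; i < atom_nb; ++i)
    {
        pos[i].x -= step * grad[i].x;
        pos[i].y -= step * grad[i].y;
        pos[i].z -= step * grad[i].z;
    }
}
```

Iterating such steps until the potential stops improving, and restarting from a fresh randomized configuration when it gets stuck, is the general shape of the two-stage scheme described above.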
However, this method is extremely computationally intensive. So much so that I considered moving it out of SENPAI into another project that would spit out a `.mol` file that SENPAI could load without having to care about potential reduction. But for now, it shall stay here.

The issue being described here is the following: this method is inefficient, and sometimes SENPAI will spend more time reducing the potential than actually simulating the universe. This has to change. Any ideas?