Jahrzehnte opened this issue 1 month ago
Hello @Jahrzehnte,
sorry for the late response. About the warnings -- I would double-check the density after the first FBA step, as mentioned in point 1 here: https://github.com/VasiliBaranov/packing-generation?tab=readme-ov-file#22-program-usage-and-options-for-generation . Maybe it is below 0.4? You can try changing the FBA parameters to ensure slightly higher densities. For parameter choice, please see the final paragraph under https://github.com/VasiliBaranov/packing-generation?tab=readme-ov-file#22-program-usage-and-options-for-generation . You can also check the Jupyter notebook that another user uploaded to the repository: https://github.com/VasiliBaranov/packing-generation/blob/master/packing_generation.ipynb
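If it helps, here is a minimal sketch of that density check in Python. It assumes the generator's `packing.xyzd` output stores one particle per row as four consecutive little-endian float64 values (x, y, z, diameter), as described in the README; the box dimensions and file path are placeholders you would take from your own `generation.conf`:

```python
import numpy as np

def packing_density(xyzd: np.ndarray, box: np.ndarray) -> float:
    """Solid volume fraction of spheres; rows of xyzd are (x, y, z, diameter)."""
    d = xyzd[:, 3]
    return float(np.pi / 6.0 * np.sum(d ** 3) / np.prod(box))

# For a real run (path and layout assumed, see lead-in):
#   xyzd = np.fromfile("packing.xyzd", dtype=np.float64).reshape(-1, 4)
#   box = np.array([10.0, 10.0, 100.0])   # from generation.conf

# Illustrative stand-in: 4 unit-diameter spheres in a 4x4x4 box.
xyzd = np.array([[1.0, 1.0, 1.0, 1.0]] * 4)
density = packing_density(xyzd, np.array([4.0, 4.0, 4.0]))
print(f"density after FBA: {density:.4f}")
if density < 0.4:
    print("below 0.4 -- consider adjusting FBA parameters")
```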
For the difficulties with meshes -- yes, the distances between particles can be extremely small, especially after the LS or LSGD compression. You can understand this from the Salsburg-Wood equation of state for the reduced pressure, Z = 1 + 3 / (phi_Jamming / phi - 1). For Z = 1e12, which we achieve, diameter ~ (diameter at jamming, when all particles are in contact) * (1 - 1e-12). Here is a paper with more details about this EOS: https://pubs.aip.org/aip/adv/article/11/3/035311/992843/Beyond-Salsburg-Wood-Glass-equation-of-state-for . Maybe you can try running a simulation with zero compression rate (an MD parameter), so that the packing "relaxes" a little and the particles fly away from each other. But meshing is difficult in general, and if you want to run fluid dynamics simulations with the finite volume method, you will face problems (already during mesh creation) :-)
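To make the size of those gaps concrete, one can invert the Salsburg-Wood relation quoted above for the density ratio and convert it to a linear (diameter) scale; this is only a back-of-the-envelope sketch of that arithmetic:

```python
def relative_gap(Z: float) -> float:
    """Approximate (d_jamming - d) / d_jamming at reduced pressure Z,
    from Z = 1 + 3 / (phi_J / phi - 1):
      phi / phi_J = 1 / (1 + 3 / (Z - 1)),
    and diameters scale as the cube root of the density ratio."""
    phi_ratio = 1.0 / (1.0 + 3.0 / (Z - 1.0))
    return 1.0 - phi_ratio ** (1.0 / 3.0)

for Z in (1e3, 1e6, 1e12):
    print(f"Z = {Z:.0e}: relative gap ~ {relative_gap(Z):.2e}")
```

At Z = 1e12 this gives a relative gap on the order of 1e-12, i.e. particle surfaces separated by roughly a trillionth of a diameter, which is why mesh generators struggle.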
In the lab of the professor where I did my PhD, we used the Lattice Boltzmann Method for fluid dynamics simulations exactly for this reason (finite volumes fail already at the mesh creation step): https://en.wikipedia.org/wiki/Lattice_Boltzmann_methods . You simply discretize the spheres on a lattice, but the simulation results will depend on the resolution, so you have to run them at several resolutions and check when the results converge. We had custom C++ code that ran on supercomputers. There are some open-source implementations as well; Palabos was pretty mature some time ago, https://palabos.unige.ch/ , but maybe there are alternatives now. Some run on GPUs as well, from what I know (LBM is very easy to parallelize). This is one paper from my former lab: https://www.sciencedirect.com/science/article/abs/pii/S0021967315008948 . There are also some tricks to keep the resolution low but still get proper results: https://www.sciencedirect.com/science/article/abs/pii/S0021999114007207
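The "discretize and check convergence" idea can be sketched in a few lines. This toy example (not the lab's actual code) voxelizes a single sphere onto grids of increasing resolution and watches the discretized solid fraction approach the exact value; for a real LBM run you would do the analogous study with the quantity you care about, e.g. permeability:

```python
import numpy as np

def voxelize_sphere(center, diameter, box, n):
    """Boolean lattice of nodes (cell centers, n per unit length) inside a sphere."""
    nx, ny, nz = np.round(np.asarray(box) * n).astype(int)
    x, y, z = np.meshgrid(
        (np.arange(nx) + 0.5) / n,
        (np.arange(ny) + 0.5) / n,
        (np.arange(nz) + 0.5) / n,
        indexing="ij",
    )
    r2 = (x - center[0])**2 + (y - center[1])**2 + (z - center[2])**2
    return r2 <= (diameter / 2.0) ** 2

# Discretized solid fraction vs. resolution for a unit sphere in a 2x2x2 box;
# the exact value is (pi/6) / 8 ~ 0.0654.
for n in (4, 8, 16, 32):
    solid = voxelize_sphere((1.0, 1.0, 1.0), 1.0, (2.0, 2.0, 2.0), n)
    print(f"{n:3d} nodes/unit: solid fraction = {solid.mean():.4f}")
```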
Hope this helps!
Hello Dr. Baranov,
We have been following the complete generation steps (`-fba`, `-ls`, `-lsgd`) to create monodisperse packings of varying sizes, without specifying diameters in a `diameters.txt` file. However, when scaling up to 10,000 beads with a packing size of `10*10*100`, although we consistently receive the final `packing.nfo`, we encounter frequent similar warnings during the `-lsgd` step for all our 10,000-bead packings:

`WARNING: innerDiameterRatio incorrect. Time: 0. Actual: 1.073995606244491, expected: 1.073995606244481. Closest pair: 6599, 9271`
In addition, we are facing significant challenges with meshing for packings of 5,000 or 10,000 beads. We can generate meshes for the smaller cases but not for the larger ones. Our initial `generation.conf` is set with a constant contraction rate of `1.328910e-5` across all steps, as detailed below:

Could you please offer any insights on how to address these frequent warnings? Also, concerning the meshing difficulties with larger packings, might these issues be related to geometric factors such as overlaps or unexpectedly narrow gaps between the beads? Are there any newer techniques or adjustments you could recommend to help us resolve these issues?
Any help would be greatly appreciated!