XuShuqi7 opened this issue 6 months ago
Thanks for the report @XuShuqi7 !
Indeed, the heuristics behind the automatic selection of d-spacing cut-offs are extremely crude and were never tested on such large unit cells.
I am, however, going to mark this as a feature request rather than a bug, since it is actually working as designed and documented right now, even though the design might be inadequate :-)
Admittedly, the documentation could also be clearer. I had to go to the release 2.7.0 announcement page to find it:
As a result, the heuristics picking a default dcutoff value were updated so that structures with more than 40 atoms per unit cell now default to dcutoff=0.2 (angstrom), as opposed to dcutoff=0.25 before (the default value for other materials remains dcutoff=0.1).
So we could improve it with another wild guess (more than 400 atoms means dcutoff=0.5??). If you have a way to test a few more materials to see when the increased value might be needed, that could be useful.
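For illustration only, the heuristic being discussed can be sketched as a simple tiered lookup. This is a hypothetical sketch, not NCrystal's actual implementation; the function name and the 400-atom tier are assumptions taken from the "wild guess" above:

```python
def default_dcutoff(natoms: int) -> float:
    """Hypothetical sketch of the default d-spacing cut-off heuristic.

    Per the release 2.7.0 notes quoted above: 0.1 Aa by default, 0.2 Aa
    for unit cells with more than 40 atoms. The >400 tier is the
    proposed (untested) extension for very large cells.
    """
    if natoms > 400:
        return 0.5  # proposed "wild guess" tier for very large cells
    if natoms > 40:
        return 0.2  # large-cell default since release 2.7.0
    return 0.1      # default for ordinary materials

print(default_dcutoff(3000))  # a ~3000-atom clathrate would hit the new tier
```

As noted below, any such natoms-only model is bound to fail for some structures; the sketch just makes the current tiers explicit.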
From my side, I only have clathrate hydrates, which contain a large number of atoms (around 3000). I also think the selection of dcutoff depends on the "regularity" of the crystalline structure: in extreme cases, one can have a large lattice containing a large number of atoms, some of which are shifted slightly from their regular positions. In that case, dcutoff=0.5 may not be adequate.
Hence, I think a better way may be to prompt the user to adjust dcutoff when the number of atoms exceeds, say, 400. What do you think?
Yes, I completely agree that one cannot simply predict an appropriate dcutoff value from the number of atoms. My idea was that, in the absence of a good model able to predict a sensible dcutoff value for any unit cell, I would at least choose a model that is very simple to describe in words (i.e. 0.1Å or 0.2Å depending on whether or not natoms>=40). Of course, such a simple model is bound to fail.
The best solution might be to have a better model for predicting the dcutoff, but coming up with such a solution would take some time.
So a more realistic short-term solution might simply be to make the error message clearer? I.e. instead of saying
`Combinatorics too great to reach requested dcutoff = 0.2 Aa`
it could say
`Combinatorics too great to reach requested dcutoff = 0.2 Aa (you can try to adjust the value with the dcutoff parameter)`
or something like that?
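For concreteness, the two message variants differ only in an appended hint. A minimal sketch (written in Python for brevity; the actual message is produced inside NCrystal's core, and the function name here is purely illustrative):

```python
def combinatorics_error_msg(dcutoff: float, with_hint: bool = True) -> str:
    """Build the 'combinatorics too great' message; the optional hint
    suffix is the clarification proposed above (illustrative sketch,
    not NCrystal's actual code)."""
    msg = f"Combinatorics too great to reach requested dcutoff = {dcutoff:g} Aa"
    if with_hint:
        msg += " (you can try to adjust the value with the dcutoff parameter)"
    return msg

print(combinatorics_error_msg(0.2))
```

The appeal of this approach is that it costs nothing and points users who hit the limit directly at the knob they need to turn.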
The modification for the error message looks good 👍.
I just went back to the equations for the calculation of the coherent elastic scattering cross section. I think to some extent we are dealing with a convergence problem of a form similar to $\sum_i\dfrac{\sigma_i}{E}$. However, as you mentioned, a more refined model will take some time to investigate and implement.
Hi @XuShuqi7. I am not sure which particular convergence you have in mind, but a more careful analysis would certainly be welcome.
For the next release I will update the error message as discussed :-)
Hi @tkittel,
I am using NCrystal v3.7.1 on macOS.
The material's ncmat file can be found at https://github.com/highness-eu/ncmat-clathrates/blob/main/HighNESS_1088D2O_192O2_sg1_ClathrateHydrate-sII.ncmat; it is composed of over 3000 atoms in a cubic lattice with a side length of ~35 angstrom.
When running

```
ncrystal_inspectfile HighNESS_1088D2O_192O2_sg1_ClathrateHydrate-sII.ncmat
```

the result is: