mctools / ncrystal

NCrystal : a library for thermal neutron transport in crystals and other materials
https://mctools.github.io/ncrystal/

Material with large atom numbers will result in error when using the default d-spacing 0.2 Angstrom #160

Open XuShuqi7 opened 6 months ago

XuShuqi7 commented 6 months ago

Hi @tkittel,

I am using NCrystal v3.7.1 on macOS.

The material ncmat file can be found at https://github.com/highness-eu/ncmat-clathrates/blob/main/HighNESS_1088D2O_192O2_sg1_ClathrateHydrate-sII.ncmat. It is composed of over 3000 atoms in a cubic unit cell with a side length of ~35 Angstrom.

When running `ncrystal_inspectfile HighNESS_1088D2O_192O2_sg1_ClathrateHydrate-sII.ncmat`, the result is

```
==> Normalised cfg-string : "HighNESS_1088D2O_192O2_sg1_ClathrateHydrate-sII.ncmat"
Traceback (most recent call last):
  File "/usr/local/bin/ncrystal_inspectfile", line 860, in <module>
    main()
  File "/usr/local/bin/ncrystal_inspectfile", line 807, in main
    sc_obj = NC.createScatter(normcfg)
  File "/usr/local/share/NCrystal/python/NCrystal/core.py", line 1442, in createScatter
    return Scatter(cfgstr)
  File "/usr/local/share/NCrystal/python/NCrystal/core.py", line 1287, in __init__
    self._rawobj_scat = _rawfct['ncrystal_create_scatter'](_str2cstr(cfgstr))
  File "/usr/local/share/NCrystal/python/NCrystal/_chooks.py", line 191, in fcte
    _raise_err()
  File "/usr/local/share/NCrystal/python/NCrystal/_chooks.py", line 169, in _raise_err
    raise e
NCrystal.exceptions.NCCalcError: Combinatorics too great to reach requested dcutoff = 0.2 Aa
```

On the other hand, `ncrystal_inspectfile "HighNESS_1088D2O_192O2_sg1_ClathrateHydrate-sII.ncmat;dcutoff=0.5"` runs correctly.

Therefore, I think it would be good if NCrystal could automatically adjust the dcutoff for materials with a large number of atoms.

Thank you for your consideration.

Shuqi
tkittel commented 6 months ago

Thanks for the report @XuShuqi7 !

Indeed, the heuristics behind the automatic selection of d-spacing cut-offs are extremely crude and were never tested on such large unit cells.

I am, however, going to mark this as a feature request rather than a bug, since it is actually working as designed and documented right now, even though the design might be inadequate :-)

tkittel commented 6 months ago

Admittedly, the documentation could also be clearer. I had to go to the release 2.7.0 announcement page to find it:

> As a result, the heuristics picking a default dcutoff value were updated so that structures with more than 40 atoms per unit cell now default to dcutoff=0.2 (angstrom), as opposed to dcutoff=0.25 before (the default value for other materials remains dcutoff=0.1).

So we could improve it with another wild guess (more than 400 atoms means dcutoff=0.5??). If you have a way to test a few more materials to see when the increased value is actually needed, that would be useful.

XuShuqi7 commented 6 months ago

From my side, I only have clathrate hydrates, which possess a large number of atoms (around 3000). However, I think the selection of dcutoff also depends on the "regularity" of the crystalline structure: in extreme cases, one can have a large lattice containing a large number of atoms, some of which are shifted slightly from their regular positions. In that case, dcutoff=0.5 may not be adequate.

Hence, I think a better way might be to prompt the user to adjust the dcutoff when the number of atoms exceeds some threshold, 400 for example. What do you think?

tkittel commented 6 months ago

Yes, I completely agree that one cannot simply predict an appropriate dcutoff value from the number of atoms. My idea was that, in the absence of a good model able to predict a sensible dcutoff value for any unit cell, I would choose a model which was at least very simple to describe in words (i.e. 0.1Å or 0.2Å depending on whether or not natoms>=40). Of course, such a simple model is bound to fail.

The best solution might be to have a better model for predicting the dcutoff, but coming up with such a solution would take some time.

So a more realistic solution in the short term might simply be to make the error message clearer? I.e. instead of saying `Combinatorics too great to reach requested dcutoff = 0.2 Aa` it could say `Combinatorics too great to reach requested dcutoff = 0.2 Aa (you can try to adjust the value with the dcutoff parameter)`, or something like that?
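The clarified message could be built along these lines (a sketch only; the exact wording and the helper name `combinatorics_error_msg` are made up for illustration, not NCrystal code):

```python
def combinatorics_error_msg(dcutoff):
    """Compose the error message with an actionable hint appended."""
    return (f"Combinatorics too great to reach requested dcutoff = {dcutoff} Aa"
            " (you can try to adjust the value with the dcutoff parameter)")

print(combinatorics_error_msg(0.2))
```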

XuShuqi7 commented 6 months ago

The modification for the error message looks good 👍.

I just went back to the equations for the calculation of the coherent elastic scattering cross section. I think to some extent we are dealing with a convergence problem of a form similar to $\sum_i\dfrac{\sigma_i}{E}$. However, as you mentioned, a more refined model needs some time to investigate and implement.
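For context, the standard powder-averaged coherent elastic (Bragg) cross section per atom has the form (notation assumed here: $V_0$ unit-cell volume, $N$ atoms per cell, $F_{hkl}$ the structure factor, $\lambda$ the neutron wavelength):

$$\sigma_{\mathrm{coh,el}}(\lambda) \;=\; \frac{\lambda^{2}}{2 N V_{0}} \sum_{\substack{hkl \\ 2 d_{hkl} \ge \lambda,\; d_{hkl} \ge d_{\mathrm{cutoff}}}} d_{hkl}\,\lvert F_{hkl}\rvert^{2}$$

Since $E \propto 1/\lambda^{2}$, each contributing plane adds a term proportional to $1/E$, so lowering $d_{\mathrm{cutoff}}$ admits ever more (individually small) terms into the sum, which is presumably the convergence behaviour alluded to above.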

tkittel commented 2 weeks ago

Hi @XuShuqi7. I am not sure which particular convergence you have in mind, but a more careful analysis would certainly be welcome.

For the next release I will update the error message as discussed :-)