Open · tkittel opened this issue 1 year ago
Interesting topic. I will leave here a couple of ideas:
For the resonant scattering problem, the methodology in Courcelle and Rowlands, 2007 is probably the one best suited to the existing NCrystal infrastructure. I think that paper is a good starting point.
To improve energy dependent absorption, maybe it would be useful to provide an (optional) link to Papillon NDL, perhaps as a plugin. That way McStas and other codes without native ACE support could read and sample ACE files.
Thanks for the input @marquezj ! The Courcelle-Rowlands paper looks interesting, although if I understand it correctly, it does sound like such an approach would be unable to utilise the existing Bragg diffraction infrastructure? Still, it might be something to consider, since the alternative (initialising structure factors at different neutron energies) is also not great.
For energy dependent absorption (and scaling of incoherent scattering) I had in mind something very simple: Simply hardcode (or make tunable) the 50-100 (?) 1D scaling curves for the most affected isotopes (we only care below a few eV), and apply those. Should be super simple and fast, and require <50kB of data. A plugin to read ACE files sounds like a different thing (it might of course still be interesting to have optional ACE support!). Also recall here that absorption in NCrystal and McStas is only ever a total cross section, the outcomes in terms of production of secondaries are never considered, so using more than 100 points on a 1D curve is a bit of a waste :-)
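To make the idea concrete, here is a minimal C++ sketch (hypothetical names, not an existing NCrystal interface) of what such a tabulated 1D scaling curve could look like and how it would be applied on top of the usual 1/v absorption:

```cpp
// Sketch only: apply a tabulated correction factor sigma_true(E)/sigma_1/v(E)
// on top of the standard 1/v absorption cross section, for one affected isotope.
#include <algorithm>
#include <cstddef>
#include <iterator>
#include <vector>

struct AbsorptionScalingCurve {
  std::vector<double> egrid_eV; // ascending energies, covering up to a few eV
  std::vector<double> scale;    // sigma_true / sigma_1overv at those energies

  double evalScale( double ekin_eV ) const {
    if ( ekin_eV <= egrid_eV.front() )
      return scale.front();
    if ( ekin_eV >= egrid_eV.back() )
      return scale.back(); // assumed ~1 above the tabulated range
    auto it = std::upper_bound( egrid_eV.begin(), egrid_eV.end(), ekin_eV );
    std::size_t i = std::distance( egrid_eV.begin(), it );
    double f = ( ekin_eV - egrid_eV[i-1] ) / ( egrid_eV[i] - egrid_eV[i-1] );
    return ( 1.0 - f ) * scale[i-1] + f * scale[i]; // linear interpolation
  }
};

// Usage sketch: sigma_abs(E) = sigma_abs_2200 * sqrt(E_2200/E) * curve.evalScale(E)
```

With ~100 points per curve and well under 100 affected isotopes, this stays within the <50kB estimate above.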
Actually, thinking aloud, the most expensive part of structure factor initialisation is the calculation of neutron wave phases at each unit cell site (i.e. calling `sin` and `cos`). Those should not depend on scattering lengths, but only on site coordinates and d-spacing. Perhaps we could simply cache those before calculating structure factors at different neutron energies?
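A rough C++ sketch of that idea (hypothetical structures, not existing NCrystal code):

```cpp
// Sketch only: cache the scattering-length-independent part of each structure
// factor term, so the expensive sin/cos calls happen once rather than once per
// neutron energy.
#include <cmath>
#include <complex>
#include <vector>

struct CachedSiteTerm {
  std::complex<double> dwTimesPhase; // exp(-W_i(tau)) * exp(i tau.r_i)
  unsigned atomTypeIndex;            // which b_j to multiply in later
};

// The only trigonometric evaluations needed, independent of b_i:
inline std::complex<double> siteTerm( double tau_dot_r, double W_of_tau )
{
  return std::exp( -W_of_tau )
         * std::complex<double>( std::cos( tau_dot_r ), std::sin( tau_dot_r ) );
}

// One vector of cached terms per (hkl) plane, filled once at initialisation:
using CachedPlaneTerms = std::vector<CachedSiteTerm>;
```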
Thinking further about this and elaborating a bit on my previous comment, I think it should actually be quite possible to implement a Bragg (powder) diffraction model which should be reasonably fast at both initialisation (<0.5s?) and later cross section evaluation (>10MHz), at the price of a bit of memory (but <1MB). The point is the math ($b_i$ is the coherent scattering length of the atom at the i'th site, $W_i(\bar q)$ is the Debye-Waller function, equal to $\delta_i^2q^2/2$ in the isotropic case with $\delta_i^2$ the mean-squared displacement of the atom, and $\bar{r}_i$ is the position of the atom in the unit cell):
$$ F_{\bar\tau_{hkl}}=\sum_{i\ \text{(all sites)}}b_i\exp\left[-W_i(\bar\tau_{hkl})\right]\exp\left[i\,\bar\tau_{hkl}\cdot\bar{r}_i\right] $$
We need to ultimately use the square $\left|F_{\bar\tau_{hkl}}\right|^2$, so we get interference terms. Now, the equation above can be broken into sub-sums over all sites with the same scattering length $b_j$, or $b_j(E_{\text{neutron}})$ in case of non-constant scattering lengths:
$$ F_{\bar\tau_{hkl}}=\sum_{j\ \text{(atom types)}}b_j\left(\sum_{i\ \text{(all sites of type $j$)}}\exp\left[-W_i(\bar\tau_{hkl})\right]\exp\left[i\,\bar\tau_{hkl}\cdot\bar{r}_i\right]\right) $$
The calculation of the complex number inside the large parenthesis above is what is most expensive during initialisation, and fortunately the result of it can be expressed as a single (complex) number which does not depend on the scattering length. Therefore, we can pre-calculate these numbers, which should only be about as time-consuming as the work we do to calculate structure factors in a material with constant $b_j$. We can furthermore sum up the terms (including scattering length factors) of any atom types which are appropriate to model with a constant $b_j$.
So to give an example: if for instance we have a very complicated material with 20 different atom types and 1000 unit cell sites, but only 3 of those atom types need to be modelled with energy-dependent scattering lengths, then we simply need to store 4 complex numbers (i.e. 8 doubles) per HKL plane: 1 complex number which contains all the terms in the previous equation from atom types with constant $b_j$, and 1 complex number for each of the three atom types with energy-dependent scattering lengths. So even with, say, 10000 relevant symmetry-nonequivalent-non-systematically-absent planes, that will just consume 640kB of data in a few vectors (which will of course be arranged in memory for optimal CPU/mem latency). It should then be blazingly fast to calculate the actual $\left|F_{\bar\tau_{hkl}}\right|^2$ at a particular neutron energy.
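To illustrate, a minimal C++ sketch of the cached data layout and the per-energy $\left|F_{\bar\tau_{hkl}}\right|^2$ evaluation (hypothetical names, the real code would of course differ):

```cpp
// Sketch only: per-plane cached partial sums, grouped into "everything with
// constant b_j" plus one term per energy-dependent atom type.
#include <complex>
#include <cstddef>
#include <vector>

struct PlanePartialSums {
  // Sum over all sites of atom types modelled with a constant b_j,
  // with those b_j values already multiplied in:
  std::complex<double> constantPart;
  // Partial sums WITHOUT the b_j factor, one per energy-dependent atom type
  // (3 in the example above, so 4 complex numbers = 8 doubles per plane):
  std::vector<std::complex<double>> edepParts;
};

// |F(tau_hkl)|^2 at a given neutron energy, given the (possibly complex,
// e.g. near a resonance) coherent scattering lengths b_j(E):
inline double fsquared( const PlanePartialSums& plane,
                        const std::vector<std::complex<double>>& b_of_E )
{
  std::complex<double> F = plane.constantPart;
  for ( std::size_t j = 0; j < plane.edepParts.size(); ++j )
    F += b_of_E[j] * plane.edepParts[j];
  return std::norm( F ); // |F|^2
}
```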
The next question concerns what to do with those, to arrive at an actual Scatter model for NCrystal which can provide cross-section and sampling for any particular neutron energy provided by client code. There are several avenues here, and I am not done thinking about it. But in the following I will mention a few obvious options.
For an idealised, memory efficient, but slow (xsect eval speed being $O(N_\text{planes})$ rather than the usual $O(\log(N_\text{planes}))$) model, one could then simply cache those 640kB of data and calculate and apply structure factors on-the-fly for any actual neutron energy.
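Continuing the sketch above (still with hypothetical names, and with all constants lumped into a single prefactor), the on-the-fly evaluation would just be the usual powder Bragg sum with the energy-dependent structure factors computed per plane:

```cpp
// Sketch only: O(N_planes) powder Bragg cross section with energy-dependent
// structure factors, reusing PlanePartialSums and fsquared(..) from the
// previous sketch. Planes are assumed sorted by descending d-spacing.
#include <complex>
#include <cstddef>
#include <vector>

double braggXSectOnTheFly( const std::vector<double>& dspacings,
                           const std::vector<double>& multiplicities,
                           const std::vector<PlanePartialSums>& planes,
                           const std::vector<std::complex<double>>& b_of_E,
                           double wavelength, double prefactor )
{
  // Powder Bragg form: sigma(lambda) ~ lambda^2 * sum_{2d>lambda} m * d * |F|^2
  double sum = 0.0;
  for ( std::size_t k = 0; k < dspacings.size(); ++k ) {
    if ( 2.0 * dspacings[k] < wavelength )
      break; // remaining planes have even smaller d-spacing, cannot contribute
    sum += multiplicities[k] * dspacings[k] * fsquared( planes[k], b_of_E );
  }
  return prefactor * wavelength * wavelength * sum;
}
```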
For a faster, slightly less accurate and possibly memory inefficient model, one could simply initialise old-school powder Bragg models at 100 different neutron energies (the exact number adapted to the curves in question and the luxury level required), and then simply interpolate between 2 of those when needed (note that all models share the same d-spacings for all planes, so there will be no nastiness from interpolating across "Bragg edges"; it will all be very smooth). This would be roughly 2 times slower (perhaps in practice 3 times slower) than what we normally do for a material with constant scattering lengths, which means $O(\log(N_\text{planes}))$ and blazingly fast. For sampling, as opposed to mere cross section evaluation, one would have to be a bit more careful, but I think it can also be done.
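A sketch of that interpolation approach, again with hypothetical names and glossing over how the pre-built models are actually represented:

```cpp
// Sketch only: tabulate ordinary Bragg models at ~100 neutron energies and
// linearly interpolate the cross section between the two nearest grid points.
#include <algorithm>
#include <cstddef>
#include <functional>
#include <iterator>
#include <vector>

struct EnergyGriddedBragg {
  std::vector<double> egrid; // ascending neutron energies (eV)
  // Cross-section function of each pre-built model, evaluated at the actual energy:
  std::vector<std::function<double(double)>> models;

  double crossSection( double ekin ) const {
    if ( ekin <= egrid.front() ) return models.front()( ekin );
    if ( ekin >= egrid.back() ) return models.back()( ekin );
    auto it = std::upper_bound( egrid.begin(), egrid.end(), ekin );
    std::size_t i = std::distance( egrid.begin(), it );
    double f = ( ekin - egrid[i-1] ) / ( egrid[i] - egrid[i-1] );
    // All cached models share the same d-spacings, so both terms below have
    // their Bragg edges at identical energies and the interpolation is smooth:
    return ( 1.0 - f ) * models[i-1]( ekin ) + f * models[i]( ekin );
  }
};
```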
It might ultimately be that a combination of the two methods above is most appropriate (using the idealised model for the few longest-d-spacing planes, which we imagine are most critical for many users, and the interpolation model for the rest). I am also not entirely happy with the memory consumption of 100 Bragg diffraction models, but on the other hand it might still be less than an inelastic scattering kernel.
All in all, I don't see any reasons why we can't just go ahead and implement this. Apart from time :-)
As a follow-up to my previous comment, I should mention that part of the time requirements will come from the fact that some refactoring of the structure-factor calculation code will be needed, and there will be some interface changes needed, since our Info objects will no longer simply have-or-not-have HKLInfo, but might now also have energy-dependent HKLInfo, which might require downstream changes in quite a lot of code.
As discussed with various people over the years (@dddijulio , @marquezj , @rozyczko, @AlexanderBackis ... ?) it might be interesting to add some sort of improved modelling of energy-dependent scattering and absorption lengths, at least for those isotopes with low-E resonances. For the case of absorption or incoherent-scattering models, this would be implemented as a simple scaling, but of course there are non-trivial issues with coherent scattering. One option for Bragg diffraction would be to initialise structure factors at multiple E-values (possibly with a finer E-grid in the case of longer-d-spacing planes).
This won't happen this year, but keeping this issue here to remind us and solicit input in the meantime.