Closed by altavir 2 years ago
@gerudo7 I think if we solve our previous issues about checking all input files (#35 ) and checking the resulting distributions in the Python testing (#65 ), that will mostly satisfy this issue.
Also, for the Python tests we can probably just write output data files and check the distributions instead of computing checksums on the .pngs.
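A minimal sketch of what such a distribution check could look like, using only numpy. The file names, the plain-text sample format, and the `0.05` tolerance are all assumptions for illustration, not the project's actual test layout:

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic:
    maximum distance between the two empirical CDFs."""
    a, b = np.sort(a), np.sort(b)
    data = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, data, side="right") / len(a)
    cdf_b = np.searchsorted(b, data, side="right") / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

def check_distribution(output_file, reference_file, tolerance=0.05):
    """Compare a freshly generated sample (one value per line)
    against a stored reference sample from a known-good run."""
    new = np.loadtxt(output_file)
    ref = np.loadtxt(reference_file)
    return ks_statistic(new, ref) < tolerance
```

The point of comparing samples statistically rather than byte-for-byte is that the test stays stable under changes to RNG seeding or float rounding, while still catching genuine shifts in the simulated physics.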
@gerudo7 for comparison with other algorithms I can try to set up a comparison with Geant4 which is basically the only other code I know of that implements something like the NR recoils after capture.
Noting here that #77 added tests for all of the inputs and #80 added tests for resulting distributions, so I believe the only thing left is the Geant4 comparison testing, which is in progress.
We now have a comparison with G4 and are testing against G4 v4.7.3 in the CI. This was added in PR #98.
The provided paper includes some considerations about how the computations should be made, but there is no comparison with other existing tools. Also, the test suite does not include data-based tests. That means subsequent versions could change the results of the simulation without it being tracked.
Changes in physics results between code versions are a major problem for simulation software in particle physics, so I think there should be an effort to mitigate that, as well as a chapter in the paper about the comparison.
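One common way to mitigate this is to pin reference histograms from a known-good release and fail CI when a new version's binned output drifts. A hedged sketch (the function name and the Poisson-error assumption are mine, not from the project):

```python
import numpy as np

def histogram_chi2(counts_new, counts_ref):
    """Reduced chi-square between two binned spectra, assuming
    independent Poisson counts in each bin; bins empty in both
    histograms are skipped."""
    counts_new = np.asarray(counts_new, dtype=float)
    counts_ref = np.asarray(counts_ref, dtype=float)
    mask = (counts_new + counts_ref) > 0
    chi2 = np.sum((counts_new[mask] - counts_ref[mask]) ** 2
                  / (counts_new[mask] + counts_ref[mask]))
    # Divide by the number of non-empty bins to get chi2 per bin.
    return chi2 / mask.sum()
```

A CI job could then assert that `histogram_chi2(new, ref)` stays below some threshold (e.g. around 1-2 for statistically compatible histograms), and any physics-changing commit would have to update the reference files deliberately.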