Open mjkramer opened 4 months ago
I think it could be that with a threshold of 0 you get `max_mc_truth_ids` segments for each of the 1000 ticks in the waveform for each of the SiPMs, whereas with a non-zero threshold you get `max_mc_truth_ids` segments only for the ticks around the peak, which is maybe 200 ticks on average and about a third of the SiPMs. On the other hand, this should not be the case, because `sum_light_signals` has an explicit check that a segment contributes more than 0 photons to the tick.
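To illustrate the hypothesis, here is a minimal sketch (not the actual larnd-sim code; `count_truth_entries`, the tick/segment counts, and the contribution values are all made up for illustration). The point is that a strict `> 0` check can still keep an entry for nearly every tick if contributions are tiny but nonzero at the float level:

```python
import numpy as np

def count_truth_entries(contributions, threshold, max_mc_truth_ids=16):
    """Count the (tick, segment) truth entries that survive a strict
    per-contribution threshold, capped at max_mc_truth_ids per tick.
    Hypothetical sketch, not the real larnd-sim backtracking code."""
    kept = 0
    for tick_contrib in contributions:  # one row per waveform tick
        above = tick_contrib[tick_contrib > threshold]
        kept += min(above.size, max_mc_truth_ids)
    return kept

rng = np.random.default_rng(0)
# 1000 ticks x 50 segments; most contributions are tiny but nonzero,
# mimicking float-level leakage into every tick
contrib = rng.exponential(1e-12, size=(1000, 50))
# a "peak" region of ~200 ticks where 5 segments carry real signal
contrib[400:600, :5] += rng.exponential(1.0, size=(200, 5))

n_zero = count_truth_entries(contrib, 0.0)   # every tick saturates the cap
n_eps = count_truth_entries(contrib, 1e-8)   # only the peak region survives
```

With threshold 0 the per-tick cap is hit on every tick, while any epsilon above the float-level leakage collapses the output to the peak region, which is the same qualitative behavior as the large size difference measured below.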
Edit: the size of the light truth backtracking array with a threshold of 0 is (275036408,), while the size with a threshold of 1 is (2965806,); the difference is a factor of ~100.
Edit 2: of the 275036408 entries, only 74921 have a `pe_current` of 0.
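A quick check of the arithmetic on those numbers (sizes taken directly from the edits above):

```python
# Sizes of the light truth backtracking array reported above
size_thresh_0 = 275_036_408   # mc_truth_threshold = 0
size_thresh_1 = 2_965_806     # mc_truth_threshold = 1

ratio = size_thresh_0 / size_thresh_1  # ~93, i.e. roughly a factor of 100

# Entries with pe_current == 0 at threshold 0
n_zero_pe = 74_921
zero_pe_fraction = n_zero_pe / size_thresh_0  # ~0.03%
```

So at threshold 0 almost none of the extra entries are exactly zero: nearly all of them are tiny-but-nonzero contributions, which is consistent with the explicit `> 0` photon check not filtering them out.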
During production of the first beta of MiniRun5, we saw a major slowdown of larnd-sim, with projected run times of 3-4 hours per file (~200 spills). This was resolved by editing `2x2_mod2mod_variation.yaml` to increase `mc_truth_threshold` from zero to `0.001`, which resulted in more reasonable run times of 20-25 minutes. In further testing, we've seen that this threshold can be made ridiculously close to zero while maintaining good run time: going down to `0.00000001` only increased the run time by a few minutes, and going down to `0.0000000001` still wasn't enough to double the run time.

So the questions are:
1. Why does a `mc_truth_threshold` of exactly zero lead to an order-of-magnitude slowdown, while a threshold of "epsilon" leads to little-to-no slowdown?
2. If there is a fundamental reason that the code can't be fixed to support a threshold of zero, what value should we use? Is 0.001 "not low enough"?
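For reference, the change that worked around the slowdown would look something like this (hypothetical excerpt of `2x2_mod2mod_variation.yaml`; only the key name comes from this issue, the surrounding structure is assumed):

```yaml
# 2x2_mod2mod_variation.yaml (excerpt, hypothetical)
mc_truth_threshold: 0.001  # was 0; values down to ~1e-10 also keep run times sane
```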
In the short term we should push a small-but-nonzero threshold to `develop`.

@YifanC @marjoleinvannuland @russellphysics