Hello all,

I wanted to let you know about something I learned about INCC last Friday.
Following repairs to the bottom bank of the CMR shuffler, we powered the system back up and began taking a background measurement. The first runs showed an “HV Fail 2” error but also showed normal counts. That combination was confusing, and it took us a few minutes to identify the issue: the shuffler detectors are powered by independent HV units (which were turned on), while INCC was reporting that the AMSR HV (which is not used by this system) was turned off. By the time we sorted that out, we had roughly a dozen “HV Fail 2” runs. As shown by the number of cycles remaining, INCC ignored those runs, so we continued running the background to the full 120 x 30 s good cycles.
Something interesting happened on Friday when I was certifying the drums measured that day. When I’m doing certifications, I normally acquire that day’s background from the database, and then do the same with the verification measurements, to check the results. However, when I acquired the background this time, I discovered that the rate did not match the one reported for the original measurement. On examining the new background report, I realized that INCC had included the runs originally marked as “HV Fail 2”. Since the rates observed in those runs were consistent with the others, the new background was not statistically different from the original, but it certainly did not match the background that had been reported and recorded.
As it turned out, there was no way to reproduce the results of the original verification measurement. Including all runs of the original background set gave a slightly different answer than before. Manually entering the original background was not a solution either, because those rates were only available (from the original report and the logbook) to 3 decimal places, which is less precision than INCC uses internally for background subtraction, so again the result came out slightly different.
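To see why the 3-decimal-place rates couldn’t reproduce the original result, here is the arithmetic with made-up numbers (nothing below is an actual CMR rate; it’s just a Python illustration of the rounding effect):

    # Hypothetical rates, purely for illustration - not the actual measurement values.
    full_precision_bkg = 12.345678            # counts/s, carried to more decimals internally
    logged_bkg = round(full_precision_bkg, 3) # 12.346 counts/s, as it appears in the report/logbook

    gross_rate = 250.123456                   # counts/s from a hypothetical verification run

    net_full = gross_rate - full_precision_bkg   # 237.777778
    net_logged = gross_rate - logged_bkg         # 237.777456

    print(f"difference in net rate: {net_full - net_logged:.6f} counts/s")   # ~0.000322

The difference is tiny, but it is enough that the recomputed verification result will never match the recorded one digit for digit.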
So the lessons learned here were:
1. When acquiring data from the database, runs that were invalidated during collection from the DAQ may be included in the analysis (see the sketch after this list). Does this mean that the “HV Fail 2” code is not stored persistently with the run data? Presumably, data collected with the HV actually at zero would still fail other QC tests.
2. If you want to be able to reload a background measurement from the database later, it is best to abort a measurement that contains “HV Fail 2” cycles with normal counts and start over, rather than rely on INCC continuing to ignore those cycles.
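To make the first lesson concrete, here is a minimal sketch in Python (the rates, statuses, and field names are invented for illustration; this is not INCC’s data model) of the difference between averaging only the good cycles and averaging everything that comes back from the database:

    # Hypothetical cycle records - values and "status" labels are made up.
    cycles = [
        {"rate": 12.31, "status": "OK"},
        {"rate": 12.35, "status": "HV Fail 2"},   # flagged during acquisition, counts look normal
        {"rate": 12.28, "status": "OK"},
        {"rate": 12.33, "status": "OK"},
    ]

    good_rates = [c["rate"] for c in cycles if c["status"] == "OK"]
    all_rates = [c["rate"] for c in cycles]

    print(f"background from good cycles only:  {sum(good_rates) / len(good_rates):.4f}")
    print(f"background including flagged runs: {sum(all_rates) / len(all_rates):.4f}")

When the flagged cycles have normal counts, the two averages are close but not identical, which is exactly why the re-acquired background didn’t match the recorded one.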
Georgiana, rather than create potential confusion down the road by handing in certifications whose data didn’t match what was recorded in your records, I relied on the reports from your original verifications to certify the drums measured on 6/28.