Closed · thomas-muench closed this issue 4 years ago
To fix this, I would suggest checking whether N is smaller than the maximum injection number found across the entire measurement file. If so, the data frame of estimated memory coefficients should be padded with 1 to match the required length. This is a reasonable approach, since the issue is only expected to occur for large maximum sample injection numbers, for which the estimated memory coefficients would normally already be very close to 1.

This should be implemented directly within `calculateMemoryCoefficients()`; see also #38.
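The padding idea above can be sketched as follows. Since the project itself is R, this is only a Python/pandas analogue; the column names `injection` and `memory_coefficient` are hypothetical stand-ins for whatever the actual coefficient table uses.

```python
import pandas as pd

def pad_memory_coefficients(coeffs: pd.DataFrame, max_injections: int) -> pd.DataFrame:
    """Pad a table of estimated memory coefficients with 1
    so that it covers injections 1..max_injections.

    Assumes `coeffs` has one row per injection number, ordered
    1..N, with columns `injection` and `memory_coefficient`
    (hypothetical names, not the actual piccr columns).
    """
    n = len(coeffs)
    if n >= max_injections:
        return coeffs
    # Injections beyond N get a coefficient of exactly 1,
    # i.e. no memory correction is applied to them.
    padding = pd.DataFrame({
        "injection": range(n + 1, max_injections + 1),
        "memory_coefficient": 1.0,
    })
    return pd.concat([coeffs, padding], ignore_index=True)
```

With N = 3 estimated coefficients and a file whose maximum injection number is 5, the padded table has five rows, the last two carrying a coefficient of 1.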
Is this something that you can do or do you need my help?
I will give it a try and implement the fix, but likely not until next week.
The maximum number of estimated memory coefficients is based on the maximum number of injections of the standards in standard block 1 at the beginning of the measurement. Let this number be N. If a normal sample, or any standard in the other blocks, is injected more than N times, these additional injections are not corrected in the current memory correction setup, since sample data and memory coefficients are merged by an `inner_join` on the injection numbers (see `applyMemoryCorrection()`). Thus, these injections are ignored and will be lost from the measurement data after memory correction.
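A minimal sketch of why the inner join loses data, written as a Python/pandas analogue of the R `inner_join` (the column names `injection`, `d18O`, and `memory_coefficient` are invented for illustration):

```python
import pandas as pd

# A sample measured with 5 injections (hypothetical values).
samples = pd.DataFrame({
    "injection": [1, 2, 3, 4, 5],
    "d18O": [-20.10, -20.30, -20.20, -20.25, -20.22],
})

# Memory coefficients were only estimated for N = 3 injections,
# because standard block 1 had at most 3 injections per standard.
coeffs = pd.DataFrame({
    "injection": [1, 2, 3],
    "memory_coefficient": [0.80, 0.95, 0.99],
})

# Inner join on injection number: injections 4 and 5 have no
# matching coefficient row and are silently dropped.
inner = samples.merge(coeffs, on="injection", how="inner")

# A left join instead keeps all injections; the missing
# coefficients can then be filled with 1, which has the same
# effect as padding the coefficient table itself.
left = samples.merge(coeffs, on="injection", how="left")
left["memory_coefficient"] = left["memory_coefficient"].fillna(1.0)
```

Here `inner` retains only 3 of the 5 injections, while `left` keeps all 5 with a coefficient of 1 for the extra ones.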