Adding plot = FALSE to the call doesn't crash, and shows the following:
> ll@data
$fading_results
FIT MEAN SD Q_0.025 Q_0.16 Q_0.84 Q_0.975 TC G_VALUE_2DAYS
1 NA NA NA NA NA NA NA NA NA
G_VALUE_2DAYS.ERROR T_0.5_INTERPOLATED T_0.5_PREDICTED T_0.5_PREDICTED.LOWER
1 NA NA NA NA
T_0.5_PREDICTED.UPPER UID
1 NA 2024-10-02-06:42.0.632379611488432
$fit
[1] "Error in lm.fit(x, y, offset = offset, singular.ok = singular.ok, ...) : \n 0 (non-NA) cases\n"
attr(,"class")
[1] "try-error"
attr(,"condition")
<simpleError in lm.fit(x, y, offset = offset, singular.ok = singular.ok, ...): 0 (non-NA) cases>
$rho_prime
MEAN SD Q_0.025 Q_0.16 Q_0.84 Q_0.975
1 NaN NA NA NA NA NA
$LxTx_table
LnLx LnLx.BG TnTx TnTx.BG Net_LnLx Net_LnLx.Error Net_TnTx Net_TnTx.Error
1 2 8.47619 6 13.90476 -6.47619 4.451215 -7.904762 7.328207
LxTx LxTx.Error TIMESINCEIRR TIMESINCEIRR_NORM TIMESINCEIRR_NORM.LOG
1 0.8192771 0.9454945 0 NaN NaN
LxTx_NORM LxTx_NORM.ERROR UID
1 1 1.15406 2024-10-02-06:42.0.632379611488432
$irr.times
[1] 0 0
It's also annoying to get that warning message: analyse_FadingMeasurement() sets combine = TRUE regardless of the number of curves the object has.
I am sorry, I know that this is a mean and annoying warning thrown at the C++ level, purposely bypassing the R warning system.
The reason for this is that one manufacturer produced BIN/BINX files that did not follow the specification, which led to wrong curves when the BIN/BINX file was used instead of the XSYG file. I reported this to the manufacturer a couple of times and it got fixed at some point, but such files still exist for older software versions and old measurements, and I wanted to make sure that this issue does not go unnoticed. It does not matter for normal equivalent dose analyses, but if you do curve deconvolution or similar analyses on these kinds of curves, the result is just rubbish, while the user may think it is science and a new discovery.
> Adding plot = FALSE to the call doesn't crash, and shows the following: [...]
But this makes sense, right? Or do you want to point to something I have overlooked?
> It's also annoying to get that warning message: analyse_FadingMeasurement() sets combine = TRUE regardless of the number of curves the object has.
>
> I am sorry, I know that this is a mean and annoying warning thrown at the C++ level, purposely bypassing the R warning system.
That's fine, I was referring to the actual R warning "[plot_RLum.Analysis()] Nothing to combine, object contains a single curve", which I've now fixed locally.
> But this makes sense, right? Or do you want to point to something I have overlooked?
I thought that all those NA/NaNs were suspicious and might help in finding the issue. There's something wrong with how we normalize this data: we divide by tc, but in our case tc = 0, so shouldn't we check for this before dividing?
I feel that we should be able to stop cleanly even if the user sets plot = TRUE, but I need to figure out what condition to check to avoid the crash.
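To make the NaN pattern above concrete, here is a minimal sketch of what happens when tc is 0; the variable names and the stop() message are illustrative only, not the actual package code:

```r
## With tc = 0 the normalisation of the time since irradiation is 0/0, i.e.
## NaN, and so is its logarithm; this matches the NaN values seen in
## TIMESINCEIRR_NORM and TIMESINCEIRR_NORM.LOG above, and a model fitted on
## these columns ends up with "0 (non-NA) cases".
tc <- 0
time_since_irr <- 0

time_since_irr / tc        # NaN (0/0)
log(time_since_irr / tc)   # NaN

## A possible early check before dividing (message text is illustrative):
if (!is.finite(tc) || tc <= 0)
  stop("'tc' must be a positive value (in seconds)")
```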
Ah OK. From the physical perspective, tc is the time since irradiation and can never be <= 0. So I suggest the following bypass (without warning, without additional check):
tc <- max(1e-6, tc)
This prevents users from doing something stupid without us having to ask and point it out. The value is more than justified: we request tc in seconds, and anything smaller than that is virtually impossible to achieve (even 1 s is out of reach; the fastest possible times are, if at all, > 10 s, more likely > 100 s).
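As a small illustration of the suggested bypass (the numbers and variable names are made up; in the reported case the times since irradiation are themselves 0, so this only shows the effect of the clamp itself):

```r
## Clamp tc to a physically negligible minimum (in seconds) so that the
## normalisation and its logarithm stay finite even if tc comes in as 0.
tc <- 0
t_since_irr <- c(250, 3600, 86400)   # hypothetical times since irradiation [s]

tc <- max(1e-6, tc)                  # the suggested bypass
t_norm <- t_since_irr / tc           # finite (just very large) values
log(t_norm)                          # finite as well, so a fit can proceed
```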
Setting a minimum non-zero value makes sense.
The "object 'L' not found" error comes from this line inside a call to plot_RLum():
https://github.com/R-Lum/Luminescence/blob/3fd3a837252a19d8dbd1eac89bb5c9d5ce1d6621/R/analyse_FadingMeasurement.R#L668
This reminded me of something we had already seen just a few lines further down, and the fix is again the same, that is to wrap the call to expression() in bquote().
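The argument handling in plot_RLum() is more involved than this, but the underlying R behaviour can be sketched as follows (the label content is just an example):

```r
## Sketch of the general behaviour, not of the package internals.
lab <- expression(L[n]/T[n])

## eval() on an expression object evaluates its contents, so the plotmath
## symbol L is looked up as a variable and the call fails:
try(eval(lab))         # Error: object 'L' not found

## Wrapping the expression() call in bquote() defers its construction:
## eval() then merely builds the expression object, which can be passed on
## safely as an axis label or legend text.
lab_safe <- bquote(expression(L[n]/T[n]))
eval(lab_safe)         # expression(L[n]/T[n])
```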