Closed ltrainstg closed 2 years ago
Lionel,
Thank you for sharing this issue with netmeta(). The problem was that our internal function invmat() for matrix inversion failed for such large inputs. Using ginv() from R package MASS is one possible remedy in such cases. In fact, ginv() was used for matrix inversion in R package netmeta up to version 1.3-0; we switched back to our own function in netmeta, version 1.4-0, because ginv() produced wrong results for some extreme network structures.
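As a side note on why a generalised inverse helps here: the matrices involved can be singular or nearly singular, and plain solve() breaks down in exactly those cases, while ginv() computes the Moore-Penrose pseudoinverse via a singular value decomposition. A minimal illustration (a toy Laplacian-style matrix, not the netmeta internals):

```r
library(MASS)  # for ginv()

# A singular, Laplacian-like matrix (rows sum to zero); solve() cannot
# invert it.
L <- matrix(c( 2, -1, -1,
              -1,  2, -1,
              -1, -1,  2), nrow = 3)

res <- try(solve(L), silent = TRUE)
inherits(res, "try-error")  # TRUE: L is exactly singular

# ginv() returns the Moore-Penrose pseudoinverse instead, and it behaves
# predictably under extreme rescaling: ginv(c * L) = ginv(L) / c.
L.pinv <- ginv(L)
max(abs(L %*% L.pinv %*% L - L))          # ~ 0: pseudoinverse property holds
max(abs(ginv(1e13 * L) - L.pinv / 1e13))  # ~ 0: scaling behaves linearly
```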
We added an informative warning in the case of negative variances for a multi-arm study and (more significantly) added an argument func.inverse to netmeta() in order to select a different R function to calculate the pseudoinverse. Typically, this will be ginv() from R package MASS.
As you can see, ginv() is working well for the Senn data set up to a scaling factor of 10^13.
data(Senn2013)
# Scale TE and seTE up by factor 10^i
m.i <- NA
for (i in 1:9) {
  rm(m.i)
  m.i <-
    try(netmeta(10^i * TE, 10^i * seTE, treat1, treat2, studlab,
                data = Senn2013),
        silent = TRUE)
  cat(paste0("*** 10^", i, ": percentage weights under RE model ***\n"))
  if (inherits(m.i, "netmeta"))
    print(round(m.i$w.random / sum(m.i$w.random), 4))
  else
    print(m.i)
}
warnings()
# Use R function ginv() to calculate pseudoinverse
m.i <- NA
for (i in 1:15) {
  rm(m.i)
  m.i <-
    try(netmeta(10^i * TE, 10^i * seTE, treat1, treat2, studlab,
                data = Senn2013, func.inverse = MASS::ginv),
        silent = TRUE)
  cat(paste0("*** 10^", i, ": percentage weights under RE model ***\n"))
  if (inherits(m.i, "netmeta"))
    print(round(m.i$w.random / sum(m.i$w.random), 4))
  else
    print(m.i)
}
#
netmeta(10^1 * TE, 10^1 * seTE, treat1, treat2, studlab,
        data = Senn2013, func.inverse = MASS::ginv,
        fixed = FALSE)
#
netmeta(10^13 * TE, 10^13 * seTE, treat1, treat2, studlab,
        data = Senn2013, func.inverse = MASS::ginv,
        fixed = FALSE)
Best, Guido
Thanks for getting to this so quickly. Just tried this out and it covers our use case.
Problem
The variance can sometimes still be negative by more than the 0.1% cutoff, and in that case the analysis seems to complete but provides no solution. I saw the fix for https://github.com/guido-s/netmeta/issues/6 and it seems like the most relevant similar issue, but there is no way to change the cutoff, and no error is raised if the variance is still negative and too large to force to 0.
Request
A clearer error that stops the analysis, such as 'Calculated multi-arm variance is inconsistent', and perhaps a parameter to increase the tolerance in addition to the other two.
Reproducing
I was looking into an error that occurs when the units of the analysis are scaled up to much larger numbers instead of being analyzed in 100,000s of units. (The use case is that we allow users to convert units freely, and sometimes they choose nm or some other very small unit.)
The analysis seemed to work, but when plotting the forest plot it was blank. Provided below is some code for reproducing.
Looking into the prepare function for the problem subgroup
I think the ultimate solution might be to just keep the significant figures and rescale to analyze in 100,000s of units, but I am still not sure why scaling caused so many issues.
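For what it's worth, the rescale-and-back-transform idea is exact for anything that is linear in TE, because inverse-variance weights are unchanged up to a common factor. A small self-contained sketch of the principle with a plain pairwise pooled estimate (the data and the pool() helper below are hypothetical, not netmeta code; with netmeta one would analogously divide TE and seTE by the factor and multiply the resulting estimates back):

```r
# Hypothetical toy data; pool() is a plain inverse-variance pooled estimate.
TE   <- c(0.5, 0.8, 0.3)
seTE <- c(0.2, 0.3, 0.1)
pool <- function(TE, seTE) sum(TE / seTE^2) / sum(1 / seTE^2)

scale <- 1e13
pooled.orig     <- pool(TE, seTE)
pooled.rescaled <- pool(TE / scale, seTE / scale) * scale

all.equal(pooled.orig, pooled.rescaled)  # TRUE: rescaling is exact here
```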