guido-s / netmeta

Official Git repository of R package netmeta
http://cran.r-project.org/web/packages/netmeta/index.html
GNU General Public License v2.0

Better error message when getting negative variance from multiarm or potential work around #9

Closed ltrainstg closed 2 years ago

ltrainstg commented 2 years ago

Problem

The variance can sometimes still be negative by more than 0.1%, and in this case the analysis appears to complete but provides no solution. I saw the fix for https://github.com/guido-s/netmeta/issues/6, which seems to be the most relevant similar issue, but there is no way to change the cutoff, and no error is raised when the variance is still negative and too large to force to 0.

Request

A better error that stops the analysis, with a message such as 'Calculated multi-arm variance is inconsistent', and perhaps a parameter to increase the tolerance in addition to the two existing ones (tol.multiarm and tol.multiarm.se).
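As a sketch of the requested behavior (the function name and tolerance handling below are hypothetical, not netmeta's API), the check could force small negative variances to zero and stop otherwise:

```r
# Hypothetical sketch, not netmeta code: stop when a multi-arm variance
# stays negative beyond the tolerance instead of silently continuing.
check_multiarm_var <- function(v, tol = 0.1) {
  if (any(v < -tol))
    stop("Calculated multi-arm variance is inconsistent")
  pmax(v, 0)  # negatives within tolerance are forced to zero
}
```

With this, check_multiarm_var(c(0.2, -0.05)) would return c(0.2, 0), while check_multiarm_var(c(0.2, -0.5)) would raise the error instead of producing a blank plot downstream.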

Reproducing

I was looking into an error that occurs when the units of the analysis are scaled up to much larger numbers instead of being analyzed in units of 100,000, and I came across this error. (The use case is that we let users convert units freely, and sometimes they choose nm or some other very small unit.)

Warning messages:
1: In sqrt(1/p0$weights) : NaNs produced
2: In sqrt(1/p1$weights) : NaNs produced
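These warnings come from taking the square root of a negative value: a negative inverse-variance weight makes 1/w negative, and sqrt() then yields NaN. A minimal base-R illustration (not netmeta code):

```r
# Minimal illustration: a negative weight produces the same
# "NaNs produced" warning seen above.
w <- c(2, -3)       # the second weight is negative
se <- sqrt(1 / w)   # Warning message: In sqrt(1/w) : NaNs produced
is.nan(se)          # the negative weight yields NaN
```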

The analysis seemed to work, but when plotting the forest plot it was blank. Provided below is some code for reproducing.

  1. Take a dataset and scale up the mean and standard deviation (Senn2013 was used here).
  2. Run the analysis; this does not seem to trigger tol.multiarm or tol.multiarm.se.
  3. Find a subgroup that returns a negative variance from netmeta's prepare().
  4. Digging into it, we find that netmeta's multiarm() is returning a negative variance.
library(netmeta)
data(Senn2013)

# Conduct fixed effects network meta-analysis
# Note: it scales fine at 10^1, 2, 3, 4, and 7 but not at 5, 6, and 8.
# Not sure why this is; my assumption that we could just linearly scale
# the mean and variance must be wrong, or something odd is going on in
# the matrix algebra.
for (i in 1:8) {
  TE1 <- Senn2013$TE * 10^i
  seTE1 <- Senn2013$seTE * 10^i

  p2 <- netmeta:::prepare(TE1, seTE1, Senn2013$treat1, Senn2013$treat2,
                          Senn2013$studlab)
  print(any(p2$weights <= 0))
}
TE1 <- Senn2013$TE * 10^5
seTE1 <- Senn2013$seTE * 10^5

## Only get a warning at this value and not an error 
net1 <- netmeta(TE1, seTE1,
                Senn2013$treat1.long, Senn2013$treat2.long,
                Senn2013$studlab,
                sm = "MD",
                comb.random = FALSE,
                tol.multiarm = 0.1,
                tol.multiarm.se = 0.1, 
                reference = "Placebo")
## forest is blank
forest(net1)

Looking into the prepare function for the problem subgroup

s <- "Willms1999"
subgraph <- Senn2013[Senn2013$studlab == s, ]
treat1 <- subgraph$treat1
treat2 <- subgraph$treat2
studlab <- subgraph$studlab

TE <- subgraph$TE
seTE <- subgraph$seTE

# Original data passes
p1 <- netmeta:::prepare(TE, seTE, treat1, treat2, studlab)
any(p1$weights <= 0)
# Scaled-up data fails
p2 <- netmeta:::prepare(TE * 10^5, seTE * 10^5, treat1, treat2, studlab)
any(p2$weights <= 0)

I think the ultimate solution might be to just keep the significant figures and rescale to analyze in units of 100,000, but I am still not sure why scaling caused so many issues.
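That workaround could be sketched as follows (the helper and its scaling heuristic are assumptions for illustration, not part of netmeta): divide TE and seTE by a common power of ten before analysis, which leaves the relative inverse-variance weights unchanged.

```r
# Hypothetical helper, not part of netmeta: rescale user-supplied values
# to a moderate magnitude before running the analysis.
rescale_units <- function(TE, seTE) {
  f <- 10^floor(log10(median(abs(seTE))))  # assumed heuristic scale factor
  list(TE = TE / f, seTE = seTE / f, factor = f)
}
```

Since every weight 1/seTE^2 changes by the same factor f^2, the relative weights are preserved, and estimates on the original scale can be recovered by multiplying back by `factor`.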

guido-s commented 2 years ago

Lionel,

Thank you for sharing this issue with netmeta(). The problem was that our internal function invmat() for matrix inversion failed for such large inputs. Using ginv() from R package MASS is one possible remedy in such cases. In fact, ginv() had been used for matrix inversion in netmeta, version 1.3-0; we switched back to our own function in version 1.4-0 due to wrong results for some extreme network structures.
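For intuition (an illustration only, not netmeta's internals): the Laplacian-type matrix arising in network meta-analysis is singular, so a plain solve() fails, while a Moore-Penrose pseudoinverse, which is what ginv() computes, still succeeds. A base-R sketch via SVD:

```r
# Illustration only: Moore-Penrose pseudoinverse via SVD, mimicking what
# MASS::ginv() does, applied to a singular graph-Laplacian matrix.
pinv <- function(A, tol = 1e-10) {
  s <- svd(A)
  d <- ifelse(s$d > tol, 1 / s$d, 0)  # invert only nonzero singular values
  s$v %*% diag(d, nrow = length(d)) %*% t(s$u)
}

L <- matrix(c( 2, -1, -1,
              -1,  2, -1,
              -1, -1,  2), nrow = 3)  # Laplacian of a triangle, singular
inherits(try(solve(L), silent = TRUE), "try-error")  # solve() fails: TRUE
Lp <- pinv(L)
max(abs(L %*% Lp %*% L - L))  # ~0: the pseudoinverse property holds
```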

We added an informative warning in the case of negative variances for a multi-arm study and (more importantly) added an argument func.inverse to netmeta() to select a different R function for calculating the pseudoinverse. Typically, this will be ginv() from R package MASS.

As you can see, ginv() works well for the Senn data set up to a scaling factor of 10^13.

data(Senn2013)
# Scale TE and seTE up by factor 10^i
m.i <- NA
for (i in 1:9) {
  rm(m.i)
  m.i <-
    try(netmeta(10^i*TE,10^i*seTE,treat1,treat2,studlab, data = Senn2013),
        silent = TRUE)
  cat(paste0("*** 10^", i, ": percentage weights under RE model ***\n"))
  if (inherits(m.i, "netmeta"))
    print(round(m.i$w.random / sum(m.i$w.random), 4))
  else
    print(m.i)
}
warnings()
# Use R function ginv() to calculate pseudoinverse
m.i <- NA
for (i in 1:15) {
  rm(m.i)
  m.i <-
    try(netmeta(10^i*TE,10^i*seTE,treat1,treat2,studlab, data = Senn2013,
                func.inverse = MASS::ginv),
        silent = TRUE)
  cat(paste0("*** 10^", i, ": percentage weights under RE model ***\n"))
  if (inherits(m.i, "netmeta"))
    print(round(m.i$w.random / sum(m.i$w.random), 4))
  else
    print(m.i)
}
#
netmeta(10^1 * TE, 10^1 * seTE, treat1, treat2, studlab,
        data = Senn2013, func.inverse = MASS::ginv,
        fixed = FALSE)
#
netmeta(10^13 * TE, 10^13 * seTE, treat1, treat2, studlab,
        data = Senn2013, func.inverse = MASS::ginv,
        fixed = FALSE)

Best, Guido

ltrainstg commented 2 years ago

Thanks for getting to this so quickly. Just tried this out and it covers our use case.