ngreifer / WeightIt

WeightIt: an R package for propensity score weighting
https://ngreifer.github.io/WeightIt/

Error when user provides a vector of propensity scores #36

Closed · sky0502 closed this issue 2 years ago

sky0502 commented 2 years ago

Hi,

The motivation for my issue is to discard units outside a region of common support, similar to the discard option in {MatchIt}. Since there is no such option in {WeightIt}, I chose to calculate the propensity scores first and then pass them to weightit(). I encountered an error when a factor has only one level left after discarding units. Please see the example below.

library(WeightIt)
library(cobalt)
data("lalonde", package = "cobalt")

# Estimate propensity scores on the full data
ps_mod <- glm(treat ~ age + educ + race, family = binomial(), data = lalonde)
ps_score <- predict(ps_mod, type = "response")

# Discard the "black" and "hispan" units, leaving only "white" in the subset
discard <- lalonde$race %in% c("black", "hispan")

# Pass the pre-computed propensity scores for the retained units
w_out <- weightit(treat ~ age + educ + race, data = lalonde[!discard, ], ps = ps_score[!discard])
# Error in `contrasts<-`(`*tmp*`, value = contr.funs[1 + isOF[nn]]) :
#   contrasts can be applied only to factors with 2 or more levels

I looked into the source code and discovered that the error is produced by model.matrix() in get.covs.and.treat.from.formula() when weightit2ps() is called. However, I didn't see this design matrix being used afterwards, though I could be wrong. Is there a way to get around the error?

Thanks!

ngreifer commented 2 years ago

This is because you include a factor variable (race) as a predictor which, after subsetting, has only one level ("white") left. R cannot handle factors with only one level. This is not specific to WeightIt; the same thing will happen if you try to use lm(), glm(), or any other related function with the same input.
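
For example, the same error appears if you run the equivalent glm() call directly on the subsetted data (using the objects from your reprex above):

# glm() drops unused factor levels when it builds the model frame, so race
# becomes a one-level factor and the contrasts step fails with the same error
glm(treat ~ age + educ + race, family = binomial(), data = lalonde[!discard, ])
# Error in `contrasts<-`(`*tmp*`, value = contr.funs[1 + isOF[nn]]) :
#   contrasts can be applied only to factors with 2 or more levels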

To solve the issue, don't include the subsetting variable in the model formula if it has only one level after subsetting. I've edited the internal code of weightit() to avoid this problem in the future using the solution described here, but it might be a while before the updated version is uploaded to CRAN.
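
In your example, that means dropping race from the formula for the subsetted data while still passing the propensity scores estimated on the full data; something like this should run without the error (a sketch, untested against your data):

# race has only one observed level in the subset, so leave it out of the
# formula; the supplied propensity scores already account for race
w_out <- weightit(treat ~ age + educ, data = lalonde[!discard, ],
                  ps = ps_score[!discard])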

sky0502 commented 2 years ago

Thank you so much!