None of our weighted DIM estimators have the same standard errors as lm_robust(), except for the clustered and block-clustered cases, which match because they farm all the work out to lm_robust (there may still be degrees-of-freedom discrepancies anyway).
Currently, I am able to replicate weights::wtd.t.test(), and my fits match the estimates I've been able to find online, but this doesn't match lm_robust(..., se_type = "HC2").
devtools::install_github("DeclareDesign/estimatr", ref = "weightdim")
n <- 8
dat <- data.frame(y = rnorm(n), z = c(0, 1), w = runif(n))
lm_robust(y ~ z, data = dat, weights = w)
difference_in_means(y ~ z, data = dat, weights = w)
with(dat, weights::wtd.t.test(y[z == 1], y[z == 0], w[z == 1], w[z == 0]))
What underlies weights::wtd.t.test() are the same weighted means and variances we get from SDMTools that @acoppock linked in Slack.
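For reference, here is a minimal sketch of the frequency-weighted mean and unbiased weighted variance I believe both packages compute; the exact normalization (the "reliability weights" correction) is my reading, not confirmed against their source:

```r
# Weighted mean: sum(w * x) / sum(w).
wt_mean <- function(x, w) sum(w * x) / sum(w)

# Unbiased weighted variance: the denominator sum(w) - sum(w^2)/sum(w)
# reduces to n - 1 when all weights are equal, so this agrees with var()
# in the unweighted case (this normalization is an assumption).
wt_var <- function(x, w) {
  xbar <- wt_mean(x, w)
  sum(w * (x - xbar)^2) / (sum(w) - sum(w^2) / sum(w))
}
```

Note these are invariant to rescaling the weights, which is presumably why wtd.t.test() gives the same answer whether or not weights are normalized, while a regression-based HC2 estimator treats the weights differently.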
We can:
a) figure this out
b) farm all weighted estimation out to lm_robust, erroring if there are both matched pairs and weights, since lm_robust can't accommodate that case (Imai et al. discuss weights, but for specific estimands)
c) error whenever someone tries weighted d-i-m for now, and either (i) do nothing else or (ii) transparently tell them they should use lm_robust