Closed: nhejazi closed this issue 6 years ago.
That's an interesting point. It is nearly trivial to incorporate weights into the individual bin learners (i.e., the `glm` logistic regression fit for each bin). It might even be possible to do that by providing weights to a specific `sl3` learner without any modifications (this needs to be verified).
However, I am not sure that this is enough. Can you provide a bit more details about your use-case? What is the role of the weights?
And thank you for opening the issue!
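To illustrate why the per-bin case is nearly trivial: each bin in a pooled-hazard density fit reduces to a logistic regression of a 0/1 bin-membership indicator on the covariates, and `stats::glm` already accepts a `weights` argument. This is only a sketch of the idea, not `condensier`'s actual internals; all variable names below are made up for illustration.

```r
# Sketch: weighting a single bin's logistic-regression learner.
set.seed(1)
n <- 500
W <- rnorm(n)                             # covariate
in_bin <- rbinom(n, 1, plogis(0.5 * W))   # hypothetical 0/1 bin-membership indicator
wts <- runif(n, 0.5, 2)                   # e.g., inverse sampling weights

# glm takes the weights directly, so nothing changes at the learner level;
# quasibinomial() avoids the "non-integer #successes" warning that
# binomial() emits with non-integer weights.
fit <- glm(in_bin ~ W, family = quasibinomial(), weights = wts)
coef(fit)
```

If each per-bin fit looks like this, the remaining work is just threading a user-supplied weights vector from the top-level call down to every bin's learner.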
Thanks for the quick reply.
Hmm, I see that it would be possible to pass the weights into the individual bin learners (assuming the learners support a `weights`-type argument, as `glm` does).
In terms of this particular use case, we've written a TMLE for the stochastic-intervention shift parameter (Iván's), for which we rely on `condensier`, but we'd like to incorporate weights into the procedure to account for an application with case-control sampling. Essentially, this comes down to implementing the IPCW-TMLE of Rose and van der Laan, which requires supplying inverse probability-of-sampling weights at the various steps of the TMLE procedure. We use `condensier` to fit the propensity score, and providing weights there is what's needed to make the case-control correction work out.
For a particular application I've run into, it would be very useful to be able to incorporate a `weights` argument into `fit_density`, similar in style to what's currently available in standard methods like `glm`. This could likely be accomplished rather easily by incorporating the weights into the step where the likelihood is fit within the `fit_density` function. (I'm not familiar with the code base for this package; otherwise, I'd offer a solution via a PR rather than simply opening an issue.)

Would it be possible to incorporate this `weights` argument if it's trivial, @osofr? I'd be happy to help if I can.
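For concreteness, here is roughly what the requested interface might look like from the user's side. To be clear, this is entirely hypothetical: the `weights` argument does not exist in `condensier`, and the other argument names are written from memory of the package's interface and may be off.

```r
library(condensier)

# Hypothetical call: everything is as in a standard fit_density() call,
# except the (proposed, not yet existing) `weights` argument, which would
# carry e.g. inverse case-control sampling weights down to each bin learner.
dens_fit <- fit_density(
  X = c("W1", "W2"),        # conditioning covariates
  Y = "A",                  # outcome whose conditional density is estimated
  input_data = data,
  nbins = 10,
  weights = ipc_weights     # hypothetical: one weight per observation
)
```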