jmboehm opened 4 years ago
I haven't done the algebra, but I would think (hope?) that it's not so hard. In general, you'd have to see how your weights are entering the likelihood function, and whether the equivalence between the iteration step and the weighted linear regression is still present (https://arxiv.org/pdf/1707.01815.pdf). See also this issue in alpaca.
How much effort would this be? I'm currently trying to fit a logistic regression on ~2bn observations, so I need to group the data into a Bernoulli format with trials/successes by group and then fit a model to the success rate, using trial counts as weights. This works with GLM.jl, but I also need 5 fixed effects, one with 10,000 levels, so I'm running into `OutOfMemory` errors. I don't think `alpaca` supports weights (or the alternative `glm` formulation in R where one passes an n-by-2 matrix of successes and failures as the LHS variable). Is there anything in the algorithm used that makes this particularly tricky?
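For what it's worth, the grouping trick itself shouldn't change the algebra: in IRLS the frequency weights just multiply the working weights of the weighted least-squares step, so fitting the raw 0/1 data and fitting grouped success rates with trial counts as weights produce identical Newton iterates. A minimal sketch of that equivalence (hand-rolled IRLS in numpy on toy data, no fixed effects; `fit_logit_irls` is an illustrative helper, not any library's API):

```python
import numpy as np

def fit_logit_irls(X, y, freq_weights=None, n_iter=25):
    """Logistic regression via IRLS; freq_weights multiply each
    observation's contribution to the log-likelihood."""
    n, p = X.shape
    w = np.ones(n) if freq_weights is None else np.asarray(freq_weights, float)
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))
        v = mu * (1.0 - mu)
        W = w * v                      # working weights, scaled by freq weights
        z = eta + (y - mu) / v         # working response
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# Toy data: 0/1 outcomes with repeated covariate patterns.
X_raw = np.column_stack([np.ones(8), [0, 0, 0, 1, 1, 1, 2, 2]])
y_raw = np.array([0, 0, 1, 0, 1, 1, 1, 1], float)

# Ungrouped fit on the raw Bernoulli data.
beta_raw = fit_logit_irls(X_raw, y_raw)

# Grouped fit: one row per covariate pattern, response = success rate,
# weight = number of trials in the group.
patterns, idx = np.unique(X_raw, axis=0, return_inverse=True)
trials = np.bincount(idx).astype(float)
rates = np.bincount(idx, weights=y_raw) / trials
beta_grp = fit_logit_irls(patterns, rates, freq_weights=trials)

print(np.allclose(beta_raw, beta_grp))  # the two fits coincide
```

So as long as the fixed-effects solver exposes a weights argument that enters the WLS step this way, the grouped formulation should come for free.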