mattwarkentin opened 1 year ago
Just wanted to drop a note that I've seen this and appreciate the thorough issue description! Related to #54. We've been benchmarking some variants on this generalization and still have some work to do before we'd feel confident moving forward with an implementation.
Related StackOverflow post: https://stackoverflow.com/questions/77872965/using-user-defined-weights-for-an-ensemble-model
Another interesting post, on doing ensembles manually: https://www.mm218.dev/posts/2021/01/model-averaging/
Seconding this feature request! Stacks is beautifully fast, but I'd love a native way to build a stacked ensemble from a trained workflowsets object that uses `finetune::tune_race_anova()` and `finetune::control_race()`. 🙏
Hi @simonpcouch,

I love the `{stacks}` package, and over the last several months I have thought about it quite a bit and whether there is room to broaden the API to be more flexible. It seems to me that the current API is opinionated on a few things:

- the choice of stacking (meta-learner) model
- how that model is tuned (i.e., via `tune::tune_grid()`)
- the resampling scheme used to fit the stacking model
I am wondering if there is interest in a function one level lower than `blend_predictions()` that is more flexible on the three design considerations described above. Most importantly, a more general API for stacking would allow users to take advantage of the huge breadth of models available through `{parsnip}` et al. for stacking predictions (e.g., random forest, XGBoost, etc.). In theory, any model that supports the relevant mode would be a candidate for the stacking model.

Without actually considering the implementation too much, I imagine some function, let's call it `stack_predictions()` (because I don't have a better name off the top of my head), that exposes those three choices directly.

What do you think? This way the user can control the stacking more finely, and `blend_predictions()` would be a special case of `stack_predictions()` that could potentially call it internally. That way, if you wanted to stack with a random forest, tune with `{finetune}`, and use 100 Monte Carlo resamples, you could.

I have thought about this a few times and figured it was worth going full stream of consciousness and laying it all out for you to think about. Happy to chat more and think about this more thoroughly. As always, I'm happy to contribute and not just request features.
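Purely as an illustration of the idea above, a sketch of what that lower-level call might look like. To be clear: `stack_predictions()` does not exist, and every argument name here is hypothetical; `rf_res`, `xgb_res`, and `stacking_data` are stand-ins for objects the user would already have. Only `stacks()`, `add_candidates()`, `fit_members()`, `rand_forest()`, `mc_cv()`, and `finetune::tune_race_anova` are real tidymodels API.

```r
library(stacks)
library(parsnip)
library(rsample)

# Hypothetical API sketch -- not runnable against current {stacks}
st <-
  stacks() |>
  add_candidates(rf_res) |>
  add_candidates(xgb_res) |>
  stack_predictions(
    # 1) the meta-learner: any {parsnip} model with a matching mode
    meta_learner = rand_forest(mtry = tune(), trees = 500) |>
      set_mode("regression"),
    # 2) how the meta-learner is tuned
    tune_fn = finetune::tune_race_anova,
    # 3) the resampling scheme used to fit the stacking model
    resamples = mc_cv(stacking_data, times = 100)
  ) |>
  fit_members()
```

Under this framing, `blend_predictions()` is just the special case where the meta-learner is the current regularized linear model, tuning is `tune::tune_grid()`, and the resamples are bootstraps.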
While I'm thinking about it: in order to support more stacking models, there needs to be a way to define what it means to have a "non-zero stacking coefficient" for models in which coefficients don't really exist (e.g., random forests). Perhaps for tree-based models, a candidate is "non-zero" if its predictions are used for a split in any tree - this requires some more thinking.
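One possible way to operationalize that for tree-based meta-learners (just a sketch - `{ranger}` stands in for whatever engine the stacking model uses, and the data here are fabricated for illustration): treat a candidate as "non-zero" if its impurity importance is positive, since impurity importance accumulates only over splits that actually use the variable.

```r
library(ranger)

# Toy stacking data: each column is a candidate's out-of-sample predictions
# (column names are made up for illustration)
set.seed(1)
stacking_data <- data.frame(
  rf_pred  = rnorm(200),
  xgb_pred = rnorm(200)
)
stacking_data$outcome <- stacking_data$rf_pred + rnorm(200, sd = 0.1)

meta_fit <- ranger(
  outcome ~ rf_pred + xgb_pred,
  data = stacking_data,
  importance = "impurity"
)

# A candidate counts as "non-zero" if it was used in at least one split,
# which positive impurity importance captures:
nonzero_members <- names(which(meta_fit$variable.importance > 0))
nonzero_members
```

Whether importance should be thresholded at exactly zero or at some small tolerance is part of the "more thinking" this needs.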