Closed ncherrier closed 3 years ago
For your first question, they say that instance-level weights are on their backlog: #54
Not sure whether this also covers class-level weights, which may be more commonly used.
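As an aside, class-level weights can usually be emulated with instance-level weights by assigning every sample the weight of its class. A minimal numpy sketch, with purely illustrative labels and weight values (not interpret's API):

```python
import numpy as np

# Hypothetical labels and per-class weights (illustrative values only)
y = np.array([0, 1, 1, 0, 1])
class_weight = {0: 1.0, 1: 2.5}

# Map each sample's label to its class weight -> one weight per sample
sample_weight = np.array([class_weight[label] for label in y])
print(sample_weight)  # [1.  2.5 2.5 1.  2.5]
```

So any estimator that accepts per-sample weights can also support class weighting this way.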
Hi @ncherrier,
@peractio is correct, instance level weights are something we're actively working on :). We can look into adding support for both sample_weights and class_weights if that would be useful!
And a last unrelated question: will you consider the option of adding alternatives for fitting GAMs (other than boosting, for instance using splines)?
Good question! This has come up a few times, and we've been prototyping a few different GAM fitting methods we could potentially introduce into the package (ex: spline fitting). It's good to hear that it would be useful for you too -- we'll appropriately reprioritize!
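For readers curious what spline-based GAM fitting looks like, here is a toy numpy-only sketch of fitting a single shape function by least squares over a piecewise-linear ("hinge") spline basis. This is not interpret's implementation; the data, knot placement, and basis choice are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, x.size)

# Truncated-linear spline basis: f(x) = b0 + b1*x + sum_k c_k * max(0, x - t_k)
knots = np.linspace(0.1, 0.9, 9)
B = np.column_stack([np.ones_like(x), x] + [np.maximum(0, x - t) for t in knots])

# Least-squares fit of the spline coefficients
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
fitted = B @ coef

mse = np.mean((fitted - y) ** 2)
print(mse)  # should be close to the noise variance (~0.01)
```

A full spline GAM repeats this per feature and fits all shape functions jointly (typically with a smoothness penalty), whereas EBMs learn the shape functions by cyclic gradient boosting.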
-InterpretML Team
Great, thank you for the answers!
Thanks for the great package. Is there any update on this particular enhancement? Sample weights are a key feature of many models, and I would very much like to try InterpretML, but have been waiting for sample_weight support to be added.
Thanks.
Second the ask about adding instance/sample weights. Any update?
Hi @interpret-ml
I'm really enjoying exploring your package, great work! I see that this issue hasn't progressed since 2019, although I do see sample_weight arg incorporated into the BaseEBM.fit method in the develop branch. What's holding it back from being pushed to master?
Thanks,
Andrew
Sample weights should be pushed to master and released on PyPI imminently. We're currently testing this feature, along with others, prior to the release. As you noticed, the changes are already in the develop branch and should work for you if you want to build from source as described in issue https://github.com/interpretml/interpret/issues/237
-InterpretML team
To update this thread: the latest release of interpret (0.2.5) now has support for sample weights in ExplainableBoostingMachines.
You can pass positive floating-point weights to the new sample_weight parameter of the ebm.fit() call. sample_weight should have exactly the same shape and dimension as y -- one weight per sample. Here's a quick usage example:
from interpret.glassbox import ExplainableBoostingRegressor

ebm = ExplainableBoostingRegressor()
# w holds one positive weight per sample, with the same length as y
ebm.fit(X, y, sample_weight=w)
You can also see more in our documentation: https://interpret.ml/docs/ebm.html#explainableboostingclassifier
To upgrade interpret using pip: pip install -U interpret
Let us know if you run into any issues! -InterpretML Team
Hi!
First, thanks for your amazing work. Would it be possible to take sample weights into account, similar to what several sklearn models support? And a last unrelated question: will you consider adding alternatives for fitting GAMs (other than boosting, for instance using splines)?
Thanks,