dreamquark-ai / tabnet

PyTorch implementation of the TabNet paper: https://arxiv.org/pdf/1908.07442.pdf
https://dreamquark-ai.github.io/tabnet/
MIT License

Making the computation of feature importance optional (see Issue #493) #494

Closed CesarLeblanc closed 1 year ago

CesarLeblanc commented 1 year ago

What kind of change does this PR introduce?

A new feature: the computation of feature importance can now be disabled by fitting your model with the parameter compute_importance=False
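The pattern behind the flag can be sketched with a toy estimator (purely illustrative; `ToyModel` and its uniform-importance placeholder are not part of the library, they only mimic skipping the importance pass at the end of `fit`):

```python
# Illustrative sketch: an optional compute_importance flag that skips
# the (potentially expensive) feature-importance pass after training.
class ToyModel:
    def __init__(self):
        self.feature_importances_ = None

    def _compute_feature_importances(self, X):
        # Placeholder for the real per-feature aggregation; here we
        # just return a uniform distribution over features.
        n_features = len(X[0])
        return [1.0 / n_features] * n_features

    def fit(self, X, y, compute_importance=True):
        # ... training loop would go here ...
        if compute_importance:
            self.feature_importances_ = self._compute_feature_importances(X)
        return self

model = ToyModel().fit([[0, 1], [1, 0]], [0, 1], compute_importance=False)
assert model.feature_importances_ is None  # importance pass was skipped
```

With the default `compute_importance=True`, `feature_importances_` is populated as before, so existing code is unaffected.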

Does this PR introduce a breaking change?

Not really; it only gives users more flexibility and control over the training process of their models.

What needs to be documented once your changes are merged?

I already modified the README.md file by adding a line at the end to describe the new parameter of the fit method.

Optimox commented 1 year ago

Thank you very much for your contribution! This looks good!

However, I'd like the compute_importance=False behavior to be exercised somewhere in the notebooks CI.

Maybe you could add, in this notebook https://github.com/dreamquark-ai/tabnet/blob/develop/census_example.ipynb, in the cell starting with

```python
# This illustrates the warm_start=False behaviour
save_history = []
for _ in range(2):
    clf.fit(..
```

a check where fit is launched once with and once without compute_importance (False first, then True, so the behavior after this cell is unchanged).
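The suggested check can be sketched as follows (`StubClassifier` stands in for the real classifier in the notebook; it is a hypothetical name used only to make the sketch self-contained):

```python
# Sketch of the suggested notebook check: fit once without and once
# with importance computation, False first so the cell ends in the
# default state, and verify importances only appear in the second run.
class StubClassifier:
    feature_importances_ = None

    def fit(self, X, y, compute_importance=True):
        if compute_importance:
            self.feature_importances_ = [1.0 / len(X[0])] * len(X[0])

clf = StubClassifier()
for compute_importance in (False, True):
    clf.fit([[0, 1]], [0], compute_importance=compute_importance)
    # importances should exist exactly when compute_importance was True
    assert (clf.feature_importances_ is not None) == compute_importance
```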

Also, could you please squash your commits into one single commit named feat/493-optional-importance, so that the commit refers to issue #493.
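One way to do the squash (a sketch, not the only workflow; `develop` as the base branch is an assumption based on the repo layout):

```shell
# Sketch: collapse all commits on the feature branch into one.
# "develop" is assumed to be the branch the PR targets.
git reset --soft "$(git merge-base develop HEAD)"   # keep all changes staged
git commit -m "feat/493-optional-importance"        # one squashed commit
```

Afterwards, `git push --force-with-lease` updates the PR branch without clobbering any commits you don't have locally.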

I will also need to check that feature_importance is never called anywhere else in the code.