interpretml / interpret

Fit interpretable models. Explain blackbox machine learning.
https://interpret.ml/docs
MIT License

Feature importance using Permutation #26

Open npatta01 opened 5 years ago

npatta01 commented 5 years ago

Hi! At work, we use the "Permutation Importance" method to inspect feature importance, via the excellent eli5 library.

Would it be possible to include a version of that in this library?
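For context, the method being requested is straightforward to sketch: shuffle one feature column at a time and measure how much the model's score drops. The snippet below is a minimal illustration of that idea (not eli5's or interpret's actual implementation), using scikit-learn's iris data and a logistic regression as stand-in model:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)
baseline = accuracy_score(y, model.predict(X))

rng = np.random.default_rng(0)
importances = []
for j in range(X.shape[1]):
    X_perm = X.copy()
    # Shuffling column j breaks its link to the target while
    # keeping its marginal distribution intact.
    X_perm[:, j] = rng.permutation(X_perm[:, j])
    importances.append(baseline - accuracy_score(y, model.predict(X_perm)))

# A larger score drop means the model relied more on that feature.
print(importances)
```

In practice the shuffle is repeated several times per feature and the drops are averaged, and the score is computed on held-out data rather than the training set.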

Sandy4321 commented 5 years ago

good idea, eli5 is great

interpret-ml commented 5 years ago

Thanks for bringing this up! Permutation Importance is definitely a method we've been thinking about. The main challenge is finding the right way to cleanly integrate it into our API. We understand the demand for this feature though, so we'll keep this issue open until we make progress on this.

imatiach-msft commented 4 years ago

@npatta01 @Sandy4321 Permutation feature importance is available in the interpret-community package, an extension of interpret: https://github.com/interpretml/interpret-community/blob/master/python/interpret_community/permutation/permutation_importance.py
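As a related note for readers landing here: scikit-learn (0.22+) also ships a built-in permutation importance utility, `sklearn.inspection.permutation_importance`, which covers the same use case as the eli5 approach mentioned above. A small sketch:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

X, y = load_iris(return_X_y=True)
model = RandomForestClassifier(random_state=0).fit(X, y)

# n_repeats controls how many shuffles are averaged per feature;
# importances_mean holds the average score drop per feature.
result = permutation_importance(model, X, y, n_repeats=5, random_state=0)
print(result.importances_mean)
```

For a fair estimate, pass a held-out `X`/`y` rather than the training set, since training-set importances can overstate features the model overfits to.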