pyg-team / pytorch-frame

Tabular Deep Learning Library for PyTorch
https://pytorch-frame.readthedocs.io
MIT License

Feature Importance #282

Open xnuohz opened 6 months ago

xnuohz commented 6 months ago

Feature

Support feature importance in tabular data scenarios.

  1. Understand which features are beneficial for prediction, which helps with developing new features.
  2. Feature selection: remove features that do not help prediction.

Ideas

  1. GBDTs naturally expose APIs for computing feature importance, so this is easy to add (see the GBDT sketch below).
  2. NNs
    • Permutation importance: shuffle a single feature's values and observe the change in the metric; the larger the change, the more important the feature. Simple (see the sketch after this list).
    • SHAP: more complex.
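
For idea 1, a minimal sketch of reading built-in importances from a plain GBDT library (XGBoost here, with a toy scikit-learn dataset as a stand-in for the materialized tabular features; none of this is pytorch-frame API):

```python
import xgboost as xgb
from sklearn.datasets import load_breast_cancer

# Toy dataset; in practice this would be the materialized tabular features.
X, y = load_breast_cancer(return_X_y=True, as_frame=True)

model = xgb.XGBClassifier(n_estimators=100, max_depth=4)
model.fit(X, y)

# Built-in importances, aligned with the input columns.
for name, score in sorted(zip(X.columns, model.feature_importances_),
                          key=lambda t: t[1], reverse=True)[:10]:
    print(f"{name}: {score:.4f}")
```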
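
And a framework-agnostic sketch of permutation importance for idea 2. `predict_fn`, `metric_fn`, and the dense tensors are assumptions standing in for a trained pytorch-frame model and its evaluation loop; for a real model one would permute the corresponding column inside the TensorFrame instead:

```python
import torch

def permutation_importance(predict_fn, metric_fn, x, y, n_repeats=5, seed=0):
    """Metric drop after shuffling each column of ``x`` (larger drop = more important).

    Assumes ``predict_fn(x) -> predictions`` and ``metric_fn(y, preds) -> float``
    (higher is better); ``x`` is a dense [num_rows, num_cols] feature tensor.
    """
    gen = torch.Generator().manual_seed(seed)
    baseline = metric_fn(y, predict_fn(x))
    importances = torch.zeros(x.size(1))
    for col in range(x.size(1)):
        drops = []
        for _ in range(n_repeats):
            perm = torch.randperm(x.size(0), generator=gen)
            x_shuffled = x.clone()
            x_shuffled[:, col] = x[perm, col]  # break the feature/target link
            drops.append(baseline - metric_fn(y, predict_fn(x_shuffled)))
        importances[col] = torch.tensor(drops).mean()
    return importances

# Hypothetical usage with a trained classifier and accuracy as the metric:
# imps = permutation_importance(
#     lambda x: model(x).argmax(dim=-1),
#     lambda y, pred: (y == pred).float().mean().item(),
#     x_val, y_val)
```
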
yiweny commented 6 months ago

Mutual Information Sort is already added here. For feature sorting in NNs, I recommend you take a look at the ExcelFormer example. If you are interested in adding any feature-related functionality, you can add it to the transform module.
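
For reference, the underlying idea can be reproduced outside the library with scikit-learn's mutual information estimator; this is only an illustration of what a mutual-information-based ranking computes, not pytorch-frame's implementation:

```python
import pandas as pd
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import mutual_info_classif

X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# Estimate mutual information between each feature and the target,
# then sort columns from most to least informative.
mi = mutual_info_classif(X, y, random_state=0)
ranking = pd.Series(mi, index=X.columns).sort_values(ascending=False)
print(ranking.head())
```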

xnuohz commented 6 months ago

Thanks. As you mentioned, mutual information sorting and ExcelFormer improve performance through transforms. However, what I want to discuss is how much each feature contributes to the final prediction. For example, user behavioral features are important in recommender systems, so their feature importance should be high. pytorch-frame is easy to use: it lets me quickly obtain benchmark results on real-world datasets to decide whether NNs or GBDTs work better. I'm just unsure whether evaluating feature importance is worth integrating into pytorch-frame as a module.

zechengz commented 6 months ago

I think you can give Captum (https://captum.ai/) a try. cc @weihua916 can we also integrate this into PyT?

xnuohz commented 6 months ago

Yes, Captum implements many interpretability methods; Feature Permutation and SHAP are among them.
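
A rough sketch of what this could look like with Captum's FeaturePermutation. The model, the dense feature tensor, and the target handling here are placeholders, since a real pytorch-frame model takes a TensorFrame and would need a small wrapper around its forward pass:

```python
import torch
import torch.nn as nn
from captum.attr import FeaturePermutation

# Placeholder model standing in for a trained tabular NN.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
model.eval()

x = torch.randn(64, 10)  # [num_rows, num_features], dense for illustration

# FeaturePermutation shuffles one feature at a time across the batch and
# reports the resulting change in the model output for the given target class.
explainer = FeaturePermutation(model)
attr = explainer.attribute(x, target=1)

# Average absolute attribution per feature as a simple importance score.
importance = attr.abs().mean(dim=0)
print(importance)
```
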

February24-Lee commented 2 months ago

Is there any update or roadmap related to this? 👀