yandex-research / rtdl

Research on Tabular Deep Learning: Papers & Packages
Apache License 2.0

How to get feature importance scores or attention heatmap #56

Closed Yuntian9708 closed 1 year ago

Yuntian9708 commented 1 year ago

Hi, I am trying to get some interpretable visualizations from FT-Transformer, such as feature importance scores or attention heatmaps. I found some discussion of feature importance in Section 5.3 of the paper, but I don't know how to implement it. Is there a way to achieve this based on the source code you published? Or could you provide an example implementation? Thank you very much!
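(Editor's note: Section 5.3 of the paper describes obtaining feature importances by averaging the attention that the [CLS] token pays to each feature token, over samples and heads. The sketch below shows only that averaging step, assuming the per-layer attention maps have already been collected, e.g. via forward hooks; the array names and shapes are illustrative, not part of the rtdl API.)

```python
import numpy as np

# Hypothetical attention maps collected from one Transformer layer:
# shape (n_samples, n_heads, n_tokens, n_tokens), where token 0 is [CLS]
# and tokens 1..n_tokens-1 are the feature tokens.
rng = np.random.default_rng(0)
n_samples, n_heads, n_tokens = 16, 8, 5  # 4 features + [CLS]
attn = rng.random((n_samples, n_heads, n_tokens, n_tokens))
attn /= attn.sum(-1, keepdims=True)  # each row is a softmax distribution


def cls_attention_importance(attention_maps: np.ndarray) -> np.ndarray:
    """Average the [CLS] token's attention to each feature token over
    samples and heads (the averaging described in Section 5.3)."""
    cls_to_features = attention_maps[:, :, 0, 1:]  # [CLS] row, feature columns
    return cls_to_features.mean(axis=(0, 1))       # -> (n_features,)


importance = cls_attention_importance(attn)
print(importance.shape)  # one score per feature: (4,)
```

With several layers, the same averaging is typically extended over the per-layer maps as well; higher scores indicate features the [CLS] token attends to more.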

Yura52 commented 1 year ago

Please see this issue

P.S. The implementation of the paper is now located here: https://github.com/yandex-research/tabular-dl-revisiting-models

Yura52 commented 1 year ago

If you need further help, feel free to create a new issue in that repository or continue the discussion in the issue I mentioned