somepago / saint

The official PyTorch implementation of the paper "SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training"
Apache License 2.0

Plotting attention for explainability #16

Open isamgul opened 2 years ago

isamgul commented 2 years ago

Hello Gowthami,

Thank you for this project. It shows an uplift in performance over XGBoost for my use case. It would be a great help if you could share the attention-plotting code (both self-attention and intersample attention) for the SAINT implementation, as shown in the paper.
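In the meantime, a common way to extract attention maps from a PyTorch model without modifying its source is a forward hook on the attention modules. The sketch below is not the SAINT authors' code; it uses a toy block with `nn.MultiheadAttention` as a stand-in (the real module names and shapes in this repo will differ), captures the averaged attention weights during a forward pass, and leaves them ready to plot with e.g. `matplotlib.pyplot.imshow`. For intersample (row) attention you would hook the corresponding module, where the attention axis runs over samples in the batch rather than over features.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for one SAINT transformer block; adapt the hook
# target to the actual attention module names in the repo's model.
class ToyBlock(nn.Module):
    def __init__(self, dim=8, heads=2):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):
        # need_weights=True makes the module also return the attention map,
        # averaged over heads: shape (batch, seq_len, seq_len).
        out, _ = self.attn(x, x, x, need_weights=True)
        return out

captured = {}

def save_attn(name):
    # Forward hook: nn.MultiheadAttention returns (output, attn_weights),
    # so we can grab the weights from the output tuple without re-running.
    def hook(module, inputs, output):
        captured[name] = output[1].detach()
    return hook

model = ToyBlock()
model.attn.register_forward_hook(save_attn("self_attn"))

x = torch.randn(2, 5, 8)   # (batch, tokens/features, dim)
model(x)

# captured["self_attn"][0] is a (5, 5) attention map for the first sample;
# plot it with matplotlib.pyplot.imshow(...) for a heatmap.
print(captured["self_attn"].shape)
```

For per-head maps (instead of the head-averaged ones), `nn.MultiheadAttention` also accepts `average_attn_weights=False` in recent PyTorch versions.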