ogunlao / saint

Unofficial PyTorch implementation of SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pretraining https://arxiv.org/abs/2106.01342
MIT License

Update saint_i.py #4

Closed jackwilkie closed 1 year ago

jackwilkie commented 1 year ago

Intersample attention is now a wrapper around nn.MultiheadAttention to leverage FlashAttention and other efficiency improvements.
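The PR diff itself is not shown here, so as a minimal sketch of the idea: SAINT's intersample attention attends across rows (samples) of a batch rather than across features, and that can be expressed by reshaping the batch into a single sequence of sample-tokens and delegating to `nn.MultiheadAttention` (which dispatches to fused/FlashAttention kernels on recent PyTorch builds). The class name, constructor arguments, and reshape convention below are assumptions for illustration, not the PR's actual code.

```python
import torch
import torch.nn as nn

class IntersampleAttention(nn.Module):
    """Hypothetical sketch: intersample attention as a wrapper
    around nn.MultiheadAttention.

    Input of shape (batch, n_features, dim) is flattened per sample
    and treated as one sequence of length `batch`, so attention mixes
    information across samples instead of across features.
    """

    def __init__(self, dim: int, n_features: int, heads: int = 8):
        super().__init__()
        # embed_dim must be divisible by heads: here 16 * 10 = 160, 160 / 8 = 20
        self.attn = nn.MultiheadAttention(dim * n_features, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, n, d = x.shape
        # (b, n, d) -> (1, b, n*d): each "token" is one flattened sample
        x = x.reshape(1, b, n * d)
        out, _ = self.attn(x, x, x, need_weights=False)
        return out.reshape(b, n, d)

x = torch.randn(4, 10, 16)                                  # 4 samples, 10 features, dim 16
attn = IntersampleAttention(dim=16, n_features=10, heads=8)
y = attn(x)                                                  # same shape as x
```

Delegating to `nn.MultiheadAttention` rather than hand-rolling the attention math means the module automatically benefits from PyTorch's fused scaled-dot-product kernels without any changes to the calling code.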