ogunlao / saint

Unofficial Pytorch implementation of SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pretraining https://arxiv.org/abs/2106.01342
MIT License

Update saint_i.py #5

Open jackwilkie opened 1 year ago

jackwilkie commented 1 year ago

Intersample attention now inherits from nn.MultiheadAttention, which includes optimizations such as flash attention. A rough sketch of the idea follows below.
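
The sketch below shows one way such a subclass might look: it is not the exact code in this PR, and the class name `InterSampleAttention`, the constructor arguments, and the reshape-based intersample attention (rows of a batch attending to each other, as described in the SAINT paper) are assumptions for illustration. Subclassing nn.MultiheadAttention lets PyTorch's fused scaled-dot-product / flash-attention kernels be used where available (e.g. when `need_weights=False`).

```python
import torch
import torch.nn as nn


class InterSampleAttention(nn.MultiheadAttention):
    """Intersample attention: each row of the batch attends to the other rows.

    Inherits from nn.MultiheadAttention so PyTorch's optimized attention
    kernels are used. Illustrative sketch only, not the PR's exact code.
    """

    def __init__(self, dim, n_features, num_heads=8, dropout=0.0):
        # Each sample is flattened into one token of size n_features * dim,
        # so attention runs across the batch dimension instead of the features.
        super().__init__(
            embed_dim=n_features * dim,
            num_heads=num_heads,
            dropout=dropout,
            batch_first=True,
        )
        self.dim = dim
        self.n_features = n_features

    def forward(self, x):
        # x: (batch, n_features, dim) -> (1, batch, n_features * dim)
        b, n, d = x.shape
        x = x.reshape(1, b, n * d)
        # need_weights=False keeps the fast (fused) attention path eligible
        out, _ = super().forward(x, x, x, need_weights=False)
        # restore (batch, n_features, dim)
        return out.reshape(b, n, d)


if __name__ == "__main__":
    attn = InterSampleAttention(dim=32, n_features=10, num_heads=8)
    x = torch.randn(16, 10, 32)  # 16 rows, 10 features, 32-dim embeddings
    print(attn(x).shape)         # torch.Size([16, 10, 32])
```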