lucidrains / x-transformers

A concise but complete full-attention transformer with a set of promising experimental features from various papers
MIT License

AlibiPositionalBias: slicing buffered bias #152

Closed antony-frolov closed 1 year ago

antony-frolov commented 1 year ago

https://github.com/lucidrains/x-transformers/blame/b4756498b307b0f173e329312f79f698315d595d/x_transformers/x_transformers.py#LL361C59-L361C59

Hi! Shouldn't we check the dimensions for both i and j here, rather than just one?

lucidrains commented 1 year ago

@antony-frolov hey Antony! you are right, it should have an extra check on the other dimension as well
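To illustrate the bug being discussed: if the cached ALiBi bias buffer is only checked along the key dimension (j) before slicing, a request with a larger query dimension (i) can slice a buffer that is too small along that axis. Below is a minimal NumPy sketch (not the actual x-transformers implementation; the class and method names are hypothetical) showing the fix of checking both dimensions before reusing the buffer:

```python
import numpy as np

def alibi_slopes(heads):
    # geometric sequence of head slopes, as in the ALiBi paper
    # (exact for power-of-two head counts)
    start = 2 ** (-8 / heads)
    return np.array([start ** (k + 1) for k in range(heads)])

class AlibiBiasCache:
    """Hypothetical sketch of buffered-bias slicing: reuse the cached
    bias only when it is large enough along BOTH the query (i) and
    key (j) dimensions."""

    def __init__(self, heads):
        self.slopes = alibi_slopes(heads)[:, None, None]  # (h, 1, 1)
        self.bias = None

    def get_bias(self, i, j):
        # the reported bug: checking only `self.bias.shape[-1] >= j`;
        # the fix adds the same check on the i dimension
        if (self.bias is not None
                and self.bias.shape[-1] >= j
                and self.bias.shape[-2] >= i):
            return self.bias[..., :i, :j]
        # (re)build the buffer: bias = -|relative distance| * slope
        rel = -np.abs(np.arange(j)[None, :] - np.arange(i)[:, None])  # (i, j)
        self.bias = self.slopes * rel[None, :, :]  # (h, i, j)
        return self.bias
```

With only the j-dimension check, a cache built for a short query length would be incorrectly sliced when a longer query arrives; with both checks, that case triggers a rebuild instead.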