lucidrains / alphafold2

To eventually become an unofficial Pytorch implementation / replication of Alphafold2, as details of the architecture get released
MIT License
1.54k stars 256 forks

The definition of row-wise attention and col-wise attention #85

Closed CiaoHe closed 3 years ago

CiaoHe commented 3 years ago

Hi, Luci: sorry, it's me again. I'm confused by the definitions of row-wise and col-wise attention here: https://github.com/lucidrains/alphafold2/blob/586792d738b9867d02b85a304c9b56091121b460/alphafold2_pytorch/alphafold2.py#L208-L209 https://github.com/lucidrains/alphafold2/blob/586792d738b9867d02b85a304c9b56091121b460/alphafold2_pytorch/alphafold2.py#L219-L220

As I understand it, the row-wise tensor w_x should be rearranged to (b h) w d, since once you fetch one row, that row should have w (width) units along its sequence axis.

So maybe the two definitions here need to be swapped?

lucidrains commented 3 years ago

@CiaoHe you are right, this was not clear. Fixed in https://github.com/lucidrains/alphafold2/releases/tag/0.4.13!