ml-jku / DeepRC

DeepRC: Immune repertoire classification with attention-based deep massive multiple instance learning

Where is the self-attention operation #5

Closed fteufel closed 2 years ago

fteufel commented 2 years ago

Hi,

I was just going through the code in architectures.py and the paper side by side. I can't seem to find the query*key operation in the code. As I understand it, this should happen in AttentionNetwork. From what I see, this is the "attention SNN" from Figure 2 in the paper, followed by a linear layer that computes the attention weights directly from the keys?

Please let me know if I misunderstood something here; from the paper I assumed that a query*key operation must be performed there.

fteufel commented 2 years ago

Never mind, I missed that the query vector was a learnable weight ...
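To make the resolution concrete: when the query is a single learnable vector (rather than being computed from the input), the query*key dot products are exactly what a linear layer with one output unit computes over the keys, which is why no explicit query*key operation appears in the code. A minimal NumPy sketch of this attention-pooling pattern (all names here are illustrative, not taken from architectures.py):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)

# A "bag" of N instance embeddings (e.g. per-sequence features), d-dimensional.
N, d = 5, 8
keys = rng.standard_normal((N, d))  # output of the attention network, one key per instance

# The query is a learnable parameter vector (random init here); applying it to
# each key is the same as a linear layer with a single output unit.
query = rng.standard_normal(d)

scores = keys @ query     # query*key: one scalar score per instance, shape (N,)
attn = softmax(scores)    # attention weights over the bag, sum to 1

# Attention-weighted sum pools the bag into one fixed-size representation.
pooled = attn @ keys      # shape (d,)

assert np.isclose(attn.sum(), 1.0)
assert pooled.shape == (d,)
```

So `scores = keys @ query` and `scores = Linear(d, 1)(keys)` are the same computation; the query simply lives inside the linear layer's weight matrix.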