NareshPS / doi-ml

ML Projects

Attention: Multi-level input processing. #14

Open NareshPS opened 1 year ago

NareshPS commented 1 year ago

Split the input sequence into blocks. Each block is processed by two self-attentions. The blocks themselves are transformed into a second-level sequence, which is also processed by two self-attentions; one self-attention is shared between the two levels.
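A minimal NumPy sketch of one possible reading of this idea. Everything here is an assumption made for illustration: the function names, the use of mean-pooling to turn each block into a second-level token, and the choice of which self-attention (A) is the one shared between the block level and the sequence level.

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a (seq_len, d) array."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(q.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def make_params(d, rng):
    # One (Wq, Wk, Wv) triple per self-attention module.
    return tuple(rng.standard_normal((d, d)) * 0.1 for _ in range(3))

def two_level_attention(x, block_size, attn_a, attn_b, attn_c):
    """Hypothetical sketch of the issue's design:
    - split the (seq_len, d) input into blocks,
    - run two self-attentions (A, then B) inside each block,
    - mean-pool each block into one second-level token,
    - run two self-attentions on the second-level sequence,
      reusing A so that A is shared between the levels."""
    seq_len, d = x.shape
    assert seq_len % block_size == 0, "sequence must divide evenly into blocks"
    blocks = x.reshape(seq_len // block_size, block_size, d)
    # Level 1: two self-attentions applied within each block.
    level1 = np.stack(
        [self_attention(self_attention(b, *attn_a), *attn_b) for b in blocks]
    )
    # Second-level sequence: one pooled vector per block (assumed mean-pooling).
    level2_in = level1.mean(axis=1)                       # (n_blocks, d)
    # Level 2: shared attention A, then a level-specific attention C.
    level2_out = self_attention(self_attention(level2_in, *attn_a), *attn_c)
    return level1.reshape(seq_len, d), level2_out

rng = np.random.default_rng(0)
d, seq_len, block_size = 8, 16, 4
x = rng.standard_normal((seq_len, d))
attn_a, attn_b, attn_c = (make_params(d, rng) for _ in range(3))
tokens, block_summary = two_level_attention(x, block_size, attn_a, attn_b, attn_c)
print(tokens.shape, block_summary.shape)   # per-token and per-block outputs
```

Note that sharing A ties the two levels' parameters together, so gradients from the block level and the sequence level would both flow into the same weights in a trained version of this sketch.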
