Aha, found it: `RecurrentAttentionLayer` has no `memory_lengths` argument
https://github.com/idiap/fast-transformers/blob/ea9cb3b1751f0b2f1e661b087f3dd3ec8a413ab0/fast_transformers/recurrent/attention/self_attention/attention_layer.py#L54
but `RecurrentTransformerDecoderLayer` tries to pass it:
https://github.com/idiap/fast-transformers/blob/8acb570071926605da1d7f22d4c1239be2d80b55/fast_transformers/recurrent/transformers.py#L215
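To spell out the failure mode (with purely hypothetical function names, not the library's actual signatures): forwarding a keyword argument that the callee's `forward` does not declare raises a plain `TypeError`:

```python
# Hypothetical stand-ins for the two forward() methods linked above, only to
# show the kind of mismatch: the callee does not declare memory_lengths.
def attention_forward(query, key, value, state=None):
    return query, state

def decoder_layer_forward(x, memory_lengths=None, state=None):
    # Forwarding memory_lengths to a callee that never declares it fails.
    return attention_forward(x, x, x, state=state, memory_lengths=memory_lengths)

try:
    decoder_layer_forward("x")
except TypeError as e:
    print(e)  # attention_forward() got an unexpected keyword argument 'memory_lengths'
```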
Never mind, I didn't realize there is also a `RecurrentCrossAttentionLayer`. Kind of complicated, honestly.
Sorry to bother you, I can't tell whether it's me or a bug in the library. I'm doing sampling like this:
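Roughly, it is a step-by-step decoding loop like the sketch below (not the exact Colab code, which is linked underneath; the `RecurrentDecoderBuilder` name, its keyword arguments, and the `(output, state)` return convention are assumptions about the recurrent API):

```python
# Minimal sketch of step-by-step decoding with the recurrent decoder.
# Names, kwargs, and shapes below are assumptions, not taken from the Colab.
import torch
from fast_transformers.builders import RecurrentDecoderBuilder

d_model, n_heads = 512, 8
decoder = RecurrentDecoderBuilder.from_kwargs(
    n_layers=4,
    n_heads=n_heads,
    query_dimensions=d_model // n_heads,
    value_dimensions=d_model // n_heads,
    feed_forward_dimensions=4 * d_model,
).get()

memory = torch.randn(1, 100, d_model)   # encoder output, shape (N, S, d_model)
x_t = torch.randn(1, d_model)           # current decoder input, shape (N, d_model)
state = None

# Feed one time step per call; the returned state carries the recurrent
# attention state across steps.
for _ in range(10):
    # Presumably this call is where the memory_lengths mismatch above surfaces.
    y_t, state = decoder(x_t, memory, state=state)
    x_t = y_t  # in real sampling, embed the sampled token here instead
```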
Whole code and trace: https://colab.research.google.com/drive/1mYTh4MO_Tg6LBrhhVQUd81R92UNE56F7?usp=sharing