Closed: Renaissance25 closed this issue 3 years ago
My experiments are based on fairseq "1.0.0a0". The bug you hit is likely caused by `rain/layers/multihead_attention_patched.py`, line 23. I modified `MultiheadAttention` at lines 266-272 to keep attention working when the key is empty (this can occur during inference of CAAT). Removing the `with_incremental_state` decorator may solve this problem. Thanks a lot for reporting this; I'll fix the code soon.
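As a rough illustration of the behaviour the patch guards against, here is a minimal, self-contained sketch (not the actual fairseq code, and `attend` is a hypothetical toy function): during streaming/incremental inference the key sequence can be empty, and a plain softmax over zero scores would fail, so the guarded version returns a zero vector instead.

```python
import math

def attend(query, keys, values):
    """Toy dot-product attention over plain Python lists.

    query:  list[float]        -- a single query vector
    keys:   list[list[float]]  -- zero or more key vectors
    values: list[list[float]]  -- value vectors, one per key
    """
    # The guard: with no keys (possible mid-stream during incremental
    # decoding), return a zero vector rather than raising on an empty
    # softmax. This mirrors the idea of the patch described above.
    if not keys:
        return [0.0] * len(query)

    # Standard numerically-stable softmax over dot-product scores.
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]

    # Weighted sum of the value vectors.
    out = [0.0] * len(values[0])
    for w, v in zip(weights, values):
        for i, x in enumerate(v):
            out[i] += w * x
    return out
```

With an empty key list the function returns zeros; with a single key it returns that key's value, as expected of attention.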
Solved.
There is an error when running with the fairseq main-branch code. Can you tell me which fairseq version your code is based on? Thanks.