Hello, I'd like to try implementing your model, but I found that `ScaledDotProductAttentionMemory` is referenced in 0 files. Did you actually use the memory slots mentioned in the paper, or am I misreading something? Thank you :D
https://github.com/aimagelab/meshed-memory-transformer/blob/e0fe3fae68091970407e82e5b907cbc423f25df2/models/transformer/attention.py#L69
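For context, here is my understanding of what the memory-augmented attention from the paper should do: learnable "memory slot" vectors, independent of the input, are concatenated to the projected keys and values before the scaled dot-product. This is only my own sketch based on reading the paper, not code from your repo (class and parameter names are mine):

```python
import torch
import torch.nn as nn


class MemoryAugmentedAttention(nn.Module):
    """Sketch of scaled dot-product attention with memory slots:
    n_mem learnable key/value vectors are appended to the keys and
    values computed from the input, so the attention can also attend
    to learned a-priori information (my reading of the paper)."""

    def __init__(self, d_model: int, n_mem: int):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # learnable memory slots, shared across the batch
        self.mem_k = nn.Parameter(torch.randn(1, n_mem, d_model) * d_model ** -0.5)
        self.mem_v = nn.Parameter(torch.randn(1, n_mem, d_model) * d_model ** -0.5)
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        b = x.size(0)
        q = self.q_proj(x)
        # append memory slots along the sequence dimension of keys/values
        k = torch.cat([self.k_proj(x), self.mem_k.expand(b, -1, -1)], dim=1)
        v = torch.cat([self.v_proj(x), self.mem_v.expand(b, -1, -1)], dim=1)
        att = torch.softmax(q @ k.transpose(1, 2) * self.scale, dim=-1)
        return att @ v  # (batch, seq_len, d_model)
```

If this matches what `ScaledDotProductAttentionMemory` is meant to implement, where in the model is it (or should it be) wired in?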