aimagelab / meshed-memory-transformer

Meshed-Memory Transformer for Image Captioning. CVPR 2020
BSD 3-Clause "New" or "Revised" License

Did this project use memory slots? #50

Closed. rakkaalhazimi closed this issue 3 years ago.

rakkaalhazimi commented 3 years ago

Hello, while trying to implement your model, I found that `ScaledDotProductAttentionMemory` is referenced in 0 files. Did you actually use the memory slots mentioned in the paper, or did I misunderstand and miss something? Thank you :D

https://github.com/aimagelab/meshed-memory-transformer/blob/e0fe3fae68091970407e82e5b907cbc423f25df2/models/transformer/attention.py#L69

marcellacornia commented 3 years ago

@rakkaalhazimi `ScaledDotProductAttentionMemory` is used in the encoder of our model.
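For context, the idea behind memory-augmented attention in the paper is to extend the set of keys and values with additional learnable "memory slot" vectors, so the encoder can attend to a priori knowledge that is not present in the input features. The sketch below illustrates that mechanism with NumPy; the function and variable names (`memory_attention`, `mem_k`, `mem_v`) are hypothetical and simplified (no multi-head split, no learned projections), not the repository's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def memory_attention(q, k, v, mem_k, mem_v):
    """Scaled dot-product attention with memory slots.

    q: (nq, d) queries; k, v: (nk, d) keys/values from the input;
    mem_k, mem_v: (m, d) learnable memory slots (here just arrays).
    """
    # Concatenate memory slots to the keys and values, so attention
    # is computed over nk + m positions instead of nk.
    k_aug = np.concatenate([k, mem_k], axis=0)
    v_aug = np.concatenate([v, mem_v], axis=0)
    d = q.shape[-1]
    att = softmax(q @ k_aug.T / np.sqrt(d))  # (nq, nk + m)
    return att @ v_aug                       # (nq, d)

rng = np.random.default_rng(0)
d, nq, nk, m = 8, 3, 5, 4
q = rng.standard_normal((nq, d))
k = rng.standard_normal((nk, d))
v = rng.standard_normal((nk, d))
mem_k = rng.standard_normal((m, d))  # would be learned parameters in the model
mem_v = rng.standard_normal((m, d))
out = memory_attention(q, k, v, mem_k, mem_v)
print(out.shape)  # (3, 8)
```

In the actual model these memory slots are `nn.Parameter` tensors inside `ScaledDotProductAttentionMemory`, which is why they only appear in the encoder's attention layers.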