Cyanogenoid / pytorch-vqa

Strong baseline for visual question answering

Why does attention use '+' instead of '*'? #15

Closed sunqiang85 closed 6 years ago

sunqiang85 commented 6 years ago

In model.py, line 114, the forward method of the 'Attention' class computes x = self.relu(v + q). Why use '+' rather than '*'?

Cyanogenoid commented 6 years ago

Because the paper that this code is trying to implement did it that way.
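For context, here is a minimal sketch of what that fusion step looks like. The class name, layer names, and shapes below are illustrative assumptions, not the repo's exact Attention module: the question vector is broadcast over every spatial location of the image feature map and added element-wise before the ReLU; a multiplicative variant would simply replace v + q with v * q.

```python
import torch
import torch.nn as nn


class AdditiveFusionAttention(nn.Module):
    """Sketch of additive feature fusion for VQA attention (hypothetical names/shapes)."""

    def __init__(self, v_features, q_features, mid_features, glimpses):
        super().__init__()
        # project image features (spatial map) and question vector to a common size
        self.v_conv = nn.Conv2d(v_features, mid_features, 1, bias=False)
        self.q_lin = nn.Linear(q_features, mid_features)
        self.x_conv = nn.Conv2d(mid_features, glimpses, 1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, v, q):
        # v: [batch, v_features, H, W]; q: [batch, q_features]
        v = self.v_conv(v)
        q = self.q_lin(q)
        # broadcast the question vector over every spatial location
        q = q.unsqueeze(-1).unsqueeze(-1).expand_as(v)
        # additive fusion of image and question features;
        # a multiplicative variant would be self.relu(v * q)
        x = self.relu(v + q)
        return self.x_conv(x)  # attention logits, one map per glimpse
```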