I have the same issue with my version of Torch. The problem comes from nn.maskSoftMax at line 105 of attention.lua.
I am currently investigating...
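To make the failure concrete, here is a minimal repro sketch with plain nn (my own illustration, not the repo's code): a batched input paired with a gradOutput whose batch dimension has been collapsed triggers exactly this THNN error.

```lua
-- Minimal repro sketch (illustrative, not from the repo): the SoftMax
-- backward aborts when gradOutput has lost the batch dimension that the
-- input still carries.
require 'nn'

local sm = nn.SoftMax()
local input = torch.rand(20, 26)   -- a batch of 20 rows, 26 attention positions
sm:forward(input)
local gradOutput = torch.rand(26)  -- batch dimension collapsed somewhere upstream

-- Fails with: "input and gradOutput have different number of elements:
-- input[20 x 26] has 520 elements, while gradOutput[26] has 26 elements"
local ok, err = pcall(sm.backward, sm, input, gradOutput)
print(ok, err)
```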
I may have fixed it: https://github.com/jiasenlu/HieCoAttenVQA/pull/21/files
I am currently training the model. Let's see :)
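For anyone skimming: the real change is in the PR diff linked above, but the gist is guarding against a shape mismatch before the softmax backward call. A minimal illustrative sketch (the function name and guard are mine, not the repo's code):

```lua
-- Illustrative guard only -- see the PR diff for the actual change.
-- Before calling SoftMax's backward, restore a batch dimension that some
-- upstream view/squeeze may have dropped, and fail with a readable message
-- when the element counts genuinely disagree.
require 'nn'

local function safeSoftMaxBackward(sm, input, gradOutput)
   if not gradOutput:isSameSizeAs(input) then
      assert(gradOutput:nElement() == input:nElement(),
         ('cannot reconcile shapes: input has %d elements, gradOutput has %d')
            :format(input:nElement(), gradOutput:nElement()))
      gradOutput = gradOutput:viewAs(input)  -- e.g. a [1 x 26] squeezed to [26]
   end
   return sm:backward(input, gradOutput)
end

-- usage: with matching shapes this behaves exactly like sm:backward
local sm = nn.SoftMax()
local input = torch.rand(20, 26)
sm:forward(input)
print(safeSoftMaxBackward(sm, input, torch.rand(20, 26)):size())
```

With the shapes from the error in this thread (520 vs 26 elements), the assert fires with a readable message instead of aborting deep inside the CUDA kernel.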
Thanks @Cadene! I can train now too. I guess it was a batch-size problem.
I also followed the steps in the README and got this error. It looks similar to the closed issue, but I'm using the most recent version of Torch. Can someone help me solve this problem? By the way, this happens at the train.lua step.

---------------------------------------error--------------------------------
constructing clones inside the ques_level
total number of parameters in recursive_attention: 2862056
/home/user/torch/install/bin/luajit: /home/user/torch/install/share/lua/5.1/nn/THNN.lua:110: input and gradOutput have different number of elements: input[20 x 26] has 520 elements, while gradOutput[26] has 26 elements at /home/user/torch/extra/cunn/lib/THCUNN/generic/SoftMax.cu:84
stack traceback:
        [C]: in function 'v'
        /home/user/torch/install/share/lua/5.1/nn/THNN.lua:110: in function 'SoftMax_updateGradInput'
        ./misc/maskSoftmax.lua:33: in function 'updateGradInput'
        .../user/torch/install/share/lua/5.1/nngraph/gmodule.lua:420: in function 'neteval'
        .../user/torch/install/share/lua/5.1/nngraph/gmodule.lua:454: in function 'updateGradInput'
        /home/user/torch/install/share/lua/5.1/nn/Module.lua:31: in function 'backward'
        ./misc/ques_level.lua:143: in function 'updateGradInput'
        /home/user/torch/install/share/lua/5.1/nn/Module.lua:31: in function 'backward'
        train.lua:274: in function 'lossFun'
        train.lua:313: in main chunk
        [C]: in function 'dofile'
        ...usr/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:145: in main chunk
        [C]: at 0x00406670
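From the trace, the failing call is at ./misc/maskSoftmax.lua:33. For what it's worth, a couple of debug prints just above that call (hypothetical lines, not in the repo) should confirm which side lost its batch dimension:

```lua
-- Hypothetical debug lines (not in the repo), placed just before the THNN
-- call in updateGradInput, to show the shapes involved in the abort above:
print('maskSoftmax input size:      ', input:size())       -- 20 x 26 here
print('maskSoftmax gradOutput size: ', gradOutput:size())  -- just 26 here
```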