harvardnlp / seq2seq-attn

Sequence-to-sequence model with LSTM encoder/decoders and attention
http://nlp.seas.harvard.edu/code
MIT License

error loading module 's2sa.models' ... ambiguous syntax (function call x new statement) #54

Closed: ili3p closed this issue 8 years ago

ili3p commented 8 years ago

I get this error when running the model:

 ...allations/torch/install/share/lua/5.1/trepl/init.lua:384: error loading module 's2sa.models' from file './s2sa/models.lua':
        ./s2sa/models.lua:85: ambiguous syntax (function call x new statement) near '('
stack traceback:
        [C]: in function 'error'
        ...allations/torch/install/share/lua/5.1/trepl/init.lua:384: in function 'require'
        train.lua:6: in main chunk
        [C]: in function 'dofile'
        .../torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:145: in main chunk
        [C]: ?

The line in question is:

   x = nn.JoinTable(2):usePrealloc("dec_inputfeed_join",
                                   {{opt.max_batch_l, opt.word_vec_size}, {opt.max_batch_l, opt.rnn_size}})
                                  ({x, inputs[1+offset]}) -- batch_size x (word_vec_size + rnn_size)

The same issue is present in a couple of places in s2sa.models. The root cause is that usePrealloc returns self, and the returned module is then called as a function on the following line, which is where Lua 5.1 sees the ambiguity. Changing this to:

   x = nn.JoinTable(2):usePrealloc("dec_inputfeed_join", {{opt.max_batch_l, opt.word_vec_size}, {opt.max_batch_l, opt.rnn_size}})
   x = x({x, inputs[1+offset]}) -- batch_size x (word_vec_size + rnn_size)

creates other issues later on: after the first assignment, x refers to the JoinTable module itself, so the table passed on the second line no longer contains the previous graph node.
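
A sketch of a variant that would avoid the shadowing (same names as above, untested against the rest of the preallocation logic): bind the preallocated module to its own local before calling it. The continuation line starts with `{{`, not `(`, so Lua 5.1 parses it without complaint.

   -- Give the preallocated JoinTable its own name, then call it, so the
   -- graph node `x` is never overwritten by the module:
   local join = nn.JoinTable(2):usePrealloc("dec_inputfeed_join",
       {{opt.max_batch_l, opt.word_vec_size}, {opt.max_batch_l, opt.rnn_size}})
   x = join({x, inputs[1+offset]}) -- batch_size x (word_vec_size + rnn_size)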

ili3p commented 8 years ago

The issue is only present in Lua 5.1. The check exists to prevent likely bugs from function calls that spread across lines, as in the case above: the parser cannot tell whether an opening '(' at the start of a new line begins a call on the previous expression or a new statement. The fix is to remove the newline characters so that each chained call sits on a single line. http://lua.2524044.n2.nabble.com/Ambiguous-syntax-td7644848.html#d1355301287000-706
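
For the line quoted above, that means collapsing the statement into one line, e.g.:

   -- With the newlines removed, Lua 5.1 parses the chained call unambiguously:
   x = nn.JoinTable(2):usePrealloc("dec_inputfeed_join", {{opt.max_batch_l, opt.word_vec_size}, {opt.max_batch_l, opt.rnn_size}})({x, inputs[1+offset]}) -- batch_size x (word_vec_size + rnn_size)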

I have to say, I agree with Lua 5.1. The code in s2sa/models.lua is very hard to understand.

yoonkim commented 8 years ago

We plan on refactoring/cleaning up the code soon. Please bear with us in the meantime :)