torch / nn


Error in makeInputContiguous() function of LookupTable! #1182

Closed jKhurana closed 7 years ago

jKhurana commented 7 years ago

I am creating an encoder. Below is my code:

require 'nn'

-- basic lstm cell
-- inputs: x, prev_c (previous cell state), prev_h (previous hidden state)
function lstm(x, prev_c, prev_h)
  -- Calculate all four gates in one go
  local i2h = nn.Linear(opt.rnn_size, 4 * opt.rnn_size)(x)
  local h2h = nn.Linear(opt.rnn_size, 4 * opt.rnn_size)(prev_h)
  local gates = nn.CAddTable()({i2h, h2h})

  -- Reshape to (batch_size, n_gates, hid_size)
  -- Then slice the n_gates dimension
  local reshaped_gates =  nn.Reshape(4, opt.rnn_size)(gates)
  local sliced_gates = nn.SplitTable(2)(reshaped_gates)

  -- Use select gate to fetch each gate and apply nonlinearity
  local in_gate          = nn.Sigmoid()(nn.SelectTable(1)(sliced_gates)):annotate{name='in_gate'}
  local in_transform     = nn.Tanh()(nn.SelectTable(2)(sliced_gates)):annotate{name='in_transform'}
  local forget_gate      = nn.Sigmoid()(nn.SelectTable(3)(sliced_gates)):annotate{name='forget_gate'}
  local out_gate         = nn.Sigmoid()(nn.SelectTable(4)(sliced_gates)):annotate{name='out_gate'}

  local next_c           = nn.CAddTable()({
      nn.CMulTable()({forget_gate, prev_c}):annotate{name='next_c_1'},
      nn.CMulTable()({in_gate, in_transform}):annotate{name='next_c_2'}
  })
  local next_h           = nn.CMulTable()({out_gate, nn.Tanh()(next_c)}):annotate{name='next_h'}

  return next_c, next_h
end

function create_singlelayer_lookup_encoder(w_size,opt)
  -- input nngraph nodes
  x = nn.Identity()()
  prev_s = nn.Identity()()

  -- make lookup table
  local x_in = nn.LookupTable(w_size,opt.rnn_size)(x):annotate{name="enc_lookup"}
  local next_s = {}
  local splitted = {prev_s:split(2)}
  local next_c,next_h = lstm(x_in,splitted[1],splitted[2])
  --print(type(next_h))
  table.insert(next_s,next_c)
  table.insert(next_s,next_h)

  local m = nn.gModule({x, prev_s}, {nn.Identity()(next_s)})
  return m
end

local cmd = torch.CmdLine()
cmd:option('-rnn_size', 150, 'size of LSTM internal state')
opt = cmd:parse(arg)

model = create_singlelayer_lookup_encoder(10,opt)

When I run the above code it gives the following error:

lua: .../jkhurana/torch/install/share/lua/5.2/nn/LookupTable.lua:59: attempt to index local 'input' (a nil value)

StackTrace:

stack traceback:
    .../jkhurana/torch/install/share/lua/5.2/nn/LookupTable.lua:59: in function 'makeInputContiguous'
    .../jkhurana/torch/install/share/lua/5.2/nn/LookupTable.lua:71: in function <.../jkhurana/torch/install/share/lua/5.2/nn/LookupTable.lua:68>
    (...tail calls...)
    .../home/jkhurana/torch/install/share/lua/5.2/nn/Module.lua:339: in function <.../home/jkhurana/torch/install/share/lua/5.2/nn/Module.lua:338>
    [C]: in ?
    encoder.lua:42: in function 'create_singlelayer_lookup_encoder'
    encoder.lua:58: in main chunk
    [C]: in ?
pavanky commented 7 years ago

You are not passing anything to nn.Identity()(), so x ends up being nil. The error message is pretty straightforward.

jKhurana commented 7 years ago

I am only creating the model here, not training it. Once I call the forward() method on the module, I will pass the input to it. Isn't this similar to the "A network with containers" section of the following link: https://github.com/torch/nngraph

pavanky commented 7 years ago

Hmm, shouldn't you be requiring nngraph instead of nn?

jKhurana commented 7 years ago

I didn't understand your last comment. I am new to torch. I just create the graph and convert it into a module with the gModule() method. I am following the link above. Is there anything I am missing here?

pavanky commented 7 years ago

Try replacing

require 'nn'

with

require 'nngraph'
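Editor's note: the reason this fixes the error is that nngraph overloads the call operator on nn.Module, so that writing module(x) builds a symbolic graph node instead of immediately running a forward pass. With only require 'nn', the expression nn.Identity()() actually calls forward(nil), and that nil then propagates into LookupTable's makeInputContiguous(). A minimal sketch of the corrected pattern (assuming torch and nngraph are installed; module sizes here are illustrative):

```lua
-- Requiring 'nngraph' (which also loads 'nn') makes module(x)
-- return an nngraph node rather than invoking forward() on nil.
require 'nngraph'

local x = nn.Identity()()          -- a graph node, no forward pass yet
local h = nn.Linear(10, 5)(x)      -- modules are chained symbolically
local m = nn.gModule({x}, {h})     -- compile the graph into a module

-- Only now is real data fed through the graph:
local out = m:forward(torch.randn(10))
print(out:size())                  -- a 5-element tensor
```

Building the graph and running it are therefore two separate steps, which is exactly what the original poster intended; the model construction in create_singlelayer_lookup_encoder() works unchanged once nngraph is required.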