Element-Research / rnn

Recurrent Neural Network library for Torch7's nn
BSD 3-Clause "New" or "Revised" License

ParallelTable with Sequencer does not seem to work #431

Open vguptai opened 6 years ago

vguptai commented 6 years ago

Hi,

I am trying to use ParallelTable with a Sequencer, but I am not able to get the desired result. I understand that I am not passing the input to forward in the correct way. How should I pass this data?

One way would be to stack the input data side by side before passing it to the model and then split it again inside the model, but I cannot find a module that does this. Basically, I would need to pass a tensor of size (50,16,1027) to the Sequencer and, inside the Sequencer, have a module that splits it back into two tensors.
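One possible way to do that split, sketched under the assumption that plain nn modules behave as documented: nn.Narrow inside an nn.ConcatTable selects slices of the feature dimension and returns them as a table, so the split can live inside the Sequencer-wrapped module and the model can be fed a single stacked (50,16,1027) tensor.

require 'nn'
require 'rnn'

-- Sketch: stack along the feature dimension, then split per step with Narrow.
-- a: (50,16,1024), b: (50,16,3)  ->  joined: (50,16,1027)
local joined = torch.cat(torch.randn(50,16,1024), torch.randn(50,16,3), 3)

-- Inside the Sequencer each step sees a (16,1027) batch, so we narrow dim 2.
local splitter = nn.ConcatTable()
splitter:add(nn.Narrow(2, 1, 1024))   -- features 1..1024
splitter:add(nn.Narrow(2, 1025, 3))   -- features 1025..1027

local parallel = nn.ParallelTable()
parallel:add(nn.Linear(1024, 1024))
parallel:add(nn.Linear(3, 20))

local m = nn.Sequencer(nn.Sequential():add(splitter):add(parallel))
local out = m:forward(joined)

The Narrow indices above are my assumption from the (1024, 3) sizes in the question; adjust them if the stacking order differs.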

Here is the code:

local a = torch.randn(50,16,1024)
local b = torch.randn(50,16,3)
local m1 = nn.Sequential()
local parallel_table = nn.ParallelTable()
parallel_table:add(nn.Linear(1024, 1024))
parallel_table:add(nn.Linear(3, 20))
m1:add(parallel_table)
m1 = nn.Sequencer(m1)
m1:forward({a,b})

Following is the error that I get:

In 1 module of nn.Sequential:
In 2 module of nn.ParallelTable:
/usr/local/torch/install/share/lua/5.1/nn/Linear.lua:66: size mismatch, m1: [16 x 1024], m2: [3 x 20] at /usr/local/torch/pkg/torch/lib/TH/generic/THTensorMath.c:1293
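A likely reading of this trace: Sequencer treats a table input as the sequence itself, so with {a,b} the first step's input is the entire tensor a (50,16,1024); ParallelTable then indexes a[1] and a[2], both of size (16,1024), and the (16,1024) batch hits Linear(3,20), which is exactly the size mismatch reported. A sketch of one workaround, assuming these Sequencer semantics, is to zip the two tensors into a table of per-step pairs before calling forward:

-- Sketch: re-pair a and b so each sequence step is a {a[t], b[t]} table.
local input = {}
for t = 1, a:size(1) do
  input[t] = {a[t], b[t]}    -- step t: {(16,1024), (16,3)}
end
local out = m1:forward(input) -- expected: out[t] = {(16,1024), (16,20)}

The loop is illustrative only; the key point is that ParallelTable needs a table of two tensors at every step, not two whole sequences.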
tastyminerals commented 6 years ago

You are feeding tensors of incompatible sizes. ParallelTable does not do resizing for you.

vguptai commented 6 years ago

@tastyminerals Thanks for your response. Which tensor/dimension is the wrong size? Which dimension should I change to avoid the crash?