Closed avs20 closed 8 years ago
I solved the problem. It was my own fault: lack of knowledge and not experimenting enough. I passed my data in the form 4x1x128x196, where each of the 4 slices goes into the Parallel module; 1 is the sequence length, 128 the batch size, and 196 the input size.
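For reference, a minimal sketch of that layout (assuming the Element-Research rnn package, where SeqLSTM expects a seqlen x batchsize x inputsize tensor, and nn.Parallel's select drops the split dimension):

```lua
require 'rnn'

-- 4 parallel streams; per stream: seqlen 1, batch 128, 196 features
input = torch.randn(4, 1, 128, 196)

rnns = nn.Parallel(1, 1)  -- split along dim 1, join outputs along dim 1
for i = 1, 4 do
  -- Parallel's select removes dim 1, so each SeqLSTM
  -- receives a 1x128x196 slice: seqlen x batch x inputsize
  rnns:add(nn.SeqLSTM(196, 196))
end

out = rnns:forward(input)
print(out:size())
```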
I am still stuck on this problem.
I declare the module as
seqLSTM = nn.SeqLSTM(128,128)
but whenever I pass it a tensor of size 128, it throws the error:
Only Support batch mode
Then I try to reshape the input with nn.Reshape:
RNNModel = nn.Sequential()
RNNModel:add(nn.Reshape(1,1,128))
seqLSTM = nn.SeqLSTM(128,128)
but it still throws the same "only supports batch mode" error.
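The batch-mode error comes from SeqLSTM wanting a 3D input. A minimal sketch of a shape it accepts (seqlen x batchsize x inputsize, per the rnn package's documentation):

```lua
require 'rnn'

lstm = nn.SeqLSTM(128, 128)

-- SeqLSTM expects seqlen x batchsize x inputsize,
-- so even a single 128-dim vector needs two leading dims
x = torch.randn(1, 1, 128)  -- one step, batch of one, 128 features
y = lstm:forward(x)         -- output keeps the seqlen and batch dims
```

Note that a Reshape placed before a Parallel does not help on its own, because Parallel's select drops one dimension again before the slice reaches the SeqLSTM.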
The full code to reproduce it is:
require 'rnn'
require 'torch'
mData = torch.randn(2,128)
print(mData:size())
RNNModel = nn.Sequential()
RNNModel:add(nn.Reshape(2,1,128))
rnns = nn.Parallel(1,1)
seqLSTM = nn.SeqLSTM(128,128)
seqLSTM.batchfirst=true
rnns:add(seqLSTM)
rnns:add(seqLSTM)
RNNModel:add(rnns)
print("RNN model done")
mdl = nn.Sequential()
mdl:add(RNNModel)
y = mdl:forward(mData)
print(y:size())
Thanks!
Parallel transforms each input slice to 1x128 instead of 1x1x128; a simple solution would be to input a 2x1x1x128 tensor.
@JoostvDoorn This fixes it: RNNModel:add(nn.Reshape(2,1,1,128)). The issue is that, with the former RNNModel:add(nn.Reshape(2,1,128)), the Parallel hands each SeqLSTM a tensor of size 1x128, which is not batch mode.
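Putting the fix into the reproduction script (only the Reshape line changes; each slice that Parallel selects is then 1x1x128, which SeqLSTM accepts; the batchfirst flag from the original script is irrelevant here since both leading dims are 1):

```lua
require 'rnn'

mData = torch.randn(2, 128)

RNNModel = nn.Sequential()
-- 2x1x1x128: Parallel's select drops the first dim,
-- so each SeqLSTM receives a 1x1x128 tensor
RNNModel:add(nn.Reshape(2, 1, 1, 128))

rnns = nn.Parallel(1, 1)
-- two separate instances, unless weight sharing is intended
rnns:add(nn.SeqLSTM(128, 128))
rnns:add(nn.SeqLSTM(128, 128))
RNNModel:add(rnns)

y = RNNModel:forward(mData)
print(y:size())
```

The original script added the same seqLSTM instance to the Parallel twice, which shares one set of weights and one hidden state across both branches; that is usually not what is wanted for independent parallel streams.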
@JoostvDoorn @nicholas-leonard Yeah, it works now. Thanks. I should have caught it myself.
Hi all,
I have a 128x14x14 input and I am trying to feed it into 4 parallel SeqLSTM modules for processing.
The code is like this:
I understand that the 196 passed above is only the input and output size. It threw an error when I passed it a 4x128x196 tensor; the error message was:
So how do I pass the data in batch mode to SeqLSTM, or am I doing something fundamentally wrong?
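Based on the resolution later in the thread, the fix is to insert a singleton sequence-length dimension so that each Parallel slice is 3D. A hedged sketch (the tensor names here are illustrative, not from the original code):

```lua
require 'rnn'

-- hypothetical input: 4 streams, batch 128, 14*14 = 196 features per step
x = torch.randn(4, 128, 196)

-- SeqLSTM inside nn.Parallel needs a seqlen x batch x inputsize slice,
-- so insert a singleton sequence-length dimension before splitting:
x = x:view(4, 1, 128, 196)  -- each Parallel slice becomes 1x128x196
```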