smafjal / Bengali-Machine-Translation-seq2seq-with-attention

Bengali Machine Translation by using encoder-decoder with attention model

RuntimeError: 1D tensors expected, got 2D #1

Open · paul-pias opened this issue 4 years ago

paul-pias commented 4 years ago

Traceback (most recent call last):
  File "train.py", line 139, in <module>
    main()
  File "train.py", line 116, in main
    epoch_loss = train(input_variable, target_variable, encoder, decoder, encoder_optimizer, decoder_optimizer, criterion)
  File "train.py", line 59, in train
    encoder_outputs)
  File "/home/nybsys/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 532, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/nybsys/Desktop/seq2seq/models.py", line 79, in forward
    attention_weights = self.attention(rnn_output.squeeze(0), encoder_outputs)
  File "/home/nybsys/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 532, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/nybsys/Desktop/seq2seq/models.py", line 122, in forward
    energies[i] = self._score(hidden, encoder_outputs[i])
  File "/home/nybsys/Desktop/seq2seq/models.py", line 131, in _score
    energy = hidden.dot(encoder_output)
RuntimeError: 1D tensors expected, got 2D, 2D tensors at /pytorch/aten/src/TH/generic/THTensorEvenMoreMath.cpp:733
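For context, PyTorch's Tensor.dot only accepts 1-D tensors, while _score is handed tensors that still carry a leading batch dimension of 1. A minimal sketch that reproduces the error (the hidden size of 500 is an assumption, taken from the model printout further down in this thread):

import torch

# rnn_output.squeeze(0) and encoder_outputs[i] each keep a batch dimension
# of 1, so _score receives two 2-D tensors of shape (1, hidden_size).
hidden = torch.randn(1, 500)          # assumed hidden_size = 500
encoder_output = torch.randn(1, 500)

# Tensor.dot only works on 1-D tensors, so this line raises
# "RuntimeError: 1D tensors expected, got 2D, 2D tensors".
energy = hidden.dot(encoder_output)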

paul-pias commented 4 years ago

Did you face this issue? Can you help me with how to solve it?

Tarequzzaman commented 3 years ago

Same here

python train.py     

Encoder-Model:  EncoderRNN(
  (embedding): Embedding(68, 500)
  (gru): GRU(500, 500, num_layers=4)
)
Decoder-Model:  AttentionDecoderRNN(
  (embedding): Embedding(67, 500)
  (gru): GRU(1000, 500, num_layers=4, dropout=0.05)
  (out): Linear(in_features=1000, out_features=67, bias=True)
  (attention): Attention()
)
Traceback (most recent call last):
  File "train.py", line 139, in <module>
    main()
  File "train.py", line 116, in main
    epoch_loss = train(input_variable, target_variable, encoder, decoder, encoder_optimizer, decoder_optimizer, criterion)
  File "train.py", line 59, in train
    encoder_outputs)
  File "/Users/tarequzzamankhan/Bengali-Machine-Translation-seq2seq-with-attention/env/lib/python3.7/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/Users/tarequzzamankhan/Bengali-Machine-Translation-seq2seq-with-attention/models.py", line 79, in forward
    attention_weights = self.attention(rnn_output.squeeze(0), encoder_outputs)
  File "/Users/tarequzzamankhan/Bengali-Machine-Translation-seq2seq-with-attention/env/lib/python3.7/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/Users/tarequzzamankhan/Bengali-Machine-Translation-seq2seq-with-attention/models.py", line 122, in forward
    energies[i] = self._score(hidden, encoder_outputs[i])
  File "/Users/tarequzzamankhan/Bengali-Machine-Translation-seq2seq-with-attention/models.py", line 131, in _score
    energy = hidden.dot(encoder_output)
RuntimeError: dot: Expected 1-D argument self, but got 2-D

Tarequzzaman commented 3 years ago

@paul-pias how did you solve this?

DanielTobi0 commented 1 month ago

@Tarequzzaman @paul-pias @smafjal Same issue here. Were you able to resolve the tensor-shape mismatch error?
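One possible workaround, offered only as a sketch and not as a fix confirmed by the maintainer: flatten both tensors to 1-D before the dot product in _score of models.py. This assumes the plain dot-product score shown in the traceback and a batch size of 1, in which case it computes the same Luong-style dot score.

def _score(self, hidden, encoder_output):
    # hidden and encoder_output arrive here as (1, hidden_size) tensors;
    # Tensor.dot only accepts 1-D inputs, so flatten both first.
    # An equivalent alternative is torch.sum(hidden * encoder_output).
    energy = hidden.view(-1).dot(encoder_output.view(-1))
    return energy

If the repository's Attention module also supports other scoring variants (e.g. general or concat), the same flattening would need to be applied in those branches as well.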