timbmg / Sentence-VAE

PyTorch Re-Implementation of "Generating Sentences from a Continuous Space" by Bowman et al., 2015: https://arxiv.org/abs/1511.06349

Zero-dimensional tensor #4

Closed gsramsu closed 6 years ago

gsramsu commented 6 years ago

Hi,

When I run python train.py, I get the error at line 71: zero-dimensional tensor (at position 1) cannot be concatenated. Could you please help me debug this?

Regards,
Ramsu

timbmg commented 6 years ago

Hi Ramsu,

Could you please post the entire trace, including the parameters you are running train.py with? Also, which Python and PyTorch versions are you using?

Best, Tim

gsramsu commented 6 years ago

I am using Python 3.6 and PyTorch 0.4.

First, this happened:

```
SentenceVAE(
  (embedding): Embedding(9877, 300)
  (word_dropout): Dropout(p=0.5)
  (encoder_rnn): GRU(300, 256, batch_first=True)
  (decoder_rnn): GRU(300, 256, batch_first=True)
  (hidden2mean): Linear(in_features=256, out_features=16, bias=True)
  (hidden2logv): Linear(in_features=256, out_features=16, bias=True)
  (latent2hidden): Linear(in_features=16, out_features=256, bias=True)
  (outputs2vocab): Linear(in_features=256, out_features=9877, bias=True)
)
tensor([], device='cuda:0')
tensor(187.1793, device='cuda:0')
Traceback (most recent call last):
  File "train.py", line 213, in <module>
    main(args)
  File "train.py", line 136, in main
    tracker['ELBO'] = torch.cat((tracker['ELBO'], loss))
RuntimeError: zero-dimensional tensor (at position 1) cannot be concatenated
```


So I commented out the tracker['ELBO'] references and then got an out-of-memory error.

timbmg commented 6 years ago

I have not upgraded this repo to PyTorch 0.4 yet, so that might be the problem. I have only tested it with PyTorch 0.3.
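For context, this is likely the relevant behavioral change: in PyTorch 0.3 a reduced loss came back as a 1-element tensor, while from 0.4 on it is a zero-dimensional scalar tensor, which torch.cat refuses to concatenate. A minimal check (the dummy logits and targets below are only for illustration):

```python
import torch
import torch.nn.functional as F

# Dummy batch, only to produce a reduced loss.
logits = torch.randn(4, 10)
targets = torch.tensor([1, 0, 3, 2])

loss = F.cross_entropy(logits, targets)
print(loss.shape)  # torch.Size([]) on PyTorch >= 0.4; 0.3 returned size (1,)

# Reproduces the error from the traceback above on PyTorch >= 0.4:
# torch.cat((torch.tensor([]), loss))
# RuntimeError: zero-dimensional tensor (at position 1) cannot be concatenated
```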

buaahsh commented 6 years ago

@gsramsu You can reshape loss.data, e.g. tracker['ELBO'] = torch.cat((tracker['ELBO'], loss.data.view(1))); that works for me.
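To make this concrete, here is a minimal sketch of the failing call and the reshape fix, with placeholder values standing in for the real training loop (assumes PyTorch 0.4 or later):

```python
import torch

tracker = {'ELBO': torch.tensor([])}   # empty 1-D tensor, as printed above
loss = torch.tensor(187.1793)          # zero-dimensional, like the batch loss

# Fails on 0.4+: torch.cat cannot concatenate a zero-dimensional tensor.
# tracker['ELBO'] = torch.cat((tracker['ELBO'], loss))

# Works: view the scalar as a 1-element tensor first. Using loss.data (or
# loss.detach()) also drops the autograd graph, so the tracker does not keep
# every batch's computation graph alive in memory.
tracker['ELBO'] = torch.cat((tracker['ELBO'], loss.data.view(1)))
print(tracker['ELBO'])  # tensor([187.1793])
```

loss.unsqueeze(0) instead of .view(1), or collecting loss.item() in a plain Python list, would work just as well here.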