kefirski / pytorch_RVAE

Recurrent Variational Autoencoder that generates sequential data, implemented in PyTorch
MIT License

RuntimeError: bool value of Variable objects containing non-empty torch.FloatTensor is ambiguous #5

Open ghost opened 7 years ago

ghost commented 7 years ago

Hi, have you got any idea why I'm getting this error?

(py35_pytorch) ajay@ajay-h8-1170uk:~/PythonProjects/pytorch_RVAE-master$ python3 train_word_embeddings.py
preprocessed data was found and loaded
Traceback (most recent call last):
  File "train_word_embeddings.py", line 47, in <module>
    out = neg_loss(input, target, args.num_sample).mean()
  File "/home/ajay/anaconda3/envs/py35_pytorch/lib/python3.5/site-packages/torch/nn/modules/module.py", line 224, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/ajay/PythonProjects/pytorch_RVAE-master/selfModules/neg.py", line 38, in forward
    assert parameters_allocation_check(self), \
  File "/home/ajay/PythonProjects/pytorch_RVAE-master/utils/functional.py", line 16, in parameters_allocation_check
    return fold(f_and, parameters, True) or not fold(f_or, parameters, False)
  File "/home/ajay/PythonProjects/pytorch_RVAE-master/utils/functional.py", line 2, in fold
    return a if (len(l) == 0) else fold(f, l[1:], f(a, l[0]))
  File "/home/ajay/PythonProjects/pytorch_RVAE-master/utils/functional.py", line 2, in fold
    return a if (len(l) == 0) else fold(f, l[1:], f(a, l[0]))
  File "/home/ajay/PythonProjects/pytorch_RVAE-master/utils/functional.py", line 6, in f_and
    z = x and y
  File "/home/ajay/anaconda3/envs/py35_pytorch/lib/python3.5/site-packages/torch/autograd/variable.py", line 123, in __bool__
    torch.typename(self.data) + " is ambiguous")
RuntimeError: bool value of Variable objects containing non-empty torch.FloatTensor is ambiguous

So I guess there's a type problem - the check seems to end up evaluating a `Variable` in boolean context instead of a plain Python bool?
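Looking at the traceback, `parameters_allocation_check` folds `f_and` over the module's parameters themselves, so `z = x and y` eventually calls `bool()` on a multi-element `Variable`, which PyTorch refuses as ambiguous. A minimal sketch of the failure and one possible fix (folding over the boolean `.is_cuda` flags instead; the `Variable` stand-in below is a hypothetical stub so the example runs without torch):

```python
class Variable:
    """Stand-in for torch.autograd.Variable: a non-empty multi-element
    tensor cannot be coerced to bool, mirroring the traceback above."""
    def __init__(self, is_cuda=False):
        self.is_cuda = is_cuda

    def __bool__(self):
        raise RuntimeError(
            "bool value of Variable objects containing non-empty "
            "torch.FloatTensor is ambiguous")


# Helpers as in utils/functional.py
def fold(f, l, a):
    return a if len(l) == 0 else fold(f, l[1:], f(a, l[0]))

def f_and(x, y):
    return x and y  # bool(x) is evaluated here -> RuntimeError for a Variable

def f_or(x, y):
    return x or y


def check_broken(parameters):
    # Folds over the Variables themselves, so `x and y` hits __bool__.
    return fold(f_and, parameters, True) or not fold(f_or, parameters, False)

def check_fixed(parameters):
    # Assumed fix (not the repo's code): fold over plain Python bools,
    # i.e. "all parameters on GPU, or none of them are".
    flags = [p.is_cuda for p in parameters]
    return fold(f_and, flags, True) or not fold(f_or, flags, False)


params = [Variable(is_cuda=False), Variable(is_cuda=False)]
print(check_fixed(params))   # all-CPU parameters pass the check: True
```

With this change the assert in `neg.py` can stay in place instead of being deleted, since the fold now only ever sees `True`/`False` values.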

aykutfirat commented 7 years ago

Comment out the `assert parameters_allocation_check(self)` line in `neg.py` as a workaround.

ghost commented 7 years ago

@aykutfirat Thanks a lot

DuaneNielsen commented 6 years ago

Hit the same problem. Suggest just deleting it?