kefirski / pytorch_RVAE

Recurrent Variational Autoencoder that generates sequential data, implemented with PyTorch
MIT License
MIT License

Error: bool value of Tensor with more than one value is ambiguous #16

Open sarvesh710 opened 5 years ago

sarvesh710 commented 5 years ago

File "/home/sarvesh23/pytorch_RVAE/utils/functional.py", line 6, in f_and return x and y RuntimeError: bool value of Tensor with more than one value is ambiguous

I am running train_word_embeddings.py. Any hint on what I am doing wrong?
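For context on the error itself: Python's and operator calls bool() on its operands, and PyTorch refuses to convert a tensor with more than one element to a single bool. A minimal sketch (not code from this repo) that reproduces the same RuntimeError:

import torch

x = torch.ones(3)
y = torch.zeros(3)

# `x and y` first evaluates bool(x); for a multi-element tensor this is
# ambiguous, so PyTorch raises the error shown in the traceback above.
try:
    z = x and y
except RuntimeError as e:
    print(e)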

davislf2 commented 5 years ago

Same problem here. Does anyone know why?

SHIVITG commented 5 years ago

preprocessed data was found and loaded

Traceback (most recent call last):
  File "train_word_embeddings.py", line 50, in <module>
    out = neg_loss(input, target, args.num_sample).mean()
  File "/home/ec2-user/anaconda3/envs/pytorch_p36/lib/python3.6/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/ec2-user/SageMaker/pytorch_RVAE/selfModules/neg.py", line 38, in forward
    assert parameters_allocation_check(self), \
  File "/home/ec2-user/SageMaker/pytorch_RVAE/utils/functional.py", line 15, in parameters_allocation_check
    return fold(f_and, parameters, True) or not fold(f_or, parameters, False)
  File "/home/ec2-user/SageMaker/pytorch_RVAE/utils/functional.py", line 2, in fold
    return a if (len(l) == 0) else fold(f, l[1:], f(a, l[0]))
  File "/home/ec2-user/SageMaker/pytorch_RVAE/utils/functional.py", line 2, in fold
    return a if (len(l) == 0) else fold(f, l[1:], f(a, l[0]))
  File "/home/ec2-user/SageMaker/pytorch_RVAE/utils/functional.py", line 6, in f_and
    return x and y
RuntimeError: bool value of Tensor with more than one value is ambiguous

leehaoyuan commented 5 years ago

According to the note, if you delete the following lines in selfModules/neg.py, it will work just fine:

assert parameters_allocation_check(self), \
    """
    Invalid CUDA options.
    out_embed and in_embed parameters both should be stored in the same memory,
    got out_embed.is_cuda = {}, in_embed.is_cuda = {}
    """.format(self.out_embed.weight.is_cuda, self.in_embed.weight.is_cuda)
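If you would rather keep the check than delete it outright, one option (a minimal sketch, not the author's original code) is to make utils/functional.py fold over the .is_cuda flags, which are plain booleans, instead of the parameter tensors themselves, so and/or never sees a multi-element tensor:

# Hedged replacement for parameters_allocation_check:
# compare boolean device flags rather than the parameter tensors.
def parameters_allocation_check(module):
    flags = [p.is_cuda for p in module.parameters()]
    # True when every parameter is on CUDA, or none of them are,
    # i.e. all parameters live in the same kind of memory.
    return all(flags) or not any(flags)

This keeps the original intent of the assert (all parameters on the same device) without triggering the ambiguous-bool error.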

kay312 commented 4 years ago

> According to the note, if you delete the following lines in selfModules/neg.py (the assert parameters_allocation_check(self) block shown above), it will work just fine.

Yeah, that code performs the parameters_allocation_check. I deleted those lines and it worked, but I don't know whether it influences the output or not.

gohjiayi commented 3 years ago

> According to the note, if you delete the following lines in selfModules/neg.py (the assert parameters_allocation_check(self) block shown above), it will work just fine.

Currently working on Python 3.6.9 and facing the same issue. After removing the parameters_allocation_check code (quoted above), I faced additional errors; here is how I solved them. (P.S. line numbers might differ.)

ValueError: Object arrays cannot be loaded when allow_pickle=False. In batch_loader.py, line 221, add the argument allow_pickle=True to the np.load call, as suggested by the StackOverflow post here.

[self.word_tensor, self.character_tensor] = [np.array([np.load(target, allow_pickle=True) for target in input_type])
                                            for input_type in tensor_files]
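For background (general NumPy behaviour, not specific to this repo): since NumPy 1.16.3, np.load defaults to allow_pickle=False, so .npy files containing object arrays, which the saved tensors here evidently are given the error, need the flag set explicitly. A tiny standalone check:

import numpy as np

# Object arrays (e.g. lists of variable-length sequences) are stored via pickle.
arr = np.array([[1, 2, 3], [4, 5]], dtype=object)
np.save('tmp_obj.npy', arr)

loaded = np.load('tmp_obj.npy', allow_pickle=True)   # works
# np.load('tmp_obj.npy')  # ValueError: Object arrays cannot be loaded when allow_pickle=False
print(loaded)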

IndexError: too many indices for array: array is 0-dimensional, but 1 were indexed. In train_word_embeddings.py, line 56, remove the index [0] from out.cpu().data.numpy()[0].

if iteration % 500 == 0:
      out = out.cpu().data.numpy()
      print('iteration = {}, loss = {}'.format(iteration, out))
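An alternative that leaves the indexing question aside entirely (a sketch, assuming PyTorch >= 0.4, where reductions like .mean() return 0-dimensional tensors) is to extract the scalar with .item():

if iteration % 500 == 0:
    # .mean() yields a 0-dim tensor, so .numpy()[0] has nothing to index;
    # .item() returns the underlying Python scalar directly.
    print('iteration = {}, loss = {}'.format(iteration, out.item()))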

This allowed me to run the code and build a custom word embedding successfully. I'm still studying the impact this has on the word embeddings, so please use it at your own discretion. Hope this helps others!