Dadatata-JZ closed this issue 4 years ago
Hi,
It was fully tested, but on an older version of PyTorch; the current version might have broken it.
I'm not certain which .py file is giving you this error, but I've usually fixed that type of error by creating my dataloader like this, with float32 inputs and int64 labels:

train = TensorDataset(torch.from_numpy(np.array(X, dtype=np.float32)), torch.from_numpy(np.array(Y, dtype=np.int64)))
trainloader = torch.utils.data.DataLoader(train, batch_size=batch_size, shuffle=shuffle, drop_last=drop_last)
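A minimal, self-contained version of that idea might look like the sketch below. Here X, Y, the model, and the batch settings are placeholder values for illustration, not the repository's actual code; the point is that the labels are int64 (torch.long) and both tensors are moved to the model's device before the loss call.

```python
import numpy as np
import torch
import torch.nn as nn
from torch.utils.data import TensorDataset, DataLoader

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Dummy data standing in for the real features and class labels.
X = np.random.randn(128, 20)
Y = np.random.randint(0, 4, size=128)

# Inputs as float32, labels as int64 so nn.CrossEntropyLoss accepts them.
train = TensorDataset(
    torch.from_numpy(np.array(X, dtype=np.float32)),
    torch.from_numpy(np.array(Y, dtype=np.int64)),
)
trainloader = DataLoader(train, batch_size=32, shuffle=True, drop_last=False)

model = nn.Linear(20, 4).to(device)  # stand-in for the real model
criterion = nn.CrossEntropyLoss()

for xb, yb in trainloader:
    # Keep both tensors on the same device as the model.
    xb, yb = xb.to(device), yb.to(device)
    loss = criterion(model(xb), yb)
    break
```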
I am currently working on submitting my thesis (with a strict deadline), but I'll update the repository as soon as possible to fix this bug.
On which file(s) does this problem occur?
Thank you for your reply. I actually fixed it by changing the types. Good luck with your thesis.
It happened in evaluate_..._source....py
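In case it helps anyone else, the type change was along these lines; the variable names are illustrative, not the exact code in that file:

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

def compute_loss(outputs: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    # CrossEntropyLoss expects class indices as torch.long (int64);
    # moving them to outputs.device avoids a CPU-vs-CUDA mismatch.
    labels = labels.long().to(outputs.device)
    return criterion(outputs, labels)
```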
Hi there,
Was the PyTorch implementation fully tested? I kept getting complaints about the tensor type passed to the loss function. Even after I converted the tensors to LONG as the error message suggested, the problem became that the LONG tensor was not accepted on CUDA.
Thanks