spro / practical-pytorch

Go to https://github.com/pytorch/tutorials - this repo is deprecated and no longer maintained
MIT License

Altering MAX_LENGTH > 10 throws cuda runtime error (59) #112

Closed BlindElephants closed 5 years ago

BlindElephants commented 5 years ago

In the seq2seq example, everything works as written. But if I change the MAX_LENGTH variable to a larger value, I get the following error:

RuntimeError: cuda runtime error (59) : device-side assert triggered at /pytorch/aten/src/THC/generic/THCTensorCopy.c:21

It seems to be triggered by this line:

if USE_CUDA: attn_energies = attn_energies.cuda()

Am I doing something stupid here or overlooking something?
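One general note (not specific to this repo): CUDA kernels launch asynchronously, so a device-side assert like error 59 is usually reported at some later, unrelated line, e.g. the next .cuda() copy. Forcing synchronous launches makes the traceback point at the real failure site. A sketch of the invocation, where train.py is a stand-in for whatever script runs the notebook code:

```shell
# Force every CUDA kernel launch to complete before the next Python
# line runs, so the assert surfaces at the op that actually failed
# (train.py is a placeholder for your own entry point).
CUDA_LAUNCH_BLOCKING=1 python train.py
```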

BlindElephants commented 5 years ago

Closing this, although I'm still not entirely sure why or how this happens, so any pointers in that direction would be helpful.
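For anyone landing here: error 59 is almost always an out-of-bounds index on the GPU, e.g. a lookup into a layer that was sized with the old MAX_LENGTH. A quick way to get a readable traceback is to run the same code on the CPU, where the bad index raises an ordinary IndexError at the exact line. A minimal sketch of that effect, assuming only the standard torch / torch.nn APIs (the sizes here are illustrative, not the notebook's actual model):

```python
import torch
import torch.nn as nn

# A layer sized with the old limit: only indices 0..9 are valid.
MAX_LENGTH = 10
embedding = nn.Embedding(MAX_LENGTH, 8)

# An index from a longer sequence, out of range for the table above.
bad_index = torch.tensor([12])

try:
    embedding(bad_index)  # on CPU this fails loudly, right here
except IndexError as e:
    print("CPU raises a clear error:", e)

# On the GPU the same lookup only trips the asynchronous
# "device-side assert triggered" (error 59), reported at a later op.
```

If the CPU run fails like this, the fix is to make every size that depends on MAX_LENGTH consistent before building the model and re-filtering the data.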