Closed · Deepayan137 closed this issue 6 years ago
Hi Tim, I guess you forgot to add `.cuda()` to the model at line no. 40. This causes a TypeError when trying to execute the inference.py code.

```python
use_gpu = torch.cuda.is_available()
if use_gpu:
    model = model.cuda()
model.eval()
```
I guess the above snippet will fix it. :)
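For context, the snippet above is an instance of the usual PyTorch device-handling pattern: the model and its inputs must live on the same device, or PyTorch raises a TypeError about mismatched tensor types (`FloatTensor` vs `cuda.FloatTensor`). A minimal runnable sketch, using a hypothetical `nn.Linear` stand-in for the model loaded in inference.py:

```python
import torch
import torch.nn as nn

# Stand-in model; in inference.py this would be the loaded model object.
model = nn.Linear(4, 2)

use_gpu = torch.cuda.is_available()
device = torch.device("cuda" if use_gpu else "cpu")

# Move the model to the right device (equivalent to model.cuda() on GPU)
# and switch to eval mode before running inference.
model = model.to(device)
model.eval()

# Inputs must be on the same device as the model.
x = torch.randn(1, 4).to(device)
with torch.no_grad():
    out = model(x)
print(out.shape)  # torch.Size([1, 2])
```

Using `model.to(device)` rather than a bare `model.cuda()` keeps the script working on CPU-only machines as well.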
Great, thank you! I added it 👍