SeanNaren / warp-ctc

Pytorch Bindings for warp-ctc
Apache License 2.0

Can not run on gpu #62

Open Angel-Jia opened 6 years ago

Angel-Jia commented 6 years ago

I have some variables saved from crnn-pytorch, which runs on the GPU and uses warp-ctc. When I load those variables in my own code, it runs on the CPU and the results are correct, but when I run it on the GPU I get a segmentation fault. This is the code I run on the GPU:

import torch
from warpctc_pytorch import CTCLoss
import numpy as np
from torch.autograd import Variable

i = 1
criterion = CTCLoss()
criterion = criterion.cuda()
a = np.load('preds_%d.npy' % i)
b = np.load('text_%d.npy' % i)
c = np.load('preds_size_%d.npy' % i)
d = np.load('length_%d.npy' % i)

#a = Variable(torch.from_numpy(a)).cuda()
#b = Variable(torch.from_numpy(b)).cuda()
#c = Variable(torch.from_numpy(c)).cuda()
#d = Variable(torch.from_numpy(d)).cuda()

a = torch.from_numpy(a).cuda()
b = torch.from_numpy(b).cuda()
c = torch.from_numpy(c).cuda()
d = torch.from_numpy(d).cuda()

print(a.dtype)
print(b.dtype)
print(c.dtype)
print(d.dtype)

cost = criterion(a, b, c, d) / 64
print('cost:', cost)

This gives a segmentation fault. When I delete all the .cuda() calls, I get the correct answer. All of the CPU and GPU tests pass. I really hope someone can help.

Angel-Jia commented 6 years ago

It turns out that b, c, and d cannot be CUDA tensors, or you will get a segmentation fault. To compute the loss, you should write it like this:

a = torch.from_numpy(a).cuda()
b = torch.from_numpy(b)
c = torch.from_numpy(c)
d = torch.from_numpy(d)
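
Putting the pieces together, a minimal self-contained sketch of the correct placement looks like the following. The shapes and names (T, N, C, label length of 10) are hypothetical, chosen only to illustrate the rule: the activations may live on the GPU as a FloatTensor, while the labels and the two length tensors must remain CPU IntTensors; the warp-ctc call itself is shown commented out since it requires the binding to be installed.

```python
import torch

# Hypothetical shapes: T time steps, N batch size, C output classes.
T, N, C = 50, 16, 37

# Activations: FloatTensor; this one may be moved to the GPU.
probs = torch.randn(T, N, C)
if torch.cuda.is_available():
    probs = probs.cuda()

# Labels and both length tensors must stay on the CPU as IntTensors,
# otherwise warp-ctc segfaults (the symptom reported above).
labels = torch.randint(1, C, (N * 10,), dtype=torch.int32)  # flattened targets
probs_sizes = torch.full((N,), T, dtype=torch.int32)        # input length per sample
label_sizes = torch.full((N,), 10, dtype=torch.int32)       # target length per sample

assert not labels.is_cuda and not probs_sizes.is_cuda and not label_sizes.is_cuda

# With the binding installed, the loss is then computed as:
# from warpctc_pytorch import CTCLoss
# cost = CTCLoss()(probs, labels, probs_sizes, label_sizes) / N
```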

I suggest the author put this usage note in README.md. It cost me two days to find the problem. @SeanNaren

guilk commented 6 years ago

@Mabinogiysk Hi, did the current version of warp-ctc work well for your project? I use the crnn.pytorch pipeline (https://github.com/meijieru/crnn.pytorch) but the training loss does not decrease.

Angel-Jia commented 6 years ago

My warp_ctc works fine.

Arkanayan commented 6 years ago

Thank you, @Mabinogiysk. You saved me a lot of trouble.