Closed kylebgorman closed 1 month ago
Oh no! Good catch, I guess I forgot to test all of this on GPU.
I just noticed that there is a torch tensor `char` type, so I want to see if we can just use that.
I also noticed that I did this correctly in one case, but not in all...
On a different note, I don't think we actually need to convert them to chars at all; I think we can just loop over the tensors and compare ints. This seems to work. I am going to push and then test on GPU.
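A minimal sketch of that int-comparison idea, assuming padded 2-D tensors of symbol IDs (the function name, tensor shapes, and pad index here are illustrative, not the actual implementation):

```python
import torch


def symbol_error_rate(
    preds: torch.Tensor, targets: torch.Tensor, pad_idx: int = 0
) -> float:
    """Computes SER by comparing integer symbol IDs directly.

    No string conversion is needed, so this works identically on CPU
    and GPU tensors: every operation here is a tensor operation.
    """
    mask = targets != pad_idx  # ignore padding positions
    errors = ((preds != targets) & mask).sum()
    total = mask.sum()
    return (errors / total).item()
```

Because nothing leaves tensor-land until the final `.item()`, there is no device-placement issue to trip over.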
Closed in #189.
I am getting errors when I try to compute SER on GPU. It looks like the prediction tensors need to be moved to CPU too before they're run through `numpy.char.mod` on lines 256 and 264, in addition to what's already done on line 245. (I don't know if they just need to be `.cpu()`ed or also `.numpy()`ed.)

Assigning (very gently) to Adam in case I have misunderstood the problem.
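For reference, the general pattern is that `Tensor.numpy()` raises a `TypeError` on CUDA tensors, so a tensor must be moved to host memory before any NumPy call (such as `numpy.char.mod`) can see it. A hedged sketch of a device-safe conversion helper (the helper name is mine, not from the repo):

```python
import torch


def to_numpy(t: torch.Tensor):
    """Converts a tensor to a NumPy array regardless of device.

    .cpu() is a no-op for tensors already in host memory, so this is
    safe to call unconditionally; .detach() drops any autograd graph.
    """
    return t.detach().cpu().numpy()
```

So the answer to the parenthetical is: both, in that order, since `.numpy()` alone fails on a CUDA tensor.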