titu1994 / warprnnt_numba

WarpRNNT loss ported in Numba CPU/CUDA for Pytorch
MIT License

GPU under utilization due to low occupancy. #3

Closed jiay7 closed 2 years ago

jiay7 commented 2 years ago

Thank you for warprnnt_numba. I got the warning shown below when I use this loss in my code. [screenshot of the Numba low-occupancy warning] Is this a known issue? How can it be debugged and solved?

Thank you!

titu1994 commented 2 years ago

Yes, this is a Numba-level log message showing that the GPU is being under-utilized (low occupancy). This is perfectly fine, since the alpha/beta kernels are designed to run at the batch size (and memory is at a premium when training RNNTs).
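(Editor's note: not part of the original thread. The sketch below is a toy Numba CUDA kernel, unrelated to the library's actual alpha/beta kernels, that reproduces the same class of warning: when the launch grid is only a handful of blocks, say one per batch element, Numba emits a NumbaPerformanceWarning about GPU under-utilization because the grid is smaller than roughly twice the GPU's SM count.)

```python
import numpy as np
from numba import cuda


@cuda.jit
def add_one(x):
    # One thread per element; guard against out-of-range threads.
    i = cuda.grid(1)
    if i < x.size:
        x[i] += 1.0


# A tiny grid (one block per "batch element") is far smaller than what the
# GPU can schedule, so Numba warns about low occupancy. The result is still
# correct; the kernel simply does not saturate the device.
batch = 8
x = cuda.to_device(np.zeros(batch, dtype=np.float32))
add_one[batch, 1](x)  # grid = batch blocks, 1 thread per block
print(x.copy_to_host())
```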

If you somehow have the memory, you can use batch sizes above 160, as mentioned here, to fully utilize the GPU.
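(Editor's note: not part of the original thread. Since the warning is expected and harmless here, a minimal way to silence it is a standard Python warnings filter against Numba's `NumbaPerformanceWarning` class, which lives in `numba.core.errors` in recent Numba releases.)

```python
import warnings
from numba.core.errors import NumbaPerformanceWarning

# Suppress Numba's low-occupancy / under-utilization warnings, which are
# expected when kernels are launched with a grid sized to the batch.
warnings.filterwarnings("ignore", category=NumbaPerformanceWarning)
```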

jiay7 commented 2 years ago

Yes, it seems that these warnings did not affect the training results. Thank you very much for your reply.