Closed: jiay7 closed this issue 2 years ago.
Yes, this is Numba-level logging showing that the GPU is being underutilized (low occupancy). This is perfectly fine, since the alpha/beta kernels are designed to run at the batch size (and memory is at a premium when training RNNTs).
If you somehow have spare memory, you can use batch sizes above the 160 mentioned here to fully utilize the GPU.
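Not from the thread itself, but as a side note: if the message is Numba's standard low-occupancy warning, it can be silenced with Python's warnings filter. A minimal sketch, assuming a recent Numba version where the category is `NumbaPerformanceWarning` (the actual warning text is not shown in this thread, so this category is an assumption):

```python
import warnings

from numba.core.errors import NumbaPerformanceWarning

# Assumption: the reported message is a NumbaPerformanceWarning such as
# "Grid size (N) < 2 * SM count ... will likely result in GPU under
# utilization due to low occupancy". Filtering it only hides the log line;
# the loss values computed by the kernels are unchanged.
warnings.filterwarnings("ignore", category=NumbaPerformanceWarning)
```

Place the filter before the first call to the loss so it is active when the kernels are compiled and launched.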
Yes, it seems that these warnings did not affect the training results. Thank you very much for your reply.
Thank you for warprnnt_numba. I got the warning (shown below) when I use this loss in my code. Is this a known issue? How can it be debugged and solved?
Thank you!