chikiuso closed this issue 4 years ago
Can you paste the detailed error log?
The model runs comfortably on an 11 GB GPU with a batch size of around 64.
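If a smaller card is available, a rough heuristic (my own assumption, not a maintainer-stated rule) is that the usable batch size scales roughly linearly with GPU memory. A minimal sketch of that scaling:

```python
def scale_batch_size(available_mem_gb, reference_batch=64, reference_mem_gb=11):
    """Linearly scale a known-good batch size to a different GPU memory budget.

    Heuristic only: assumes memory use is dominated by per-sample activations,
    using the 11 GB / batch-64 data point reported above as the reference.
    """
    return max(1, int(reference_batch * available_mem_gb / reference_mem_gb))

print(scale_batch_size(8))   # -> 46 on an 8 GB card
print(scale_batch_size(11))  # -> 64, the reference point
```

In practice one would still lower the result further if out-of-memory errors persist, since model weights take a fixed chunk of memory regardless of batch size.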
Thanks for your quick reply. It seems that it requires librosa to be installed, doesn't it?
And may I ask how many GB of RAM (not GPU memory) you need to run this project? Thanks a lot!
> It seems that it requires librosa to be installed, doesn't it?
No, we don't use librosa at the moment. We plan to do so in the future.
> And may I ask how many GB of RAM (not GPU memory) you need to run this project?
I am not sure about the exact numbers, but the statement above that it runs on a single GPU is valid.
Hi @prajwalkr, sorry to keep asking questions... I hit the following error when running; thanks for your help:
Using TensorFlow backend.
Traceback (most recent call last):
File "batch_inference.py", line 143, in
I fixed it by simply commenting out the librosa import, thanks :D
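For reference, an alternative to deleting the import is to make it optional, so the script still runs when librosa is missing but keeps working where it is installed. This is a hypothetical sketch, not what the repo actually does:

```python
import importlib


def optional_import(name):
    """Try to import a module by name; return None if it is not installed."""
    try:
        return importlib.import_module(name)
    except ImportError:
        return None


# librosa becomes None when it is missing, so code paths that do not
# need it (such as the inference path discussed here) can still run.
librosa = optional_import("librosa")
```

Any code that genuinely needs librosa can then check `librosa is None` and raise a clear error instead of failing at import time.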
Hi @prajwalkr, I tried to run it on a 1080 Ti with 11 GB of GPU RAM, but the system halts every time I run on the GPU. May I ask how much GPU RAM is needed? Thanks.