Open alexanderwerning opened 10 months ago
Hello, I am trying to set up the LTU-AS system for local inference. I got an error because I only have one GPU. Is there a reason why whisper-at is moved to cuda:1 and not cuda:0? https://github.com/YuanGongND/ltu/blob/8f615aa226f1367e200ef6ab90cebdd5549a791f/src/ltu_as/inference_gradio.py#L35 I changed it to cuda:0 and it seems to work fine.
Thanks!
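For reference, here is roughly what I did locally (a minimal sketch assuming PyTorch; the variable name `whisper_device` is illustrative and not the exact name used in inference_gradio.py):

```python
import torch

# Pick a device for the whisper-at model: keep it on the second GPU when one
# exists, otherwise fall back to cuda:0 (or CPU if no GPU is available).
if torch.cuda.device_count() > 1:
    whisper_device = "cuda:1"
elif torch.cuda.is_available():
    whisper_device = "cuda:0"  # single-GPU setup, as in my case
else:
    whisper_device = "cpu"
```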
Hi there,
Thanks so much for pointing this out.
My bad, will fix this soon.
-Yuan