AlizerUncaged / desktop-waifu

Desktop Waifu!

How to use GPU instead of CPU for faster translating? #223

Open carlangassnake opened 9 months ago

carlangassnake commented 9 months ago

I already installed CUDA and PyTorch, but I don't know what I have to set, and where, so that the transcription workload goes to my GPU (RTX 2060) instead of the CPU, since I've read it's faster that way.

I'm using ElevenLabs, btw.
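Not an official answer, but here is a minimal sketch of how GPU-vs-CPU selection usually works in PyTorch-based transcription apps. The `pick_device` helper is hypothetical (not part of this repo), and the Whisper call in the comment assumes the app uses openai-whisper's `load_model`, which accepts a `device` argument:

```python
# Sketch: choose "cuda" if PyTorch can see a usable GPU, else fall back to "cpu".
# Assumes a CUDA-enabled PyTorch build; degrades gracefully if torch is absent.

def pick_device() -> str:
    """Return "cuda" when a CUDA-capable GPU is usable, else "cpu"."""
    try:
        import torch  # only present if the app actually ships PyTorch
        if torch.cuda.is_available():
            return "cuda"
    except ImportError:
        pass
    return "cpu"


if __name__ == "__main__":
    device = pick_device()
    print(f"Transcription would run on: {device}")
    # If the app uses openai-whisper, the device is typically passed at load time:
    #   model = whisper.load_model("base", device=device)
```

Note that `torch.cuda.is_available()` returns False on a CPU-only PyTorch build even when a GPU is physically present, so installing the CUDA variant of PyTorch matters as much as having CUDA itself.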

Lacking-some-wut-of-worth commented 9 months ago

idk bro i havent reached that point yet. then why am i commenting , idk im bored

MepleYui commented 7 months ago

Made a video about it, see 8:33. I hope this helps!