JonathanFly / bark

🚀 BARK INFINITY GUI CMD 🎶 Powered Up Bark Text-prompted Generative Audio Model

GPU problems with CUDA detection, "No module named 'encodec'", and "No GPU being used. Careful" fixed #28

Closed: Highpressure closed this issue 1 year ago

Highpressure commented 1 year ago

My GTX 1080 Ti is not detected by Bark, even though it works with Stable Diffusion and GPTchat.

First I encountered `No module named 'encodec'`. Fixed by running `python -m pip install .` (yes, include the `.`).
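For what it's worth, here is a quick way to confirm the dependency actually landed in the environment you launch Bark from. This is a minimal sketch, assuming the missing module is `encodec` (Bark's audio codec dependency):

```python
# Run this with the same Python interpreter you launch Bark from.
try:
    import encodec  # Bark's audio codec dependency
    print("encodec import OK")
except ModuleNotFoundError as exc:
    # If this still fires, re-run 'python -m pip install .' from the repo root.
    print(f"Still missing module: {exc.name}")
```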

Second "No GPU being used. Careful, Inference might be extremely slow!" message

Type `python` and hit Enter, then type `import torch` and hit Enter, then type `torch.cuda.is_available()` and see whether it prints True or False.
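The same check as a small script, for reference (a minimal sketch; run it with the same Python interpreter Bark uses):

```python
import torch

print(torch.__version__)          # a "+cpu" suffix here means a CPU-only build
print(torch.cuda.is_available())  # must print True for Bark to use the GPU
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # should show the 1080 Ti
```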

If it prints False, go to pytorch.org and follow the steps for a manual reinstall of PyTorch (a CPU-only torch build is the usual cause of False here).

If it still doesn't work while everything else does (as in my case), download Anaconda or Miniconda, create a clean environment, and start over. That finally did the trick for me, though I still had to repeat the steps above.
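A quick sanity check after recreating the environment, to make sure both the interpreter and torch actually come from the new env (just a sketch, nothing Bark-specific):

```python
import sys
import torch

print(sys.executable)             # should point inside the new conda environment
print(torch.__version__)          # should be a CUDA build, not a "+cpu" one
print(torch.cuda.is_available())  # True means the 1080 Ti is finally visible
```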

Highpressure commented 1 year ago

Hope that helps. Please add an FAQ with this kind of stuff ;)