oobabooga / text-generation-webui

A Gradio web UI for Large Language Models.
GNU Affero General Public License v3.0

oobabooga amd gpu windows #6525

Closed. F1re4 closed this 2 weeks ago

F1re4 commented 2 weeks ago

I've searched the entire internet and can't find anything, and it's been a long time since oobabooga was released. Is there any way to run it on an AMD RX 6700 XT GPU on Windows, without Linux or virtual environments? When I try, it just says that PyTorch is not fully installed, and nothing I do helps. Please advise.
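
For anyone hitting the same error, here is a minimal diagnostic sketch (not from the web UI itself) to check whether the installed PyTorch build actually includes GPU support. The standard pip wheels on Windows are CPU-only or CUDA-only, which is typically why an AMD card is not picked up:

    # Check whether the installed PyTorch build can see a GPU.
    import torch

    print("PyTorch version:", torch.__version__)
    print("CUDA build:", torch.version.cuda)                        # None on CPU-only or ROCm builds
    print("ROCm (HIP) build:", getattr(torch.version, "hip", None)) # None unless a ROCm wheel is installed
    print("GPU available:", torch.cuda.is_available())              # ROCm builds also report through torch.cuda
    if torch.cuda.is_available():
        print("Device:", torch.cuda.get_device_name(0))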

RSAStudioGames commented 2 weeks ago

Do you have ROCm installed? If I remember correctly, ROCm only works on bare-metal Linux, not even in WSL, unfortunately. https://rocm.docs.amd.com/projects/install-on-linux/en/latest/how-to/native-install/ubuntu.html

AMD's docs have all the install info you need.

F1re4 commented 2 weeks ago

So you're saying it's only possible with Linux as the main OS, and it's not possible on Windows?

F1re4 commented 2 weeks ago

What is the best alternative for an AMD GPU? Something local, supporting many models, and very similar to Oobabooga?

RSAStudioGames commented 2 weeks ago

So I just looked it up, and it seems there is a HIP SDK from ROCm for Windows 10/11. I haven't tried it, so I can't guarantee anything, but it could be a possible option to get Oobabooga working. https://www.amd.com/en/developer/resources/rocm-hub/hip-sdk.html

An alternative to Oobabooga would be: https://github.com/YellowRoseCx/koboldcpp-rocm I used this with my 7900XT at one point on Windows. I believe it only runs GGUF/GGML models, though. That shouldn't be an issue, since a lot of people convert models to GGUF, and it also lets you run models using your RAM and VRAM combined.
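
As a rough illustration of that RAM+VRAM split with GGUF models, here is a sketch using the llama-cpp-python bindings (the model path and layer count are placeholders, and the GPU offload only takes effect if the library was built with a GPU backend such as HIP/ROCm):

    # Sketch: load a GGUF model and offload part of it to the GPU,
    # keeping the remaining layers in system RAM.
    # Requires: pip install llama-cpp-python (built with a GPU backend).
    from llama_cpp import Llama

    llm = Llama(
        model_path="path/to/model.Q4_K_M.gguf",  # placeholder path to a GGUF file
        n_gpu_layers=20,  # layers offloaded to VRAM; the rest stays in RAM
        n_ctx=4096,       # context window size
    )

    out = llm("Q: What formats does koboldcpp load?\nA:", max_tokens=64)
    print(out["choices"][0]["text"])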

Hope that helps. :)

F1re4 commented 2 weeks ago

Thanks a lot, you really helped me.