liltom-eth / llama2-webui

Run any Llama 2 locally with gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac). Use `llama2-wrapper` as your local llama2 backend for Generative Agents/Apps.
MIT License

How to run on GPU? Runs on CPU only #68

Closed — oaefou closed this issue 1 year ago

oaefou commented 1 year ago

Windows 10, RTX 3090.
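A common first diagnostic (not from this thread, just a suggestion) is to check whether PyTorch can see the GPU at all, since llama2-webui's GPU backends depend on CUDA being visible to the Python environment. A minimal sketch, assuming PyTorch is the framework in use:

```python
# Diagnostic sketch: report whether a CUDA-capable GPU is visible.
# Assumes PyTorch is installed alongside llama2-webui; if not, the
# ImportError branch reports that instead of crashing.
def cuda_status():
    try:
        import torch
    except ImportError:
        return "torch not installed"
    if torch.cuda.is_available():
        # Name of the first visible CUDA device, e.g. an RTX 3090.
        return f"CUDA available: {torch.cuda.get_device_name(0)}"
    return "CUDA not available: inference will fall back to CPU"

print(cuda_status())
```

If this prints the CPU-fallback message on a machine with an RTX 3090, the usual culprit is a CPU-only PyTorch wheel; reinstalling a CUDA-enabled build is the typical fix.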

Dougie777 commented 1 year ago

I found it runs much better on Linux. Try WSL. Just my 2 cents.