Closed: TiagoMarinho closed this issue 9 months ago
I'm having this exact same problem. I installed it several times over the last few days with no issues (I was experimenting with different Linux distros, got fed up with Linux, and switched back to Win11), and all of a sudden today it stopped being able to load models on exllama, exllama2 (and the HF versions of both), AutoGPTQ, and AutoAWQ... all no-gos with similar errors.
Edit: it seems that loading GGUFs is the only thing that still works.
Same for me: with ExLlama and AutoAWQ it's impossible to load models.
I tried out some different builds, and 06fff3b2e918f37e0c7d477d111a08d042e72968 seems to be the latest one that works, if you want to revert as a temporary fix. I guess bumping to pytorch 11.8 in d33facc9feea19158420ddcbc842509825a4e88b is what broke it ¯\_(ツ)_/¯
Thanks, I was hitting my face against a wall for almost 3 hours trying to figure out that mess. I was trying to upgrade from a very old webui version, only to crash-land into a pile of issues, even when installing from scratch.
Ended up cloning 06fff3b and everything works as it should. I'm going to keep this version for now.
Hello, I have the same problem. I can't load Llama v1 models; Llama v2 works, but it lags a lot, and the AI now takes 200-300 seconds to answer, where before it was a consistent 20-50 seconds. My system specs: Intel Core i5-8600K, 2080 Gaming OC, 32 GB RAM, Windows 10. I use Mythalion-13B-GPTQ and Pygmalion-2-13B-GPTQ.
How do I revert back to the old version until it gets fixed? I have no idea how to do it. Can somebody help out? Thank you so much! I will upload 2 pictures. I didn't have this problem 2 weeks ago; I don't know what happened!
https://github.com/oobabooga/text-generation-webui/issues/4225#issuecomment-1759981218
Go to the code page and find Releases on the right side, then click it to view all past releases. 1.7 feels too recent, so I didn't try that one; 1.6.1 has worked flawlessly for me so far. (I previously tried 06fff3b, which loads the model fine but errors on inference.) Click "Download source code" and use the usual .bat file to install. Copy over any models, LoRAs, or other configuration files from the old version, and never touch the update .bat file until the problem is fixed.
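If you'd rather stay on git instead of downloading a release zip, a minimal sketch of pinning the webui to the known-good commit mentioned earlier in this thread (assumes git is installed; run the usual install/start .bat afterwards as normal):

```shell
# Clone the repo and check out the last commit reported working (06fff3b).
git clone https://github.com/oobabooga/text-generation-webui
cd text-generation-webui
git checkout 06fff3b
# HEAD is now detached at that commit; don't run the update .bat,
# or it will pull you back onto the broken main branch.
```

To later return to the latest version once it's fixed, `git checkout main` followed by `git pull` undoes the pin.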
I tried the latest version of the webui (c0ffb77) and it seems the issue has been fixed (at least for me).
I tried to install the new version too, but I get a ->
But I have gradio installed, and it still says the same thing.
I had to do a clean install to switch to the newest version, so maybe try that? (Assuming what you did was try to run update.bat.) I should also probably mention that I'm on Windows 10, so results may vary on other operating systems.
Can confirm the problem does not happen after updating the webui; closing the issue. If anyone else is still suffering from this, I think it's best to create a new issue, as the one I reported is now fixed.
Describe the bug
A recent update has made it so that exllama does not work anymore when installing or migrating the webui from the old one-click installers.
Note that despite what the error says, repositories/exllama does exist.
Is there an existing issue for this?
Reproduction
Screenshot
No response
Logs
System Info