LostRuins / koboldcpp

Run GGUF models easily with a KoboldAI UI. One File. Zero Install.
https://github.com/lostruins/koboldcpp
GNU Affero General Public License v3.0

GGUF LLM support gone? #1112

Closed — KintCark closed this issue 3 months ago

KintCark commented 3 months ago

How do we use local GGUF LLM models now? I don't see an option for it anywhere. I liked the old version better... ×_×

UPDATE: I saw a back button, pressed it, and boom! I'm back to the older version :>

LostRuins commented 3 months ago

GGUF files are always supported. What do you mean you don't see an option for it?

KintCark commented 3 months ago

> GGUF files are always supported. What do you mean you don't see an option for it?

I found it. I had to press the back button at the bottom right, which took me to the LLM UI. I was in StableUI.

LostRuins commented 3 months ago

You can always relaunch the UI by opening http://localhost:5001 in your web browser.
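For reference, a minimal launch sketch: koboldcpp serves the KoboldAI UI (and the StableUI image page) from a single local server, 5001 being the default port. The model path below is a placeholder, and the exact flags should be checked against `python koboldcpp.py --help` for your version.

```shell
# Sketch: load a local GGUF model and serve the UI on the default port.
# /path/to/model.gguf is a placeholder -- substitute your own model file.
python koboldcpp.py --model /path/to/model.gguf --port 5001

# Then open the UI in a browser:
#   http://localhost:5001
# If you land in StableUI (the image-generation page), the back button at
# the bottom right returns you to the main LLM interface.
```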