Open Jake36921 opened 1 month ago
Trying to install it the same way, no luck so far <3
The LMStudio config works against both koboldcpp and textgenwebui if you just change the URL port to :5001 (koboldcpp) or :5000 (TGWUI).
I'm using it right now with koboldcpp. The only real difference is that koboldcpp uses "max_length" instead of "max_tokens".
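To illustrate the point above, here's a minimal sketch of translating an OpenAI-style request body for koboldcpp by renaming that one key (the helper name `adapt_payload` is mine, not from any project):

```python
# Hypothetical helper: rename "max_tokens" to "max_length" for koboldcpp,
# leaving the payload unchanged for OpenAI-compatible backends.
def adapt_payload(payload: dict, backend: str) -> dict:
    out = dict(payload)  # copy so the caller's dict is untouched
    if backend == "koboldcpp" and "max_tokens" in out:
        out["max_length"] = out.pop("max_tokens")
    return out

adapt_payload({"prompt": "Hi", "max_tokens": 64}, "koboldcpp")
```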
It would be great to get a little love for open source solutions.
Koboldcpp now hosts a whisper endpoint as well.
Ollama is different in that it nests parameters.
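For comparison, Ollama's native `/api/generate` endpoint nests sampling parameters under an `options` object (where the max-token limit is called `num_predict`), rather than taking them flat at the top level. A rough sketch of converting a flat payload:

```python
# Sketch: map a flat OpenAI-style payload into Ollama's nested shape.
# The mapping shown here is illustrative, not exhaustive.
def to_ollama(payload: dict, model: str = "llama3") -> dict:
    return {
        "model": model,
        "prompt": payload["prompt"],
        "options": {
            "temperature": payload["temperature"],
            "num_predict": payload["max_tokens"],  # Ollama's name for the limit
        },
    }

to_ollama({"prompt": "Hi", "temperature": 0.7, "max_tokens": 64})
```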
My recommendation to the devs is to support parameter control in the config like this:
(Note: those keys are just what my autocompletion suggested.)
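For illustration only, a hypothetical shape such a config could take; every key name below is a guess, not the project's actual schema:

```yaml
# Hypothetical config sketch, not the app's real schema.
backend:
  url: http://localhost:5001/v1   # koboldcpp's OpenAI-compatible endpoint
  params:
    temperature: 0.7
    max_length: 256               # koboldcpp's name for max_tokens
```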
There are lots of different backends; e.g., koboldcpp exposes an OpenAI-compatible URL for its API.