Open m1nuzz opened 7 months ago
Is it possible to use a custom link for endpoints like koboldcpp or LM Studio so that a local LLM model can be utilized? Will you add such a capability?
It is possible. There is a foundation for that, but some code adjustments are needed to make it work. Pull requests are welcome :)
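For anyone wanting to attempt this, here is a minimal sketch of what talking to a local OpenAI-compatible server typically looks like. Everything here is an assumption, not this project's actual code: `http://localhost:1234/v1` is LM Studio's commonly documented default base URL (koboldcpp usually listens on port 5001), and `build_chat_request` is a hypothetical helper that only constructs the request, without sending it.

```python
import json

# Assumed default for LM Studio's local server; koboldcpp's
# OpenAI-compatible API is typically at http://localhost:5001/v1.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(base_url: str, model: str, prompt: str):
    """Build the URL and JSON body for an OpenAI-style
    /chat/completions call against a custom base URL."""
    url = f"{base_url}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

url, body = build_chat_request(BASE_URL, "local-model", "Hello")
print(url)  # http://localhost:1234/v1/chat/completions
```

The key adjustment in most clients is simply making the base URL configurable instead of hard-coding the hosted provider's endpoint; the rest of the request format usually stays the same because these local servers mimic the OpenAI API.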