It may work out of the box.
Give it a try: in the extension options you'll find the Base Address parameter.
If you set it to your local LLM's address, it may just work (see the sketch below).
Let me know how it goes.
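
For context, here is a minimal sketch of how a Base Address setting like this is typically consumed; the function name and the default port are assumptions for illustration, not the extension's actual code:

```ts
// Hypothetical sketch of how a Base Address setting might be consumed;
// names and the default port are assumptions, not the extension's actual code.
const baseAddress = "http://127.0.0.1:5000"; // e.g. a local OpenAI-compatible server

function chatCompletionsUrl(base: string): string {
  // Trim trailing slashes, then append the OpenAI-style path.
  return `${base.replace(/\/+$/, "")}/v1/chat/completions`;
}

console.log(chatCompletionsUrl(baseAddress));
// -> http://127.0.0.1:5000/v1/chat/completions
```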
Hi @0Hate, I will close this issue, but you can reopen it if necessary.
Looks like it's calling /v1/v1 when requesting, rather than just /v1:

[2024-02-03 13:15:37.244] [ERROR] Unexpected endpoint or method. (POST /v1/v1/chat/completions). Returning 200 anyway
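
A plausible cause, sketched below under assumptions (this is not the extension's actual code): if the configured Base Address already ends in /v1 and the extension also prepends /v1 to the endpoint path, concatenation produces the doubled prefix. Normalizing the base before appending would avoid it:

```ts
// Hypothetical sketch: if the configured base already ends in /v1 and the
// extension also prepends /v1 to the path, the request hits /v1/v1/...
// Normalizing the base before appending would avoid the duplication.
function normalizeBase(base: string): string {
  return base.replace(/\/+$/, "").replace(/\/v1$/, "");
}

const url = `${normalizeBase("http://127.0.0.1:5000/v1")}/v1/chat/completions`;
console.log(url); // -> http://127.0.0.1:5000/v1/chat/completions (single /v1)
```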
Hi, I am using this open-source tool to run a local LLM, and it has a built-in API that uses the same format as the OpenAI API.
Will this work out of the box with the extension?
https://github.com/oobabooga/text-generation-webui/wiki/12-%E2%80%90-OpenAI-API
If not, I would like to implement this in the extension. Is that OK?
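
For reference, a request against an OpenAI-compatible local API like this one is a standard chat-completions call; the sketch below assumes a typical local host/port and that the "model" field can be omitted, so adjust to your setup:

```ts
// Sketch of a chat-completions call against an OpenAI-compatible local API;
// host, port, and omitting "model" are assumptions about a typical local setup.
const res = await fetch("http://127.0.0.1:5000/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    messages: [{ role: "user", content: "Hello!" }],
  }),
});
const data = await res.json();
console.log(data.choices[0].message.content);
```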