Closed: nossebro closed this issue 1 month ago
Thanks for the report, will take a look. OpenAI made breaking changes to their API schema to support GPTs and it looks like proxies broke in the process of updating to support the new API schema.
Yeah, that commit broke apiBaseUrl by mistake, apologies. It will be fixed in the next release, and there will be a new UI page for working with local LLMs.
Fixed in v3.23.0
I am using oobabooga's text-generation-webui with its openai extension to serve local llms for ChatGPT Reborn to use.
The only setting I have changed is `"chatgpt.gpt3.apiBaseUrl": "http://127.0.0.1:5000/v1"`, and I have provided my own API key. Version v3.20.0 works correctly.
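For reference, this is roughly what the relevant fragment of my VS Code `settings.json` looks like (a minimal sketch; the host and port are from my setup and should be adjusted to wherever text-generation-webui's openai extension is actually listening):

```jsonc
// settings.json (User or Workspace settings)
{
  // Point ChatGPT Reborn at the local OpenAI-compatible server
  // exposed by text-generation-webui's openai extension.
  // 127.0.0.1:5000 is the port used in my setup; yours may differ.
  "chatgpt.gpt3.apiBaseUrl": "http://127.0.0.1:5000/v1"
}
```

The API key is entered separately through the extension, so it does not appear in this fragment.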
This change may be the culprit: https://github.com/Christopher-Hayes/vscode-chatgpt-reborn/blame/41107ed508a050d42d995c06334ca1279c7c4f3d/src/api-provider.ts#L34