The health check executed by (at least) the VSCode plugin prevents the use of any other OpenAI-API-compatible LLM endpoints.
The specific use case I was testing was the Tabby plugin for VSCode pointed at my LocalAI endpoint.
Unfortunately, LocalAI serves its health check at /healthz and returns the plain text OK.
I have been unable to find a way to disable this check in the client configuration, and there is no documentation covering it otherwise.
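To illustrate the kind of change being requested, here is a minimal sketch of a health probe that tolerates both response styles: a JSON body (as a Tabby server returns) and LocalAI's plain-text OK from /healthz. The function name and structure are illustrative only, not the plugin's actual code:

```python
import json


def is_healthy(status_code: int, body: str) -> bool:
    """Return True if a health-check response looks healthy.

    Accepts either a JSON payload (Tabby-style health endpoint) or the
    plain-text "OK" that LocalAI returns from /healthz. Illustrative sketch,
    not the plugin's real implementation.
    """
    if status_code != 200:
        return False
    text = body.strip()
    if text == "OK":
        # LocalAI's /healthz replies with the plain text "OK"
        return True
    try:
        # Tabby-style health endpoints return a JSON document
        json.loads(text)
        return True
    except ValueError:
        return False
```

A check along these lines (or simply a configuration flag to skip the probe) would let the client accept any OpenAI-API-compatible backend.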
Please reply with a 👍 if you want this feature.