Closed: davidamacey closed this issue 1 month ago
Same here, regardless of which OpenAI-compatible API endpoint I use: TogetherAI, Anyscale, LiteLLM, you name it. Even https://api.openai.com/v1 doesn't work.
"cody.autocomplete.advanced.provider": "unstable-openai",
"cody.autocomplete.experimental.ollamaOptions": {
"url": "http://localhost:11434",
"model": "codellama"
},
"cody.autocomplete.advanced.accessToken": "sk-xxx",
"cody.autocomplete.advanced.serverEndpoint": "https://api.openai.com/v1"
}
Cody VS Code: v1.3.1706715785, macOS 13.6 (22G120)
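To rule out the Cody client itself, it can help to hit the same endpoint directly with the same key. Here is a minimal sketch using the openai Python package (>=1.0); the model name is only an illustrative assumption:

```python
from openai import OpenAI

# Same key and endpoint as in the Cody settings above.
client = OpenAI(
    api_key="sk-xxx",
    base_url="https://api.openai.com/v1",
)

# If this call succeeds, the endpoint and token are fine and the
# problem is on the Cody side.
resp = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model name for illustration
    messages=[{"role": "user", "content": "Say hello"}],
)
print(resp.choices[0].message.content)
```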
This issue is marked as stale because it has been open 60 days with no activity. Remove the stale label, or comment, or this issue will be closed automatically in 5 days.
Version
1.2.1
Describe the bug
Using Cody with the user-selected 'unstable-openai' provider, I entered the URL and key for my local vLLM or local Ollama server running my model, each inside a Docker container. The Cody output is as follows:
Expected behavior
vLLM input should be the same as OpenAI's, since vLLM is compliant with the OpenAI API standard.
I was expecting vLLM to be a drop-in replacement for the OpenAI ChatGPT API, as I have done with other applications.
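For reference, this is roughly the drop-in usage being expected; a sketch assuming vLLM's OpenAI-compatible server on its default port 8000 and an assumed served model name:

```python
from openai import OpenAI

# vLLM accepts any key unless the server was started with --api-key.
client = OpenAI(
    api_key="EMPTY",
    base_url="http://localhost:8000/v1",  # assumed vLLM default port
)

# A plain completion request, the shape an autocomplete provider would send.
resp = client.completions.create(
    model="codellama/CodeLlama-7b-hf",  # assumed model name
    prompt="def fibonacci(n):",
    max_tokens=64,
)
print(resp.choices[0].text)
```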
Both vLLM and Ollama are running the latest Docker containers.
Additional context
vLLM is running in a Docker container on my local network.
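Since both servers run in containers, a quick reachability check from the machine running VS Code can rule out networking; a sketch assuming the default ports (8000 for vLLM, 11434 for Ollama) and that Ollama's OpenAI-compatible routes are available under /v1:

```python
import requests

# Default ports are assumptions; adjust to the published container ports.
endpoints = {
    "vLLM": "http://localhost:8000/v1/models",
    "Ollama": "http://localhost:11434/v1/models",
}

for name, url in endpoints.items():
    try:
        r = requests.get(url, timeout=5)
        print(name, r.status_code, r.text[:200])
    except requests.RequestException as e:
        print(name, "unreachable:", e)
```

Note that localhost only works if the container ports are published to the host; from inside another container, the Docker host's address is needed instead.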