Before submitting your bug report
Relevant environment info
Description
Hi, I have deployed a Llama3 model through vLLM. When I choose vllm as the provider in my config file, I get:
Error streaming diff: TypeError: Cannot read properties of undefined (reading 'toLowerCase')
when pressing Apply, editing code, or commenting code through Continue. However, when I change the provider to openai, it works fine. Is this expected?
To reproduce
config:

```json
{
  "title": "Llama 3.1 8B",
  "model": "ged4",
  "apiBase": "some_url",
  "completionOptions": {
    "temperature": 0.1,
    "topK": 1,
    "topP": 1,
    "presencePenalty": 0,
    "frequencyPenalty": 0
  },
  "provider": "vllm",
  "apiKey": "some_api_key"
}
```
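For comparison, the only change that makes it work is swapping the provider (a sketch; the model name, apiBase, and apiKey are placeholders as above — vLLM serves an OpenAI-compatible API, which may be why this path behaves differently):

```json
{
  "title": "Llama 3.1 8B",
  "model": "ged4",
  "apiBase": "some_url",
  "provider": "openai",
  "apiKey": "some_api_key"
}
```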
Log output
Error streaming diff: TypeError: Cannot read properties of undefined (reading 'toLowerCase')