continuedev / continue

⏩ Continue is the leading open-source AI code assistant. You can connect any models and any context to build custom autocomplete and chat experiences inside VS Code and JetBrains
https://docs.continue.dev/
Apache License 2.0

vllm provider #3021

Open DanielBeck93 opened 23 hours ago

DanielBeck93 commented 23 hours ago

Relevant environment info

- OS: Windows 11
- Continue version: 0.8.57
- IDE version: 1.95.1
- Model: Llama 3.1 8B
- config.json:
  ```json
  {
      "title": "Llama 3.1 8B",
      "model": "ged4",
      "apiBase": "some_url",
      "completionOptions": {
          "temperature": 0.1,
          "topK": 1,
          "topP": 1,
          "presencePenalty": 0,
          "frequencyPenalty": 0
      },
      "provider": "vllm",
      "apiKey": "some_api_key"
  },
  ```

Description

Hi, I have deployed a Llama 3 model through vLLM. When I choose `vllm` as the provider in my config file, I get:

Error streaming diff: TypeError: Cannot read properties of undefined (reading 'toLowerCase')

when clicking Apply, editing code, or commenting code through Continue. When I change the provider to `openai`, it works fine. Is this expected?
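For context, this TypeError is the generic failure mode when JavaScript/TypeScript code calls `.toLowerCase()` on a field that was never set. A minimal sketch of the pattern (the `ModelConfig` interface and `providerName` field are hypothetical illustrations, not taken from Continue's source):

```typescript
// Hypothetical config shape; `providerName` may be missing.
interface ModelConfig {
  providerName?: string;
}

// Unguarded access reproduces the reported error when the field is undefined:
// TypeError: Cannot read properties of undefined (reading 'toLowerCase')
function normalizeProviderUnsafe(cfg: ModelConfig): string {
  return (cfg.providerName as string).toLowerCase();
}

// A guarded version falls back to a default instead of throwing.
function normalizeProvider(cfg: ModelConfig): string {
  return (cfg.providerName ?? "openai").toLowerCase();
}

let threw = false;
try {
  normalizeProviderUnsafe({});
} catch (e) {
  threw = e instanceof TypeError;
}
console.log(threw);                                        // true
console.log(normalizeProvider({}));                        // "openai"
console.log(normalizeProvider({ providerName: "VLLM" }));  // "vllm"
```

If something along this line is happening in the diff-streaming path, it would explain why switching the provider name makes the error disappear.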

To reproduce

config:

```json
{
    "title": "Llama 3.1 8B",
    "model": "ged4",
    "apiBase": "some_url",
    "completionOptions": {
        "temperature": 0.1,
        "topK": 1,
        "topP": 1,
        "presencePenalty": 0,
        "frequencyPenalty": 0
    },
    "provider": "vllm",
    "apiKey": "some_api_key"
},
```

Log output

Error streaming diff: TypeError: Cannot read properties of undefined (reading 'toLowerCase')