continuedev / continue

⏩ Continue is the leading open-source AI code assistant. You can connect any models and any context to build custom autocomplete and chat experiences inside VS Code and JetBrains.
https://docs.continue.dev/
Apache License 2.0
19.43k stars 1.69k forks

Fix undefined model name #3022

Open dzbanek717 opened 22 hours ago

dzbanek717 commented 22 hours ago

Description

Hello,

recently I tried to use the Continue plugin with the GoLand IDE and a locally hosted Meta-Llama-3.1-70B-Instruct-FP8-KV model served by vLLM. Unfortunately, I found that tab autocompletion was not working due to a JS error in the Continue core. Using tcpdump I confirmed that no request was ever sent to my vLLM instance.

The given error was:

Error generating autocompletion:  TypeError: Cannot read properties of undefined (reading 'toLowerCase')
    at getTemplateForModel (/Users/pdanek/Downloads/continue-0.0.82-jetbrains/core/autocomplete/templating/AutocompleteTemplate.ts:334:32)
    at getTemplate (/Users/pdanek/Downloads/continue-0.0.82-jetbrains/core/autocomplete/templating/index.ts:77:10)
    at renderPrompt (/Users/pdanek/Downloads/continue-0.0.82-jetbrains/core/autocomplete/templating/index.ts:103:5)
    at CompletionProvider.provideInlineCompletionItems (/Users/pdanek/Downloads/continue-0.0.82-jetbrains/core/autocomplete/CompletionProvider.ts:189:61)
    at process.processTicksAndRejections (/Users/pdanek/Downloads/continue-0.0.82-jetbrains/binary/lib/internal/process/task_queues.js:105:5)
    at async /Users/pdanek/Downloads/continue-0.0.82-jetbrains/binary/out/index.js:522340:23
    at async /Users/pdanek/Downloads/continue-0.0.82-jetbrains/binary/out/index.js:522925:28 {stack: 'TypeError: Cannot read properties of undefine….0.82-jetbrains/binary/out/index.js:522925:28', message: "Cannot read properties of undefined (reading 'toLowerCase')"}

My core config was:

  "models": [
    {
      "title": "myModel",
      "provider": "vllm",
      "contextLength": 128000,
      "template": "none",
      "model": "amd/Meta-Llama-3.1-70B-Instruct-FP8-KV",
      "apiBase": "<myAPIEndpoint>"
    }
  ],
  "tabAutocompleteModel": {
    "title": "myModel",
    "provider": "vllm",
    "contextLength": 128000,
    "template": "none",
    "model": "amd/Meta-Llama-3.1-70B-Instruct-FP8-KV",
    "apiBase": "<myAPIEndpoint>"
  },

Please note that I've removed the apiBase value from the snippet above on purpose. It is correctly set in my environment.

While debugging the core server locally, I found that this error was thrown from this function:

export function getTemplateForModel(model: string): AutocompleteTemplate {
  const lowerCaseModel = model.toLowerCase();

The model argument was undefined, as it was passed in from HelperVars.
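To make the failure mode concrete, here is a minimal self-contained sketch (not the actual Continue source): calling `toLowerCase()` on an undefined model name throws exactly the TypeError above, while a hypothetical defensive variant falls back to a default FIM template instead of crashing. The template strings and the `getTemplateForModelSafe` name are illustrative assumptions, not real Continue APIs.

```typescript
// Sketch of the bug and a defensive fix; names and templates are hypothetical.
type AutocompleteTemplate = { template: string };

const defaultTemplate: AutocompleteTemplate = {
  template: "<fim_prefix>{{{prefix}}}<fim_suffix>{{{suffix}}}<fim_middle>",
};

// Accepting `string | undefined` and returning a fallback template avoids the
// "Cannot read properties of undefined (reading 'toLowerCase')" crash.
function getTemplateForModelSafe(
  model: string | undefined,
): AutocompleteTemplate {
  if (!model) {
    // Model name missing (as when HelperVars passes undefined): use a default
    // rather than throwing and silently disabling autocomplete.
    return defaultTemplate;
  }
  const lowerCaseModel = model.toLowerCase();
  if (lowerCaseModel.includes("llama")) {
    return { template: "llama-style template" };
  }
  return defaultTemplate;
}
```

With this guard, a request would still be sent even when the model name fails to propagate, at the cost of possibly using a suboptimal template.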

To get my setup working, I fixed this issue as shown in the PR, since I discovered that the model name is loaded from the config into the completionOptions object.

I'm not actually sure this is the right way to fix it, as I'm not familiar with your codebase and TypeScript is not my first language of choice, but feel free to modify this solution however you need to get the plugin working with remote models.
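The caller-side fallback described above could be sketched like this. The interface shapes and the names `modelName` and `completionOptions` are assumptions for illustration, not the actual Continue internals: the idea is simply to prefer the value from HelperVars and fall back to the model name that the config loader already placed in completionOptions.

```typescript
// Hypothetical sketch of resolving the model name with a config fallback.
interface CompletionOptions {
  model?: string;
}

interface HelperVarsLike {
  modelName?: string;
  completionOptions: CompletionOptions;
}

function resolveModelName(helper: HelperVarsLike): string {
  // Prefer the HelperVars value; if it is undefined (the bug reported here),
  // fall back to the model name loaded from config.json into
  // completionOptions, and finally to a neutral placeholder.
  return helper.modelName ?? helper.completionOptions.model ?? "unknown";
}
```

Whatever the final shape, resolving the name once at the call site keeps getTemplateForModel's contract (a defined string) intact.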

After applying this fix and rebuilding the JetBrains plugin, autocompletion started to work.

netlify[bot] commented 22 hours ago

Deploy Preview for continuedev failed.

| Name | Link |
| --- | --- |
| Latest commit | 50e84f7d21dba0ecd312cae355dff8126b2a70a1 |
| Latest deploy log | https://app.netlify.com/sites/continuedev/deploys/673f33722eef590008016b82 |