Recently I tried to use the Continue plugin with the GoLand IDE, using a locally hosted Meta-Llama-3.1-70B-Instruct-FP8-KV model served by vLLM. Unfortunately, I found that tab autocompletion was not working, due to a JS issue in the Continue core. Using tcpdump I confirmed that no request was ever sent to my vLLM instance.
The given error was:
```
Error generating autocompletion: TypeError: Cannot read properties of undefined (reading 'toLowerCase')
    at getTemplateForModel (/Users/pdanek/Downloads/continue-0.0.82-jetbrains/core/autocomplete/templating/AutocompleteTemplate.ts:334:32)
    at getTemplate (/Users/pdanek/Downloads/continue-0.0.82-jetbrains/core/autocomplete/templating/index.ts:77:10)
    at renderPrompt (/Users/pdanek/Downloads/continue-0.0.82-jetbrains/core/autocomplete/templating/index.ts:103:5)
    at CompletionProvider.provideInlineCompletionItems (/Users/pdanek/Downloads/continue-0.0.82-jetbrains/core/autocomplete/CompletionProvider.ts:189:61)
    at process.processTicksAndRejections (/Users/pdanek/Downloads/continue-0.0.82-jetbrains/binary/lib/internal/process/task_queues.js:105:5)
    at async /Users/pdanek/Downloads/continue-0.0.82-jetbrains/binary/out/index.js:522340:23
    at async /Users/pdanek/Downloads/continue-0.0.82-jetbrains/binary/out/index.js:522925:28 {stack: 'TypeError: Cannot read properties of undefine….0.82-jetbrains/binary/out/index.js:522925:28', message: "Cannot read properties of undefined (reading 'toLowerCase')"}
```
Please note that I removed the `apiBase` value from the config snippet above on purpose. It is correctly set in my environment.
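For context, the relevant part of a Continue config for a vLLM-served model looks roughly like the sketch below. All values here are placeholders for illustration, not my actual settings; since vLLM exposes an OpenAI-compatible endpoint, the `openai` provider with an `apiBase` pointing at it is one documented way to wire it up:

```json
{
  "tabAutocompleteModel": {
    "title": "Llama 3.1 70B (vLLM)",
    "provider": "openai",
    "model": "Meta-Llama-3.1-70B-Instruct-FP8-KV",
    "apiBase": "http://<vllm-host>:8000/v1"
  }
}
```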
While debugging the core server locally, I found that this error was thrown from this function:

```typescript
export function getTemplateForModel(model: string): AutocompleteTemplate {
  const lowerCaseModel = model.toLowerCase();
  // ...
}
```
The model name was `undefined` as passed in from `HelperVars`. To get my setup working, I fixed this issue as in this PR, after discovering that the model name is loaded from the config into the `completionOptions` object.
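For illustration, the fix is roughly of the following shape: a simplified, self-contained sketch rather than the actual PR diff, with `AutocompleteTemplate` and the template constants reduced to stubs (the real definitions live in `core/autocomplete/templating/`):

```typescript
// Simplified stand-in for the real AutocompleteTemplate type.
type AutocompleteTemplate = { name: string };

const holeFillerTemplate: AutocompleteTemplate = { name: "hole-filler" };
const codeLlamaTemplate: AutocompleteTemplate = { name: "codellama" };

// Defensive version: tolerate an undefined model name and fall back to
// a default template instead of crashing on model.toLowerCase().
function getTemplateForModel(model: string | undefined): AutocompleteTemplate {
  const lowerCaseModel = (model ?? "").toLowerCase();
  if (lowerCaseModel.includes("codellama")) {
    return codeLlamaTemplate;
  }
  return holeFillerTemplate;
}

// No TypeError even when the model name never made it into the options.
console.log(getTemplateForModel(undefined).name);
```

The alternative (and arguably cleaner) fix is to make sure the model name is always populated from the config before `getTemplateForModel` is called; the guard above just prevents the hard crash.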
I'm not sure this is the right way to fix it, as I'm not familiar with your codebase and TypeScript is not my first language of choice, so feel free to modify this solution however you need to get the plugin working with remote models.
After applying this fix and rebuilding the JetBrains plugin, autocompletion started working.