continuedev / continue

⏩ Continue is the leading open-source AI code assistant. You can connect any models and any context to build custom autocomplete and chat experiences inside VS Code and JetBrains
https://docs.continue.dev/

Getting strange responses from model suddenly? #1596

Closed Emoter1979 closed 3 months ago

Emoter1979 commented 3 months ago

Relevant environment info

- OS: Windows 11
- Continue: latest release 0.8.24 (2024-04-12)
- IDE: VS Code

Description

Running Ollama with the CodeLlama model, started with the command "ollama run CodeLlama".

It prompts and works for a few minutes, then I start getting strange responses like:

Prompt: "hi"

Output: " 10 "
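
For context, the model is wired into Continue through its Ollama provider in config.json. The snippet below is only a minimal sketch of that kind of entry; the title and model tag are illustrative assumptions, not values copied from my actual config:

```json
{
  "models": [
    {
      "title": "CodeLlama (Ollama)",
      "provider": "ollama",
      "model": "codellama"
    }
  ]
}
```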

To reproduce

see description

Log output

unused_17> <unused_8><unused_13><unused_2><unused_1>11<unused_8><unused_12>▅<unused_19><unused_6><unused_17><unused_19><unused_3><unused_19><unused_3>
sestinj commented 3 months ago

@Emoter1979 it looks like you might have an older release version for some reason. Our latest on the VS Code extension marketplace is 0.8.40. Is there any chance that you have an older version of VS Code than 0.1.70?

Emoter1979 commented 3 months ago

Sorry, I removed VS Code and all plugins, cleaned everything out, then reinstalled it, and now it works.
