continuedev / continue

⏩ Continue is the leading open-source AI code assistant. You can connect any models and any context to build custom autocomplete and chat experiences inside VS Code and JetBrains
https://docs.continue.dev/
Apache License 2.0

Wrong encoding in responses with local ollama #1214

Open anhonestboy opened 5 months ago

anhonestboy commented 5 months ago

Relevant environment info

- OS: macOS 14.3
- Continue: 0.8.25 (also tested the 0.9 pre-release)
- IDE: VS Code 1.88.1

Description

[Screenshot attached, 2024-05-01 at 20:39, showing mis-encoded characters in the model's response]

To reproduce

No response

Log output

No response

sestinj commented 5 months ago

@anhonestboy Is there any extra information you can share about your OS or Ollama setup? I have a very similar system to yours (macOS 14.3, VS Code 1.89.0, Ollama with codellama:7b), but I haven't been able to reproduce the problem.

serhatay commented 5 months ago

I had the same issue with llama3:8b and codellama. After updating Ollama, llama3:8b started working, but codellama still returns nonsense.
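One way to narrow down whether the "nonsense" is an encoding problem rather than bad model output: classic mojibake arises when UTF-8 bytes are decoded as Latin-1. The sketch below is a hypothetical diagnostic (not part of Continue or Ollama) showing how to recognize and reverse that pattern, assuming the garbled text round-trips cleanly:

```python
# Hypothetical check: is the garbled output UTF-8 bytes mis-decoded
# as Latin-1 (mojibake)? If re-encoding as Latin-1 and decoding as
# UTF-8 yields readable text, the bug is in the decoding layer, not
# in the model itself.
original = "è più veloce"  # example of what the model may have produced

# Simulate the suspected bug: UTF-8 bytes decoded with the wrong codec.
mojibake = original.encode("utf-8").decode("latin-1")
print(mojibake)  # prints garbled text like "Ã¨ piÃ¹ veloce"

# Reversing the mis-decode recovers the text, confirming an encoding bug.
repaired = mojibake.encode("latin-1").decode("utf-8")
print(repaired)  # prints "è più veloce"
```

If a repair like this succeeds on the output shown in the screenshot, that would point at the response-decoding path rather than at codellama.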