sourcegraph / cody

Type less, code more: Cody is an AI code assistant that uses advanced search and codebase context to help you write and fix code.
https://cody.dev
Apache License 2.0

bug: mainThreadExtensionService.ts:81 [sourcegraph.cody-ai]Unexpected token { in JSON at position 137 #4135

Closed: riccardo-unipg closed this 2 months ago

riccardo-unipg commented 4 months ago

Version

1.89.1

Describe the bug

Hi, I'm using Cody with Ollama over SSH on a remote machine. Locally everything works, and remotely everything seemed to work as well: I can see and choose my Ollama models, chat with them, and use autocomplete. Sometimes the chats work, but other times (for no apparent reason) response generation is interrupted and this error appears in the developer tools console: mainThreadExtensionService.ts:81 [sourcegraph.cody-ai]Unexpected token { in JSON at position 137

My remote settings are these:

    {
      "cody.autocomplete.enabled": true,
      "cody.autocomplete.languages": {
        "cody.autocomplete.advanced.provider": "experimental-ollama",
        "cody.autocomplete.experimental.ollamaOptions": {
          "url": "http://xxx.xxx.xxx.xxx:11434",
          "model": "deepseek-coder"
        }
      }
    }
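As an aside (an observation, not something confirmed in this thread): in the settings above, the provider and Ollama options are nested inside `"cody.autocomplete.languages"`, whereas these keys are normally top-level entries in settings.json. A more conventional layout might look like this:

    {
      "cody.autocomplete.enabled": true,
      "cody.autocomplete.advanced.provider": "experimental-ollama",
      "cody.autocomplete.experimental.ollamaOptions": {
        "url": "http://xxx.xxx.xxx.xxx:11434",
        "model": "deepseek-coder"
      }
    }

This may or may not be related to the reported error, which the maintainer response below attributes to stream parsing.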

Expected behavior

I don't know what to do: sometimes it responds correctly and other times it doesn't, and I can't work out what the problem is.

Additional context

No response

dominiccooney commented 4 months ago

Thank you for this feedback. We need lib/shared/src/llm-providers/ollama/completions-client.ts to correctly handle JSON output that spans buffer boundaries.