sourcegraph / cody

Type less, code more: Cody is an AI code assistant that uses advanced search and codebase context to help you write and fix code.
https://cody.dev
Apache License 2.0

bug: cannot configure `unstable-ollama` provider, ignores model and Server Endpoint settings #3074

Closed: lukeab closed this issue 2 months ago

lukeab commented 8 months ago

Version

v1.2.3

Describe the bug

When selecting a self-hosted Ollama instance, there is no way to do two things:

  1. Set the server endpoint for the Ollama instance. In my case I have a desktop machine with a good GPU and run Ollama there; when coding on my laptop I want to use the Ollama instance on my desktop. No matter what value is set for cody.autocomplete.advanced.serverEndpoint, Cody always attempts to use http://localhost:11434, so I cannot specify the IP of my desktop machine hosting Ollama.
  2. Use a different model on Ollama. No matter what value is set for cody.autocomplete.advanced.model, for example when llama-code-13b is selected, the VS Code output tab for Cody always says: █ CodyCompletionProvider:initialized: unstable-ollama/codellama:7b-code (see the sketch after this list).
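
For reference, a minimal sketch of the kind of settings being set here (and ignored); the desktop address and the exact 13b model tag are illustrative placeholders, not values taken verbatim from this report:

  "cody.autocomplete.advanced.serverEndpoint": "http://192.168.1.50:11434",
  "cody.autocomplete.advanced.model": "codellama:13b-code",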

Expected behavior

When a server endpoint is specified, the Ollama configuration should use that value instead of the default http://localhost:11434. When a model is selected, it should use that model in the config and request parameters, instead of always using codellama:7b-code.

Additional context

Screenshot: the output shows unstable-ollama/codellama:7b-code being used despite the 13b model being selected.

lukeab commented 8 months ago

Updated the description to correct the cody.autocomplete.advanced.model setting name.

jdorfman commented 8 months ago

Thanks @lukeab for reporting this. We will follow up ASAP.

lakarpusky commented 8 months ago

I am trying to configure Cody with a local Ollama instance. The doc says to select the unstable-ollama provider, but it does not appear in the list.

VS Code v1.85.2, Cody AI v1.4.4

(Two screenshots attached.)

philipp-spiess commented 8 months ago

@lukeab cody.autocomplete.advanced.serverEndpoint is actually not used when the Ollama provider is configured. Sorry for that (I know it's super confusing and we have to make this easier, sorry!). You want to use the cody.autocomplete.experimental.ollamaOptions option like this:

  "cody.autocomplete.advanced.provider": "experimental-ollama",
  "cody.autocomplete.experimental.ollamaOptions": {
    "url": "http://localhost:11434",
    "model": "deepseek-coder:6.7b-base-q4_K_M"
  },

@lakarpusky Which doc do you refer to? We have renamed the option from unstable-ollama to experimental-ollama, so that doc will need updating I’m afraid.
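
For the remote-desktop setup in the original report, the same option should work by pointing url at the other machine. A minimal sketch, assuming the desktop running Ollama is reachable at 192.168.1.50 (a placeholder address) and already has the desired model pulled:

  "cody.autocomplete.advanced.provider": "experimental-ollama",
  "cody.autocomplete.experimental.ollamaOptions": {
    "url": "http://192.168.1.50:11434",
    "model": "codellama:13b-code"
  },

Note that Ollama binds to localhost by default, so the desktop may also need OLLAMA_HOST set (for example to 0.0.0.0) before it will accept connections from another machine.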

lakarpusky commented 8 months ago

Thanks @philipp-spiess, it worked with that configuration. It was my mistake: I meant the changelog, but it seems I checked the wrong file.

Does Cody allow changing the chat model? I already have the codellama:7b-instruct model downloaded, and being able to use the chat offline would be great.

philipp-spiess commented 8 months ago

@lakarpusky Unfortunately we don't have support for Ollama in Chat yet. :/

mejoe commented 8 months ago

Is there a plan to enable Ollama for chat?

klimchuk commented 8 months ago

What is local Ollama currently used for?

philipp-spiess commented 8 months ago

@mejoe

Is there a plan to enable Ollama for chat?

Yep, but I can't promise a timeline yet. I have created a separate issue to track demand for this though: https://github.com/sourcegraph/cody/issues/3252

@klimchuk

What is local Ollama currently used for?

Currently Ollama support is experimental and only available for the autocomplete feature (so for inline code suggestions). A quick guide on how to enable it can be found here: https://github.com/sourcegraph/cody/blob/main/vscode/doc/ollama.md

ByerRA commented 7 months ago

One of the main reasons I was drawn to Cody was the fact that one could use Ollama with it. It's disappointing to see that local Ollama models are only used for autocomplete and not for any of the other features (and this needs to be documented better).

jdorfman commented 7 months ago

@ByerRA coming soon =)


https://github.com/sourcegraph/cody/pull/3282

janvda commented 7 months ago

@lukeab cody.autocomplete.advanced.serverEndpoint is actually not used when the Ollama provider is configured. Sorry for that (I know it's super confusing and we have to make this easier, sorry!). You want to use the cody.autocomplete.experimental.ollamaOptions option like this:

  "cody.autocomplete.advanced.provider": "experimental-ollama",
  "cody.autocomplete.experimental.ollamaOptions": {
    "url": "http://localhost:11434",
    "model": "deepseek-coder:6.7b-base-q4_K_M"
  },

@lakarpusky Which doc do you refer to? We have renamed the option from unstable-ollama to experimental-ollama, so that doc will need updating I’m afraid.

It is not clear to me how I can add the following to my Visual Studio Code settings file:

"cody.autocomplete.experimental.ollamaOptions": {
  "url": "http://localhost:11434",
  "model": "deepseek-coder:6.7b-base-q4_K_M"
},
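
For illustration, these keys go at the top level of the VS Code settings.json, which can be opened with the Preferences: Open User Settings (JSON) command from the Command Palette. A minimal sketch of a complete file, assuming no other settings are present:

  {
    // Other Cody and VS Code settings can sit alongside these keys.
    "cody.autocomplete.advanced.provider": "experimental-ollama",
    "cody.autocomplete.experimental.ollamaOptions": {
      "url": "http://localhost:11434",
      "model": "deepseek-coder:6.7b-base-q4_K_M"
    }
  }
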
lukeab commented 6 months ago

It's great: the cody.autocomplete.experimental.ollamaOptions url and model settings work now. It would be ideal to have a sense of when experimental options may be merged into advanced, or what the plan is for including the experimental features in the settings dialogs in VS Code. I'm happy this works, and now that PR #3282 answers the need in ticket #3252 things are looking better; I would just love them all incorporated into the settings form fields.

jdorfman commented 6 months ago

@lukeab glad you are enjoying it! We don't have firm plans for next steps yet. Right now we are gathering feedback (like this) and creating content to see what's next.

levicki commented 4 months ago

@jdorfman @philipp-spiess For me these options in VS Code 1.90.0 with Cody 1.20.3 no longer seem to work:

    "cody.autocomplete.advanced.provider": "experimental-ollama",
    "cody.autocomplete.experimental.ollamaOptions": {
        "url": "http://localhost:11434",
        "model": "granite-code:34b"
    },
    "cody.autocomplete.languages": {
        "*": true
    },
    "cody.experimental.ollamaChat": true,

Namely, the cody.autocomplete.experimental.ollamaOptions and cody.experimental.ollamaChat show the Unknown Configuration Setting tooltip.

If I click the Cody: Generate Code option from the lightbulb icon, I get No chat model found in server-provided config and the output window shows:

ConfigFeaturesSingleton: refreshConfigFeatures accessing Sourcegraph GraphQL API: Error: HTTP status code 401: Invalid access token.

Note that I am not logged into a Sourcegraph account, but I am also not getting any autocompletion suggestions.

Another error I am seeing in earlier output is:

LocalEmbeddingsController: load 5f1385928dff462ab5e748800ac37144 {"code":-32099}

Could you please point me to some updated docs that say exactly what is supported in the current version and what the correct configuration keys are now?

hitzhangjie commented 3 months ago

+1

The documentation is outdated.

hyharry commented 2 months ago

Is there any existing solution right now? It does not work when signed out, on VS Code 1.92.0 and Cody 1.28.1.

levicki commented 2 months ago

Is there any existing solution right now? It does not work when signed out, on VS Code 1.92.0 and Cody 1.28.1.

Yes, the solution is not to use Cody.

Switch to https://www.continue.dev/ -- it's much better and fully open-source anyway.

hyharry commented 2 months ago

Is there any existing solution right now? It does not work when signed out, on VS Code 1.92.0 and Cody 1.28.1.

Yes, the solution is not to use Cody.

Switch to https://www.continue.dev/ -- it's much better and fully open-source anyway.

Thanks for mentioning it! I will give it a try.

hyharry commented 2 months ago

Is there any existing solution right now? It does not work when signed out, on VS Code 1.92.0 and Cody 1.28.1.

It seems one has to log in to have autocomplete working, according to the link.

When signed in with the right settings, the Sourcegraph output gives:

CompletionLogger:onComplete: {"type":"code-completion","endpoint":"xxxx","status":"success","duration":188}

which confirms my Ollama is working. But having to sign in while using a local service is frustrating.

icemagno commented 1 week ago

Is there any existing solution right now? It does not work when signed out, on VS Code 1.92.0 and Cody 1.28.1.

Yes, the solution is not to use Cody.

Switch to https://www.continue.dev/ -- it's much better and fully open-source anyway.

DO NOT DO THIS! Continue is very poor when interacting with your code. It only suggests code in the chat tab and does not update your code automatically as Cursor does. You have to copy/paste code from the chat tab, which is risky because it can lead you to make mistakes.

Cody is the closest you can get to the Cursor IDE. Great code interaction.