sourcegraph / cody

Type less, code more: Cody is an AI code assistant that uses advanced search and codebase context to help you write and fix code.
https://cody.dev

Ollama: Experimental support for chat #3252

Closed · philipp-spiess closed this issue 4 months ago

philipp-spiess commented 5 months ago

Creating an issue to track demand for Ollama support for chat models.
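
For context, a chat integration like this would talk to the Ollama server's local REST API (http://localhost:11434 by default). A minimal request body for Ollama's /api/chat endpoint is sketched below; the model name is only an example and must already be pulled locally:

```json
{
  "model": "codellama:7b-instruct",
  "messages": [
    { "role": "user", "content": "Explain what this function does." }
  ],
  "stream": false
}
```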

cam-s-hunt commented 5 months ago

+1. I'd like to be able to use a local Ollama API.

lukeab commented 5 months ago

+1, yes, absolutely, and it's quite feasible. I recommend having a separate model selection setting for chat versus code completion, so that, for example, codellama:7b-instruct can be used for chat while codellama:7b-code is selected separately for autocompletion.
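
A sketch of what that split could look like in settings.json. The cody.experimental.ollamaChat flag appears later in this thread; the autocomplete keys follow the Ollama guide linked further down and may differ between Cody versions, so treat the exact names as assumptions:

```json
{
  // Experimental flag discussed later in this thread: route chat through Ollama.
  "cody.experimental.ollamaChat": true,

  // Autocomplete via a local Ollama model; key names follow the linked guide
  // and may have changed in newer Cody releases (assumption).
  "cody.autocomplete.advanced.provider": "experimental-ollama",
  "cody.autocomplete.experimental.ollamaOptions": {
    "url": "http://localhost:11434",
    "model": "codellama:7b-code"
  }
}
```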

ByerRA commented 5 months ago

+1 YES!!!!

I think having all of Cody's functions able to run against Ollama would be a great benefit, and it would also draw more people to Cody, as I believe more and more people are going to want to run local models.

mrdjohnson commented 4 months ago

+1, big Ollama user here. I'd love Cody + Ollama for flights, bad internet connections, and code security.

PieBru commented 4 months ago

Following this guide: https://github.com/sourcegraph/cody/blob/main/vscode/doc/ollama.md

mrdjohnson commented 4 months ago

I just tried using Cody offline with the Ollama flag on, and it looks like it was still trying to hit Cody's servers?

```json
"cody.experimental.ollamaChat": true
```

Request Failed: `getaddrinfo ENOTFOUND sourcegraph.com`

Is the Ollama integration meant to let Cody work completely offline?

abeatrix commented 4 months ago

> I just tried using Cody offline with the Ollama flag on, and it looks like it was still trying to hit Cody's servers?
>
> `"cody.experimental.ollamaChat": true`
>
> Request Failed: `getaddrinfo ENOTFOUND sourcegraph.com`
>
> Is the Ollama integration meant to let Cody work completely offline?

Currently an internet connection is still required for the authentication step, but please feel free to file a feature request so I can bring it to my team 😀

mrdjohnson commented 4 months ago

I think I figured out how to use Ollama! (I needed to create a new chat and set the bot to Ollama.) Thank you so much for the work you put into this! 🥳 🥳

abeatrix commented 4 months ago

Thank you for trying it out 🤗

We have a blog post with instructions on how to set up Ollama for chat if anyone else is interested in giving it a try 😃 https://sourcegraph.com/blog/cody-vscode-1-8-0-release