Closed: philipp-spiess closed this issue 4 months ago
+1. I'd like to be able to use a local Ollama API.
+1 yes, absolutely, and it seems quite feasible. I'd recommend having a separate model selection setting for code completion and chat, so that, for example, codellama:7b-instruct can be used for chat and codellama:7b-code can be selected separately for autocompletion.
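Something like the settings.json sketch below is what I have in mind. The autocomplete key is an illustrative assumption on my part, not a confirmed Cody setting name; only "cody.experimental.ollamaChat" is the flag quoted later in this thread.

```jsonc
{
  // Chat: experimental Ollama chat backend (flag quoted later in this thread)
  "cody.experimental.ollamaChat": true,

  // Autocomplete: hypothetical separate setting pointing at a different local model
  // (this key name is illustrative, not a confirmed Cody setting)
  "cody.autocomplete.ollamaOptions": {
    "url": "http://localhost:11434",
    "model": "codellama:7b-code"
  }
}
```

With a split like this, chat could use an instruct-tuned model (e.g. codellama:7b-instruct selected in the chat UI) while autocomplete stays on the smaller code-completion model.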
+1 YES!!!!
I think being able to run ALL of Cody's functions from Ollama would be of great benefit, and it would also draw more people to Cody, as I believe more and more people are going to want to run local models.
+1 big ollama user here, I'd love cody + ollama for use on flights / bad internet connections / code security
Following this guide: https://github.com/sourcegraph/cody/blob/main/vscode/doc/ollama.md
I just tried using Cody offline with the Ollama flag on, and it looks like it was still trying to hit Cody's servers?
"cody.experimental.ollamaChat": true
Request Failed: getaddrinfo ENOTFOUND sourcegraph.com
Is the ollama integration meant for Cody to work completely offline?
Currently an internet connection is still required for the authentication step, but please feel free to file a feature request so I can bring it to my team 😀
I think I figured out how to use Ollama! (I needed to create a new chat and set the bot to be Ollama.) Thank you so much for the work you put into this! 🥳 🥳
Thank you for trying it out 🤗
We have a blog post with instructions on how to set up Ollama for chat if anyone else is interested in giving it a try 😃 https://sourcegraph.com/blog/cody-vscode-1-8-0-release
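For anyone skimming, here is a minimal sketch of what the setup boils down to, assuming a local Ollama server on its default port; the flag is the one quoted earlier in this thread, and the model name is just an example:

```jsonc
{
  // Enable the experimental Ollama chat backend (same flag quoted earlier in this thread)
  "cody.experimental.ollamaChat": true
  // Outside of settings.json you also need:
  //   - Ollama running locally (default endpoint: http://localhost:11434)
  //   - a chat-capable model pulled, e.g. codellama:7b-instruct
  //   - a new chat opened with the Ollama model selected, as noted above
}
```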
Creating an issue to track the demand for ollama support for chat models.