GhosTHaise closed this issue 2 months ago
Hi @GhosTHaise, please follow these instructions to connect Ollama properly: https://docs.codegpt.co/docs/tutorial-ai-providers/ollama
Hello! Please ensure that Ollama is running locally and that you have downloaded the model you are using.
@PilarHidalgo, @davila7, @gustavoespindola, @psmyrdek: my Ollama is running in the background and the model is already downloaded.
Is there a specific host (an IP address such as 127.0.0.1 for localhost) or a specific port that I should configure for Ollama so that CodeGPT can reach it?
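For reference (this is the stock Ollama default, not something configured by CodeGPT): Ollama serves its API on http://127.0.0.1:11434 unless an OLLAMA_HOST override changes it. A quick probe from a terminal:

```shell
# Probe Ollama's default endpoint. 127.0.0.1:11434 is the built-in
# default; setting OLLAMA_HOST changes it. A running server answers
# "Ollama is running"; otherwise the fallback message is printed.
curl -s http://127.0.0.1:11434 || echo "Ollama is not reachable on the default port"
```

If the probe fails, CodeGPT will fail the same way, so this separates an Ollama problem from a CodeGPT one.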
Finally, I have found a solution for my issue. We need to remove the manually set OLLAMA_HOST from the system environment variables and let Ollama use its default 👍.
Hello everyone, I'm using CodeGPT v3.2.5 with Ollama on Windows. When I try to send a prompt, it says that Ollama is not running, even though Ollama is running in the background, as in the picture below:
Is there a way to fix this?