Closed · mgorjis closed 1 week ago
Installed the model locally via Ollama and ran `ollama serve` successfully in the command line, but I keep seeing the `Ollama is not running in your IDE` error in VS Code, and chat and other functionality are disabled.
Hi @mgorjis, let us know if you have gotten past this issue after removing the conflicting extension in your VS Code. Thanks.
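As a quick sanity check (a sketch assuming Ollama's default port, 11434, and that VS Code runs on the same machine), you can confirm the server is actually reachable before digging into extension conflicts:

```shell
# Probe the Ollama server on its default port. If the server is up,
# this prints "Ollama is running"; otherwise the fallback message is printed.
curl -s http://localhost:11434 || echo "Ollama is not reachable on port 11434"
```

If the probe succeeds but the extension still reports the error, the problem is on the extension side (for example, a conflicting extension or a non-default host/port configured in its settings) rather than with `ollama serve` itself.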