JudiniLabs / code-gpt-docs

Docusaurus page
https://code-gpt-docs.vercel.app
MIT License

CodeGPT Ollama emits "API error" #217

Closed Sunao-Furukawa closed 1 month ago

Sunao-Furukawa commented 6 months ago

MacBook Air (M1), macOS Sonoma 14.2

CodeGPT 2.2.10, Ollama 0.1.17

- Selected Ollama as the provider (API Key)
- Ollama models: selected mistral
- Selected mistral:instruct in the Autocomplete: Provider option
- Enable CodeGPT Copilot selected

Chat Mode:
USER: hello
CODEGPT | Ollama: "Ollama: API Error"

The error above is emitted. I don't know what to do.

PilarHidalgo commented 1 month ago

I'm sorry to hear that you're experiencing an "API error" with the Ollama model in CodeGPT. Here are a few steps you can try to resolve this issue:

1. Ensure that you have the correct API key for Ollama and that it is properly configured in CodeGPT.
2. Make sure the Ollama model is correctly installed and running locally on your computer. You can pull it by running ollama pull mistral in your terminal (see the quick connectivity check after this list).
3. If the Ollama model does not respond in the chat, restart it locally by turning it off and then on again. This usually resolves the issue.
4. If Ollama is running but still not responding, manually remove OLLAMA_HOST from your environment variables so that it reverts to the default setting.
5. If these steps don't resolve the issue, check the Ollama documentation for more information or report the problem to the CodeGPT team for further assistance.
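Before retrying in CodeGPT, it can help to confirm that the local Ollama server is actually reachable and that the model has been pulled. Below is a minimal Python sketch for that check; it assumes Ollama's default local endpoint (http://localhost:11434) and the mistral model from the report above, so adjust those values if your setup differs.

```python
# Minimal sketch: check that the local Ollama server responds and mistral is available.
# Assumes the default endpoint http://localhost:11434; change OLLAMA_URL if you use OLLAMA_HOST.
import requests

OLLAMA_URL = "http://localhost:11434"

try:
    # /api/tags lists the models that Ollama has pulled locally
    resp = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5)
    resp.raise_for_status()
    models = [m["name"] for m in resp.json().get("models", [])]
    print("Ollama is running. Local models:", models)
    if not any(name.startswith("mistral") for name in models):
        print("mistral not found locally; run: ollama pull mistral")
except requests.exceptions.ConnectionError:
    print("Could not reach Ollama; make sure the Ollama app/server is running.")
```

If this script cannot reach the server or does not list mistral, the "API Error" in CodeGPT is most likely coming from the local Ollama setup rather than from the extension itself.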