dhruv-shipsy opened this issue 2 months ago
Hi @dhruv-shipsy, thank you so much.
In the settings section, if Ollama is running on a different port, you can point the app at that port.
Alternatively, if you want to use a non-local LLM, you can change the strategy to Gemini and enter the corresponding API key.
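For reference, here is a minimal sketch of what the custom-port setup amounts to. The port value (11500) and model name ("llama3") are assumptions for illustration; substitute whatever your Ollama instance actually uses. The `/v1/chat/completions` path is Ollama's OpenAI-compatible endpoint, the same one in the error message above:

```python
import json
import urllib.request

# Ollama defaults to port 11434; change OLLAMA_PORT if your instance
# listens elsewhere (11500 here is just an example value).
OLLAMA_PORT = 11500
BASE_URL = f"http://localhost:{OLLAMA_PORT}/v1/chat/completions"

# Minimal request against Ollama's OpenAI-compatible chat endpoint.
# "llama3" is a placeholder; use a model you have pulled locally.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Hello"}],
}

req = urllib.request.Request(
    BASE_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["choices"][0]["message"]["content"])
```

Whatever URL works in a standalone check like this is the one to enter in the settings section.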
```
Failed to send request to Ollama API: error sending request for url (http://localhost:11434/v1/chat/completions): error trying to connect: tcp connect error: Connection refused (os error 61)
```
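That error means nothing is accepting connections on the configured port, so the request never reaches Ollama at all. Before changing any settings, it is worth confirming the server is actually up (e.g. by running `ollama serve`). A quick check like the sketch below, which assumes the default port 11434, hits the same "connection refused" condition when Ollama is not running:

```python
import socket

# Try a plain TCP connect to the address the app is using.
# 11434 is Ollama's default port; adjust if you changed it in settings.
HOST, PORT = "localhost", 11434

try:
    with socket.create_connection((HOST, PORT), timeout=2):
        print(f"Ollama is listening on {HOST}:{PORT}")
except ConnectionRefusedError:
    # Same condition as "os error 61": no server on that port.
    print(f"Connection refused on {HOST}:{PORT}; start Ollama with `ollama serve`")
```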