Open tietscho opened 1 week ago
No one has replied for 4 days; have you had any luck getting it to work? I tried for a bit today to get it working, but I keep getting this error no matter what I try.
I double-checked my local model, Protocol, Hostname, Port, and Path, but it just won't connect. Not sure what else to try.
```
plugin:smart-connections:2476 Error: net::ERR_CONNECTION_REFUSED
    at SimpleURLLoaderWrapper.<anonymous> (node:electron/js2c/browser_init:2:108522)
    at SimpleURLLoaderWrapper.emit (node:events:517:28)
handle_error @ plugin:smart-connections:2476
complete @ plugin:smart-connections:2368
await in complete (async)
new_user_message @ plugin:smart-connections:4689
await in new_user_message (async)
new_user_message @ plugin:smart-connections:12969
handle_send @ plugin:smart-connections:12827
eval @ plugin:smart-connections:2849
```
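For anyone debugging this: `ERR_CONNECTION_REFUSED` means nothing answered at the configured host and port at all, so before touching the plugin settings it's worth confirming the server is actually listening from a terminal. A minimal sanity check, assuming Ollama's default address of `localhost:11434` (adjust to whatever you configured):

```
# Check that something is listening at the configured address.
# Ollama replies with the plain text "Ollama is running" when it's up.
curl http://localhost:11434/
```

If curl also reports "connection refused" here, the problem is on the server side (not started, wrong port, or bound to a different interface), not in the plugin.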
What configuration are you using? Can you share a screenshot? 🌴
I'm trying to run it with a local model, so I chose Custom Local.
Oh man, Ollama's API is different from the other local LLM clients I know. Try one that exposes an OpenAI-compatible endpoint.
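For what it's worth, recent Ollama releases also expose an OpenAI-compatible API under `/v1`, so you may not need a different client. A sketch of a request against it, assuming the server is on the default port ("llama3" is a placeholder; use a model you've pulled):

```
# OpenAI-compatible chat endpoint served by Ollama itself
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```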
Did you set up 'ollama serve'? Also, the path should be api/chat (at least that solved my issue). A quick way to verify both, sketched below.
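Concretely, this is what worked for me, assuming a default install (the "llama3" model name is just an example; run `ollama list` to see what you have):

```
# Start the server; by default it listens on localhost:11434
ollama serve

# In another terminal, hit Ollama's native chat endpoint --
# the same host/port/path the plugin needs in its settings
curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [{"role": "user", "content": "Hello"}],
  "stream": false
}'
```

If the curl call returns a JSON response, the plugin settings should be protocol `http`, hostname `localhost`, port `11434`, path `api/chat`.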
How am I supposed to use Ollama with this?