brianpetro / obsidian-smart-connections

Chat with your notes & see links to related content with AI embeddings. Use local models or 100+ via APIs like Claude, Gemini, ChatGPT & Llama 3
https://smartconnections.app
GNU General Public License v3.0

Ollama not Working #873

Open tietscho opened 1 week ago

tietscho commented 1 week ago

How am I supposed to use Ollama with this?

misagarcia commented 5 days ago

No one has replied for 4 days; have you had any luck getting it to work? I tried for a while today, but I just keep getting this error no matter what I try.

I've confirmed my local model, protocol, hostname, port, and path, but it just won't connect. Not sure what else to try.

```
plugin:smart-connections:2476 Error: net::ERR_CONNECTION_REFUSED
    at SimpleURLLoaderWrapper.<anonymous> (node:electron/js2c/browser_init:2:108522)
    at SimpleURLLoaderWrapper.emit (node:events:517:28)
handle_error @ plugin:smart-connections:2476
complete @ plugin:smart-connections:2368
await in complete (async)
new_user_message @ plugin:smart-connections:4689
await in new_user_message (async)
new_user_message @ plugin:smart-connections:12969
handle_send @ plugin:smart-connections:12827
eval @ plugin:smart-connections:2849
```
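For reference, net::ERR_CONNECTION_REFUSED means nothing accepted the connection at the configured protocol/hostname/port at all; a wrong path setting would instead come back as an HTTP error such as 404. A minimal sketch to tell the two apart outside the plugin, assuming Ollama's default address of http://localhost:11434 (adjust if yours differs):

```ts
// Hypothetical connectivity probe against Ollama's /api/tags endpoint,
// which lists locally pulled models. Assumes the default address
// http://localhost:11434; change `base` if Ollama runs elsewhere.
async function checkOllama(base = "http://localhost:11434"): Promise<void> {
  try {
    const res = await fetch(`${base}/api/tags`);
    // Any HTTP status (even 404) means the server is reachable,
    // so the problem would be the path, not the connection.
    console.log(res.ok ? await res.json() : `HTTP ${res.status}`);
  } catch (err) {
    // A thrown error here matches ERR_CONNECTION_REFUSED: nothing is
    // listening at that address, so check `ollama serve` and the port.
    console.error("Ollama unreachable:", err);
  }
}

checkOllama();
```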

brianpetro commented 5 days ago

What configuration are you using? Can you share a screenshot? 🌴

misagarcia commented 5 days ago

I'm trying to run it with a local model, so I chose Custom Local.

[Two screenshots attached, dated 2024-11-16, showing the Custom Local configuration]

3-ark commented 5 days ago

Oh man, Ollama is different from any other local LLM client I know. Try the OpenAI-compatible endpoint option.
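Some context on that suggestion: Ollama also exposes an OpenAI-compatible API at /v1/chat/completions, so settings aimed at an "OpenAI compatible" endpoint can point at it. A rough sketch of such a request, assuming the default port 11434 and a hypothetical pulled model named "llama3" (substitute whatever `ollama list` shows):

```ts
// Sketch of an OpenAI-compatible chat request to a local Ollama server.
// The model name "llama3" is an assumption; use a model you have pulled.
const res = await fetch("http://localhost:11434/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3",
    messages: [{ role: "user", content: "Hello" }],
  }),
});
const data = await res.json();
console.log(data.choices[0].message.content);
```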

Pacura commented 4 days ago

Did you set up 'ollama serve'? Also, the path should be api/chat (at least that solved my issue). A sketch of the request that path corresponds to follows below.
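If the plugin's path setting is api/chat, the request it builds should line up with Ollama's native chat endpoint. A sketch of that call, again assuming `ollama serve` is running on the default port 11434 and a hypothetical model named "llama3":

```ts
// Sketch of Ollama's native /api/chat endpoint, the one an `api/chat`
// path setting would point at. Assumes the default port and a pulled
// model; `stream: false` asks for one JSON object instead of a
// streamed response.
const res = await fetch("http://localhost:11434/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3",
    messages: [{ role: "user", content: "Hello" }],
    stream: false,
  }),
});
const data = await res.json();
console.log(data.message.content);
```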