brianpetro / obsidian-smart-connections

Chat with your notes & see links to related content with AI embeddings. Use local models or 100+ via APIs like Claude, Gemini, ChatGPT & Llama 3
https://smartconnections.app
GNU General Public License v3.0
2.83k stars 185 forks

Requirement for a WebGPU although I'm using a remote Ollama configuration (`Using Custom Local (OpenAI format)`) #832

Closed lstep closed 1 month ago

lstep commented 1 month ago

Hello, I'm using Smart Connections with a remote Ollama instance. I configured it as follows:

Using Custom Local (OpenAI format)
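
For reference, a remote Ollama instance exposes an OpenAI-compatible API under `/v1`, so a "Custom Local (OpenAI format)" setup might look like the sketch below. The field names here are illustrative, not the plugin's exact settings schema, and the hostname is a placeholder:

```json
{
  "chat_model": {
    "adapter": "custom_local_openai_format",
    "hostname": "remote-ollama.example.lan",
    "port": 11434,
    "path": "/v1/chat/completions",
    "model_name": "llama3"
  }
}
```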

It works fine for general questions: [screenshot]

But as soon as I ask a question that requires access to the notes, I get a popup showing an error:

According to my notes, what is Pixel's birthday?

[screenshot of the error popup]

I don't understand why I get this message, especially because I've configured a remote API, so why would it ask for a WebGPU?

OneEyedOwl11 commented 1 month ago

I have pretty much the same error, on Arch Linux.

brianpetro commented 1 month ago

It's because the embedding happens using a different type of model. Try turning on the legacy transformers option in the settings 🌴
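
To illustrate why configuring a remote chat endpoint alone isn't enough: the "related notes" feature works by comparing embedding vectors, which are produced by a separate embedding model (which the plugin apparently runs locally, hence the WebGPU requirement). A toy sketch of the retrieval step, with made-up vectors and hypothetical note names:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical embedding vectors the embedding model would produce
# for the user's query and for each note in the vault.
query = [0.9, 0.1, 0.0]
notes = {
    "pixel-birthday.md": [0.8, 0.2, 0.1],
    "groceries.md": [0.0, 0.1, 0.9],
}

# Rank notes by similarity to the query; the top matches are what gets
# passed as context to the (remote) chat model.
ranked = sorted(notes, key=lambda n: cosine_similarity(query, notes[n]),
                reverse=True)
print(ranked[0])  # the best-matching note
```

So the chat model and the embedding model are configured independently: pointing chat at a remote Ollama does not change where embeddings are computed.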

lstep commented 1 month ago

> It's because the embedding happens using a different type of model. Try turning on the legacy transformers option in the settings 🌴

I already had it checked. But I force-refreshed everything (and re-imported, using the button just above), and everything worked fine! Thanks