Closed: lstep closed this issue 1 month ago
I have pretty much the same error, on Arch Linux.
It's because the embedding happens using a different type of model. Try turning on the legacy transformers option in the settings 🌴
I already had it checked. But I force-refreshed everything (and re-imported, using the button just above), and everything went fine! Thanks
Hello, I'm using Smart Connections with a remote OLLAMA instance. I configured it as follows:
Using Custom Local (OpenAI format)
It works fine for general questions, but as soon as I ask a question that requires access to my notes, I get a popup showing an error.
I don't understand why I get this message, especially since I've configured it to query a remote API, so why would it ask for WebGPU?
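For what it's worth, here is a minimal sketch for checking the two endpoints involved, independent of the plugin. It assumes Ollama is reachable on its default port 11434; the host and model names (`my-ollama-server`, `llama3`, `nomic-embed-text`) are placeholders, not necessarily what Smart Connections is configured with:

```python
import requests

# Placeholder host; replace with your remote instance.
OLLAMA_HOST = "http://my-ollama-server:11434"

# Chat via Ollama's OpenAI-compatible endpoint -- this is the kind of
# request "Custom Local (OpenAI format)" makes, and the part that works.
chat = requests.post(
    f"{OLLAMA_HOST}/v1/chat/completions",
    json={
        "model": "llama3",  # placeholder chat model
        "messages": [{"role": "user", "content": "Hello"}],
    },
)
print(chat.json()["choices"][0]["message"]["content"])

# Embeddings are a separate request against an embedding-capable model.
emb = requests.post(
    f"{OLLAMA_HOST}/api/embeddings",
    json={"model": "nomic-embed-text", "prompt": "test sentence"},
)
print(len(emb.json()["embedding"]), "dimensions")
```

Chat completions and note embeddings go through different models, so the chat endpoint working says nothing about where embeddings are computed. If the plugin's embedding model is still set to a local one, it may try to run it in-app via WebGPU, which would be consistent with the earlier suggestion to enable the legacy transformers option.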