logancyang / obsidian-copilot

THE Copilot in Obsidian
GNU Affero General Public License v3.0
2.79k stars 191 forks

Unable to set up local embedding model for QA function #703

Closed stephen-ondyr closed 1 week ago

stephen-ondyr commented 1 week ago

Describe how to reproduce: I am trying to use the QA functionality but am having issues setting up a local embedding model. I am using LM Studio for chat just fine, but I keep getting embedding errors. Here's what I have done:

This is the console error I get in Obsidian:

Uncaught (in promise) Error: No embedding model found for: nomic-embed-text|ollama
    at EmbeddingManager.getEmbeddingsAPI (VM236 plugin:copilot:133693:13)
    at _ChainManager.setChain (VM236 plugin:copilot:138004:54)

I am able to load the note in Chat mode, but Long Note QA is not recognizing the active note. Eventually I want to set up Vault QA, so I will need embeddings working for that.
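The error suggests the plugin's embedding provider is set to Ollama (note the `nomic-embed-text|ollama` identifier), so Ollama itself needs to be running with that model pulled, independent of LM Studio. A minimal setup sketch, assuming Ollama is installed locally and the Copilot embedding model is set to `nomic-embed-text` with the Ollama provider:

```shell
# Download the embedding model into Ollama's local store.
ollama pull nomic-embed-text

# Start the local server if it is not already running
# (the desktop app may already be serving on the default port 11434).
ollama serve &

# Sanity-check the embeddings endpoint the plugin would call;
# a working setup returns a JSON object with an "embedding" array.
curl http://localhost:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "hello world"}'
```

If the `curl` call fails or returns a model-not-found error, the plugin will fail the same way regardless of its settings.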

Expected behavior: Long Note QA recognizes the active note so I can interact with its content.

Screenshots: [three screenshots attached]

Additional context

I've tried a million other things, seemingly every possible combination, including the OpenAI and Cohere embedding models as well as nomic served from LM Studio as an open model, but I must be doing something wrong. Any help appreciated.
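For the LM Studio route specifically, a quick way to rule out the server side is to hit its local endpoint directly. A hedged check, assuming LM Studio's local server is running on its default port 1234 with an embedding model loaded (it exposes an OpenAI-compatible API; the model name below is an example and must match whatever LM Studio shows as loaded):

```shell
# Query LM Studio's OpenAI-compatible embeddings endpoint.
# A working setup returns JSON with a "data[0].embedding" array.
curl http://localhost:1234/v1/embeddings \
  -H "Content-Type: application/json" \
  -d '{"model": "nomic-embed-text-v1.5", "input": "hello world"}'
```

If this returns embeddings but the plugin still errors, the problem is in the plugin's provider configuration (e.g. it is pointed at Ollama rather than the LM Studio endpoint) rather than the model server.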

stephen-ondyr commented 1 week ago

Nevermind it seems to be working now. No idea why!

logancyang commented 1 week ago

Hmm, interesting. Something might still be wrong here; the error seems flaky.

Just FYI, Long Note QA is going away in the next release in favor of Vault QA with explicit [[title]] mention.