brianpetro / obsidian-smart-connections

Chat with your notes & see links to related content with AI embeddings. Use local models or 100+ via APIs like Claude, Gemini, ChatGPT & Llama 3
https://smartconnections.app
GNU General Public License v3.0

Always getting the context_length_exceeded error. Is transcript parsing possible? #686

Open avons817 opened 3 months ago

avons817 commented 3 months ago

Hi,

I'm a new Obsidian user and was excited to use your plugin. However, I've gotten the error below in my console log every time. I'm using OpenAI as the LLM. I only have ~10 notes total in Obsidian; 2 notes are hour-long transcripts.

Would your plugin ever be able to parse the data from multiple transcripts? Example prompt: "Summarize Sally's questions from meetings a, b, and c."

    code: "context_length_exceeded"
    message: "This model's maximum context length is 16385 tokens. However, your messages resulted in 34455 tokens (34235 in the messages, 220 in the functions). Please reduce the length of the messages or functions."
    param: "messages"
    type: "invalid_request_error"
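For reference, the numbers in that error can be reproduced locally. The sketch below counts tokens the way OpenAI's models do, assuming the `tiktoken` npm package is available; it is not the plugin's own code.

```typescript
// Minimal sketch (not the plugin's code): counting tokens with the `tiktoken`
// npm package to see why two hour-long transcripts exceed gpt-3.5-turbo's
// 16,385-token context window.
import { encoding_for_model } from "tiktoken";

function countTokens(text: string): number {
  const enc = encoding_for_model("gpt-3.5-turbo");
  const count = enc.encode(text).length;
  enc.free(); // the WASM encoder must be freed explicitly
  return count;
}

// Two hour-long transcripts at roughly 17k tokens each add up to the ~34k
// tokens reported in the error above, far past the 16,385-token limit.
```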

brianpetro commented 3 months ago

Hi @avons817

If you could screenshot the errors next time, it would give me better context for how to help.

In theory, what you want to do should work. In practice, it's a little more complicated than that.

First, you'll want to make sure you're parsing blocks: "block-level" embeddings in the settings should not be set to "None." Use "BGE-micro-v2" or an OpenAI model to start.
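To illustrate the idea (a rough sketch, not Smart Connections' actual implementation): with block-level embeddings, each note is split into blocks, each block gets its own vector, and the chat retrieves only the few highest-scoring blocks instead of feeding whole transcripts to the model. The `embed` parameter below stands in for whichever embedding model is configured (BGE-micro-v2 locally, or an OpenAI embedding model).

```typescript
// Illustrative sketch only of what block-level embeddings buy you.

interface Block {
  noteId: string;
  text: string;
  vector: number[];
}

// Naive split on markdown headings; the real plugin's block parsing is more involved.
function splitIntoBlocks(noteId: string, markdown: string): { noteId: string; text: string }[] {
  return markdown
    .split(/\n(?=#{1,6}\s)/)
    .map((text) => ({ noteId, text: text.trim() }))
    .filter((block) => block.text.length > 0);
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank blocks against the chat prompt and keep only the k best, so the request
// stays well under the model's context window.
async function topBlocksForPrompt(
  prompt: string,
  blocks: Block[],
  embed: (text: string) => Promise<number[]>,
  k = 10
): Promise<Block[]> {
  const queryVector = await embed(prompt);
  return [...blocks]
    .sort((a, b) => cosine(queryVector, b.vector) - cosine(queryVector, a.vector))
    .slice(0, k);
}
```

That retrieval step is what would keep a prompt like "Summarize Sally's questions from meetings a, b, and c" inside the model's context window.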

You can check whether this part is working properly by observing the Smart View with a note open. The Smart View should display dozens of excerpts from your other notes, which will tell you the blocks are working correctly.

If you get the chat error again, screenshot it and I'll look into it further. It would be extra helpful if you could turn on "debug at startup time" in Obsidian's "community plugins" settings tab, and also include the exact chat prompt you are using.

🌴