Closed wwjCMP closed 1 month ago
@wwjCMP Thanks for providing those plugins, I'll look into them. If I understand correctly, you want to use Canvas as a no-code tool to build an arbitrary "langchain-like" chain? What you described looks like a Summary chain that does incremental summaries at each step and provides answers.
I think this is super interesting, looks like an epic feature for power users, and may have some learning curve. Adding it to the roadmap for further discussions.
Coming back to the exact use case you described, I think it can be achieved with a "conversation summary memory" and some manual back-and-forth chatting. I've been wanting to add this "summary memory" for a while; it basically summarizes the previous chat messages to compress the info at each turn. This way you can manually send subsets of files at a time and achieve what you described in Chat mode. It is a lossy compression though, so some important points may get lost in the process. I'm thinking of adding it as a toggle in the Copilot settings, something like "Enable summarization of previous messages to avoid reaching the context window limit".
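To make the idea concrete, here is a minimal sketch of how a "conversation summary memory" could work. This is not the plugin's actual implementation; the `summarize` helper is a stand-in for a real LLM summarization call (here it just truncates, to keep the sketch runnable):

```python
def summarize(text: str, max_chars: int = 200) -> str:
    """Placeholder for an LLM summarization call; here we just truncate."""
    return text if len(text) <= max_chars else text[:max_chars] + "..."


class SummaryMemory:
    """Keeps a rolling summary of the conversation instead of the full history."""

    def __init__(self) -> None:
        self.summary = ""

    def add_turn(self, user_msg: str, assistant_msg: str) -> None:
        # Fold the new turn into the running summary (lossy compression).
        combined = f"{self.summary}\nUser: {user_msg}\nAssistant: {assistant_msg}"
        self.summary = summarize(combined.strip())

    def build_prompt(self, new_msg: str) -> str:
        # Only the compressed summary plus the new message is sent to the
        # model, so the prompt stays within the context window.
        return f"Summary of conversation so far:\n{self.summary}\n\nUser: {new_msg}"
```

Because each turn re-summarizes the running summary, the prompt size stays roughly constant no matter how long the chat gets, at the cost of losing detail from older turns.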
Yes, it's like LangChain, or even more like the new LangGraph.
Linking a relevant discussion https://github.com/logancyang/obsidian-copilot/discussions/300
Obsidian already has some plugins that use canvas for flowchart creation to achieve (semi-)automated execution. Examples include:
For instance, use canvas to create a simple flowchart. Due to token limitations, it's not possible to send a large number of files to the LLM all at once. Here's a proposed workflow:
While such questioning can be done manually, a modularized canvas workflow could save a significant amount of effort. This is just a simple example; with effective modules, users can unleash their creativity to create customized questioning workflows. I believe this could make large language models a valuable assistant for managing a large volume of notes.
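The batching idea above can be sketched as an incremental summarize-then-answer loop. This is only an illustration, not anything the plugin implements: the hypothetical `llm` function stands in for a real model call, and here it just echoes the tail of the prompt so the sketch runs without an API key:

```python
def llm(prompt: str) -> str:
    """Placeholder for a real LLM call; returns the tail of the prompt."""
    return prompt[-300:]


def batched(items, size):
    """Yield successive fixed-size slices of a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


def summarize_vault(files: dict[str, str], batch_size: int = 3) -> str:
    """Process notes in small batches so each call fits the context limit,
    folding every batch's summary into a running digest."""
    digest = ""
    for batch in batched(list(files.items()), batch_size):
        notes = "\n\n".join(f"# {name}\n{body}" for name, body in batch)
        digest = llm(
            f"Existing digest:\n{digest}\n\nNew notes:\n{notes}\n\n"
            "Update the digest to cover the new notes."
        )
    return digest


def answer(files: dict[str, str], question: str) -> str:
    """Answer a question against the compressed digest of all notes."""
    digest = summarize_vault(files)
    return llm(f"Context:\n{digest}\n\nQuestion: {question}")
```

A canvas node graph could map onto this directly: each "summarize batch" node feeds the next, and a final "question" node consumes the accumulated digest.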