logancyang / obsidian-copilot

THE Copilot in Obsidian
https://www.obsidiancopilot.com/
GNU Affero General Public License v3.0

[EPIC FR] Using canvas to create a flowchart for automated questioning #271

Closed: wwjCMP closed this issue 1 month ago

wwjCMP commented 10 months ago

Obsidian already has some plugins that use the canvas to build flowcharts for (semi-)automated execution. Examples include:

For instance, consider using a canvas to build a simple flowchart. Due to token limits, it's not possible to send a large number of files to the LLM all at once. Here's a proposed workflow (see the sketch after the list):

  1. Have the large model read a portion of the information and provide a preliminary answer to the question.
  2. Read in another batch of information together with the answer from the previous step, and combine them into a more comprehensive answer.
  3. Repeat the process until all of the provided file contents have been read.
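
To make this concrete, here is a minimal TypeScript sketch of the incremental-refine loop these steps describe. It is a sketch of the idea only, not anything that exists in the plugin: `callLLM` is a hypothetical helper standing in for whatever chat-model call is configured, and `fileChunks` assumes the files have already been split to fit the context window.

```ts
// Minimal sketch of the proposed incremental-refine workflow.
// `callLLM` is a hypothetical helper that sends a prompt to the model.
async function answerAcrossFiles(
  question: string,
  fileChunks: string[], // file contents pre-split to fit the context window
  callLLM: (prompt: string) => Promise<string>
): Promise<string> {
  let answer = "";
  for (const chunk of fileChunks) {
    const prompt = answer
      ? `Question: ${question}\n\nPrevious answer:\n${answer}\n\n` +
        `New information:\n${chunk}\n\n` +
        `Refine the previous answer using the new information.`
      : `Question: ${question}\n\nInformation:\n${chunk}\n\n` +
        `Give a preliminary answer.`;
    // Each iteration folds one more chunk into the running answer.
    answer = await callLLM(prompt);
  }
  return answer;
}
```

A canvas-driven version would essentially map each flowchart node to one iteration of this loop.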

While such questioning can be done manually, having a modularized canvas workflow could save a significant amount of effort. This is just a simple example, and with effective modules, users can unleash their creativity to create customized questioning workflows. I believe this could make large models a valuable assistant for managing a large volume of notes.

logancyang commented 10 months ago

@wwjCMP Thanks for providing those plugins; I'll look into them. If I understand correctly, you want to use Canvas as a no-code tool to build an arbitrary "langchain-like" chain? What you described looks like a summary chain that produces an incremental summary at each step and answers from it.

I think this is super interesting. It looks like an epic feature for power users, though it may have a learning curve. Adding it to the roadmap for further discussion.

Coming back to the exact use case you described, I think it can be achieved with a "conversation summary memory" and some manual back-and-forth chatting. I've been wanting to add this "summary memory" for a while; it summarizes the previous chat messages to compress the context at each turn. This way you can manually send subsets of files at a time and achieve what you described in Chat mode. It is a lossy compression though, so some important points may get lost in the process. I'm thinking of adding it as a toggle in Copilot settings, something like *Enable summarization of previous messages to avoid reaching the context window limit*.
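
For illustration, a rough TypeScript sketch of what such a summary memory could look like. This is an assumption about the design, not the plugin's actual implementation, and `callLLM` is again a hypothetical model-call helper.

```ts
// Rough sketch of a conversation summary memory: keep a running,
// LLM-maintained summary of the chat instead of the full transcript.
class ConversationSummaryMemory {
  private summary = "";

  constructor(private callLLM: (prompt: string) => Promise<string>) {}

  // Fold the latest exchange into the running summary (lossy compression).
  async update(userMessage: string, aiMessage: string): Promise<void> {
    this.summary = await this.callLLM(
      `Current summary of the conversation:\n${this.summary || "(empty)"}\n\n` +
      `New exchange:\nUser: ${userMessage}\nAI: ${aiMessage}\n\n` +
      `Update the summary to include the new exchange. Be concise.`
    );
  }

  // Prepend the compressed history to the next request, keeping it
  // well under the model's context window limit.
  buildPrompt(nextUserMessage: string): string {
    return `Conversation so far (summarized):\n${this.summary}\n\n` +
      `User: ${nextUserMessage}`;
  }
}
```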

wwjCMP commented 10 months ago

Yes, it's like LangChain, or even more like the new LangGraph.

logancyang commented 9 months ago

Linking a relevant discussion: https://github.com/logancyang/obsidian-copilot/discussions/300

wwjCMP commented 8 months ago

Some related implementations:

- https://github.com/joaomdmoura/crewAI
- https://www.bilibili.com/video/BV1pu4y1k72Z/?spm_id_from=333.337.search-card.all.click&vd_source=4da716c2c9e733b14045d5c5b12eb8c1

wwjCMP commented 7 months ago

https://github.com/langflow-ai/langflow