justyns / silverbullet-ai

Plug for SilverBullet to integrate LLM functionality
https://ai.silverbullet.md/
GNU Affero General Public License v3.0

Queried content is not included #27

Closed: stefanku closed this issue 3 months ago

stefanku commented 5 months ago

This isn't a real issue, I guess; it's more of a question/suggestion. The AI plug doesn't appear to include live queried content. I use these queries to collect tasks, list projects, and gather other objects, sometimes also as widgets. Would it be possible to include this queried content in the prompts sent to the AI provider, without having to bake the queries?
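For illustration, a typical live query of the kind I mean just collects open tasks (a simplified sketch, not my exact query):

```query
task where done = false
```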

justyns commented 5 months ago

hey @stefanku, do you have an example workflow of what you want to do, by any chance? And are you using the 'chat on page' feature, or one of the other commands to trigger the request?

Templated Prompts should automatically render/bake queries, but I don't think the chat option does right now. It shouldn't be hard to add, but I'm wondering how best to handle the case where someone doesn't want their queries rendered. The only use case I can think of where that matters is asking the LLM for help modifying an existing SB query/template, where you'd want it to see the raw version. But maybe a simple command that toggles query baking on/off would be good enough.

justyns commented 3 months ago

This is available and enabled by default in the main branch now. @stefanku, can you give it a try and see if it's what you were hoping for? It won't bake the queries on the page itself, but it will render them in the message sent to the LLM.

It can also be toggled off by setting `bakeMessages` to `false`.
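In a SETTINGS page that looks something like this (a sketch; check the docs for the exact nesting under the `ai` section):

```yaml
ai:
  chat:
    # disable rendering/baking of queries in chat messages sent to the LLM
    bakeMessages: false
```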

stefanku commented 3 months ago

Great, thank you!

meain commented 3 months ago

@justyns Not sure if this is intentional or a bug, but `AI: Call OpenAI with Note as context` does not seem to bake the templates before sending them out.

Since I'm here, I'm also posting my use case: I use it to get the tasks I did last week and summarize them for standup-like things.

justyns commented 3 months ago

@meain

> Not sure if this is intentional or a bug, but `AI: Call OpenAI with Note as context` does not seem to bake the templates before sending them out.
>
> Since I'm here, I'm also posting my use case: I use it to get the tasks I did last week and summarize them for standup-like things.

The reason is more that I was considering getting rid of `AI: Call OpenAI with Note as context` and replacing it with a templated prompt somehow.

Do you change the prompt when you use it, or is it the same prompt every time? If you do change the prompt often, templated prompts might not be a good replacement.
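For the standup use case specifically, a fixed templated prompt could look something like the sketch below (the `aistandup` slash command name is made up, the query is illustrative, and date filtering is left out for simplicity). Since templated prompts bake queries, the rendered task list is what would be sent to the LLM:

````
---
tags: template, aiPrompt
aiprompt:
  description: "Summarize completed tasks for standup."
  slashCommand: aistandup
---

Summarize these recently completed tasks as a short standup update:

```query
task where done = true
```
````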

This doesn't exist yet, but I am considering adding the option for templated prompts to ask the user for input and use that input in the prompt. Something like this:

```
---
tags: template, aiPrompt
aiprompt:
  description: "Call AI with the current note as context."
  system: You are an AI Note Assistant. Help the user with their notes.
  slashCommand: aicall
  chat: true
  inputPrompt: "Enter the prompt for the AI call."
---

**user**: [enrich:true] title: {{@page.name}}
Everything below is the content of the note:
{{readPage(@page.ref)}}
**assistant**: What are the instructions?
**user**: [enrich:false] {{inputPrompt}}
```
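The idea would be that triggering the `aicall` slash command first asks for input using the `inputPrompt` text, then substitutes the answer wherever `{{inputPrompt}}` appears in the chat body.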
meain commented 3 months ago

> This doesn't exist yet, but I am considering adding the option for templated prompts to ask the user for input and use that input in the prompt.

That sounds like a good option. I don't want to rely on a single templated prompt, as I'd want to ask a different question each time. I treat it like a simpler version of Notion AI chat.

Btw, regarding asking users for prompts: you might also want to think about asking the user for multiple inputs. I've found the need for multiple user prompts in meain/yap to be low, but I thought I'd give you a heads-up as you're building this out.

A sample use case I have in yap's readme: `Give me the difference between {{inputOne}} and {{inputTwo}}`.
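Purely as a hypothetical sketch of how that could map onto the frontmatter scheme above (the `inputPrompts` key and the `aidiff` command are invented; nothing like this exists yet):

```
---
tags: template, aiPrompt
aiprompt:
  description: "Compare two things."
  slashCommand: aidiff
  # hypothetical: one question per named input
  inputPrompts:
    inputOne: "First thing to compare?"
    inputTwo: "Second thing to compare?"
---

**user**: Give me the difference between {{inputOne}} and {{inputTwo}}.
```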