genai-apps / aggrag


Whenever a prompt node is added, default should be a "Base" RAG selected in RAGs to query section (and not a GPT3.5 model selected in Models to query section) #80

Closed tarun-goyal closed 1 week ago

tarun-goyal commented 2 weeks ago

[screenshot]

@garvk Let me know if you think otherwise.

trd-rmit commented 2 weeks ago

Hi @tarun-goyal, I have added a PR updating this behavior. Please let me know if it is good to merge (https://github.com/genai-apps/aggrag/pull/81). I was not able to add a reviewer to the PR.

tarun-goyal commented 2 weeks ago

> Hi @tarun-goyal, I have added a PR updating this behavior. Please let me know if it is good to merge (#81). I was not able to add a reviewer to the PR.

Hi @trd-rmit - this is on hold right now. Might need to tweak the requirements a bit. Will update this ticket and put a message on PR once have more clarity.

garvk commented 2 weeks ago

It's unclear whether to:

  1. allow both an LLM and a RAG as default options, and leave it to the user to remove one from the prompt node, OR
  2. allow none as a default option, and leave it to the user to add either an LLM or a RAG from the ragstore. In this case, we could play with the 'hints' that have just been added in the repo. I find hints really useful.

@trd-rmit @tarun-goyal will appreciate your inputs.

I am, however, leaning towards 2, because when we add a RAG, we need to add a context, say a file node, using the {rag_knowledge_base} connector. But in the current scheme of things, a file node does nothing to LLM responses, so it might give a false sense that uploaded files are being used to generate LLM responses.
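To make the two options concrete, here is a minimal sketch of what each would mean for a prompt node's initial query targets. `PromptNodeItem` and `makeDefaultItems` are hypothetical names for illustration only, not part of the aggrag codebase:

```typescript
// Hypothetical sketch only: PromptNodeItem and makeDefaultItems are
// illustrative names, not from the aggrag codebase.
type PromptNodeItem =
  | { kind: "llm"; model: string }  // e.g. the current GPT-3.5 default
  | { kind: "rag"; name: string };  // e.g. a "Base" RAG from the ragstore

function makeDefaultItems(option: 1 | 2): PromptNodeItem[] {
  if (option === 1) {
    // Option 1: seed both, and let the user remove whichever they don't need.
    return [
      { kind: "llm", model: "gpt-3.5-turbo" },
      { kind: "rag", name: "Base" },
    ];
  }
  // Option 2: start empty and rely on hints to guide the first addition.
  return [];
}
```

Option 1 trades an extra removal step for a working starting point; option 2 trades an empty node for never running an unintended model.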

trd-rmit commented 1 week ago

Hi @garvk ,

I appreciate both options you've presented, but I am leaning slightly toward having default selections for these settings. Here’s why:

When I first updated a prompt node, I noticed that the absence of defaults could lead to confusion. Users might not fully understand the implications of leaving the RAG section empty, especially if they expect a retrieval-based model to enhance their results. Similarly, leaving the LLM section empty could be problematic for those expecting creative generation or responses to general knowledge queries.

By having defaults in place, we offer a starting recommendation that users can easily modify to suit their specific needs. This approach helps guide them toward the results they're looking for without unnecessary confusion.

As you mentioned, incorporating hints or tooltips would further enhance the user experience, provide additional guidance, and make the process even more intuitive.

[screenshot]

Looking forward to your thoughts.

tarun-goyal commented 1 week ago

@garvk @trd-rmit I'm inclined towards option 1 here. Especially for a new user, this "addition of the first LLM/RAG" may not be very straightforward: the Play CTA for a prompt node is more prominent and is placed before the models section, so a user may hit it accidentally right after adding a new prompt node. However, I agree hints would be of great help here, especially if, after adding a prompt node, we place/sequence the hints so that all prerequisites (connection to text/input data nodes, file nodes for RAGs, addition of an API key for LLMs) are covered before the user hits the Play CTA.
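The prerequisite sequencing described above could be sketched roughly as follows. `PrereqState` and `orderedHints` are hypothetical names, not aggrag APIs; the point is only that every prerequisite is surfaced, in order, before the Play CTA is enabled:

```typescript
// Illustrative only: PrereqState and orderedHints are hypothetical names,
// not aggrag APIs.
interface PrereqState {
  hasInputConnection: boolean; // text/input data node wired to the prompt node
  hasKnowledgeBase: boolean;   // file node connected for RAGs
  hasApiKey: boolean;          // API key configured for LLMs
}

function orderedHints(state: PrereqState): string[] {
  const hints: string[] = [];
  if (!state.hasInputConnection)
    hints.push("Connect a text or input data node to the prompt node.");
  if (!state.hasKnowledgeBase)
    hints.push("Connect a file node to supply the RAG knowledge base.");
  if (!state.hasApiKey)
    hints.push("Add an API key in settings before querying an LLM.");
  return hints; // empty array means it is safe to enable the Play CTA
}
```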

Thoughts?

garvk commented 1 week ago

Yes, I am happy to back both of you on option 1 since there's a consensus on it and I don't have a strong case for option 2.

However, a tooltip as shown by @trd-rmit will be necessary if we go with option 1, i.e. showing both LLMs and RAGs.

In the RAG tooltip, we could add:

"Configuring a RAG is a three-step process:

  1. Choose and configure the settings of any RAG from the ragstore.
  2. Create an index after connecting a Knowledge Base node with the _rag_knowledgebase variable in the Prompt Node.
  3. Add or connect prompts and variables as required by your use case."

In the LLM tooltip, we could add:

"Configuring an LLM is a three-step process:

  1. Choose and configure the settings of an LLM from the modelstore.
  2. Add the necessary API keys in settings (the rightmost icon on the top banner).
  3. Add or connect prompts and variables as required by your use case."

@trd-rmit would you add the RAG and LLM tooltips as shown in "I am a tooltip" example? Unless any of you have any suggestions on tooltip content, let's go with the content mentioned above.

@tarun-goyal for hints if we still need, we can track that in a separate ticket or the ongoing one?

trd-rmit commented 1 week ago

Hi @garvk

Thanks, I have updated the PR with the tooltip changes above and also updated the screenshot accordingly. Please let me know if it is good to merge.

tarun-goyal commented 1 week ago

> @tarun-goyal for hints if we still need, we can track that in a separate ticket or the ongoing one?

@garvk For the hints, we can use the same ticket.