nkay28 opened this issue 1 month ago

Hi, what is the best way/format to index a set of questions and answers into a workspace?

Also, can we add Q-A data into the same workspace alongside a set of PDFs, or is it recommended to add them in separate workspaces only? I tried them together, and the RAG doesn't seem to pick up the Q-A content while chatting, so I'm trying to figure out whether my indexing is correct. Thank you.
Hi @nkay28,
> Also, can we add Q-A data into the same workspace alongside a set of PDFs? Or is it recommended to add them in separate workspaces only?
I would say it depends on the use case.
It is possible to add Q-As and PDFs in the same workspace, but a workspace query only returns the chunks of text relevant to the query, re-ranked by the cross-encoder model (only 3 results are added to the context).
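For illustration, the re-ranking step works roughly like this (a minimal sketch, assuming a sentence-transformers cross-encoder; the actual model and code in the repo may differ):

```python
# Minimal sketch of cross-encoder re-ranking; not the repo's actual code.
from sentence_transformers import CrossEncoder

# Assumed model name, for illustration only.
model = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2")

def rerank(query: str, chunks: list[str], top_k: int = 3) -> list[str]:
    # Score each (query, chunk) pair, then keep the top_k highest-scoring chunks.
    scores = model.predict([(query, chunk) for chunk in chunks])
    ranked = sorted(zip(chunks, scores), key=lambda pair: pair[1], reverse=True)
    return [chunk for chunk, _ in ranked[:top_k]]
```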
I would recommend checking which documents are returned (in the playground, the cog icon can be used to show the metadata). Alternatively, you can use the semantic search page to test.
Regarding the Q-A content: is it a fixed list of questions? If it is, might it be a set of examples you would like to send as part of the prompt every time?
If yes, a possible option is to update the system prompt and list them there so they are always sent as examples: https://github.com/aws-samples/aws-genai-llm-chatbot/blob/main/lib/model-interfaces/langchain/functions/request-handler/adapters/bedrock/base.py#L53
Note that if you follow this path, there is a pending change refactoring this part: #576
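To illustrate the idea, here is a rough sketch (the variable names and example pairs are hypothetical, not the actual prompt in base.py):

```python
# Hypothetical sketch: embedding a fixed list of Q-A examples in the system
# prompt so they are sent with every request. Names and pairs are illustrative.
QA_EXAMPLES = """\
Question: How do I reset my password?
Answer: Use the "Forgot password" link on the sign-in page.

Question: How do I request a new account?
Answer: Open a ticket with the platform team.
"""

SYSTEM_PROMPT = (
    "You are a helpful assistant. Use the following solved examples as "
    "guidance when answering:\n\n" + QA_EXAMPLES
)
```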
Hi @charles-marion,
That's very insightful. Only the top 3 would definitely be a limitation with a heterogeneous mix of documents and Q-A, especially on noisy data.
Yes, I will run some tests after looking into the ranking workflow. For some reason, it was pulling up only the questions (and no answers) when I tested via the semantic search.
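One thing I may test is merging each question and its answer into a single chunk before indexing, so that a match on the question text also brings back the answer. A sketch of the idea (the pairs here are made up):

```python
# Sketch: format each Q-A pair as one text chunk before uploading, so the
# answer is always retrieved together with its question. Example data only.
qa_pairs = [
    ("How do I request access?", "Open a ticket with the platform team."),
    ("Where are the logs stored?", "In the CloudWatch log group for the stack."),
]

documents = [f"Question: {q}\nAnswer: {a}" for q, a in qa_pairs]

# Each entry in 'documents' could then be uploaded as its own text file
# (or indexed as a single chunk) in the workspace.
```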
Good point, though I would be concerned about token limits on the private instance route and overall token costs with that approach (e.g., a few dozen Q-A pairs at 50-100 tokens each would add thousands of input tokens to every request). Unless I'm mistaken in my understanding?
Thanks a lot for your valuable insights and suggestions. Appreciate it!