ChamathRosh opened 6 months ago
@ChamathRosh Not as of yet — I believe we would need to integrate with the OpenAI Assistants API to support this functionality. I am looking into it now.
You could also use a vector DB like Pinecone for this: build a retriever, and write a function that takes the retrieval results from the DB and turns them into a string, e.g. `Context: ${contextGottenFromPinecone}`.
You can then append that context to the LLM prompt whenever querying for the next interaction.
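A minimal sketch of that idea, with an in-memory stand-in for the Pinecone retriever (the `retrieve` and `buildPrompt` names, the `Doc` shape, and the static scores are all illustrative assumptions, not a real Pinecone client):

```typescript
// Illustrative document shape; a real vector DB match would carry an
// embedding-similarity score computed against the query.
type Doc = { id: string; text: string; score: number };

// Hypothetical retriever: stands in for a Pinecone query.
// Returns the top-k highest-scoring chunks.
function retrieve(query: string, docs: Doc[], k = 2): Doc[] {
  return [...docs].sort((a, b) => b.score - a.score).slice(0, k);
}

// Turn the retrieval results into a context string and prepend it
// to the prompt, as described above.
function buildPrompt(query: string, retrieved: Doc[]): string {
  const contextGottenFromPinecone = retrieved.map((d) => d.text).join("\n");
  return `Context: ${contextGottenFromPinecone}\n\nQuestion: ${query}`;
}
```

You would then send the string returned by `buildPrompt` as the message content in your LLM call, so the model answers with the retrieved knowledge in view.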
Is there any way we can provide a knowledge base from a PDF or any other format?