Open cph0r opened 1 month ago
we don't have a RAG API @cph0r
what are you referring to?
if you want it as a pass through, this should already be possible - https://docs.litellm.ai/docs/proxy/pass_through
I am building an application for my org using LibreChat, which uses the LiteLLM proxy to connect to AWS Bedrock. We were able to connect to models like Anthropic Claude Sonnet and Haiku and run simple queries. For obvious reasons, the company wants a model that is context-aware with their data. This is possible through a Bedrock knowledge base, which can be queried like this. I was hoping LiteLLM could natively support querying a Bedrock knowledge base just like it queries other Bedrock models.
hey @krrishdholakia any update on whether this is possible to do? If not, is it planned for future updates?
Hey @cph0r missed this - we've had another request for this. I'll look into it this week
@krrishdholakia this sounds great, keep me posted on this ☝️
@krrishdholakia I see you have mentioned a pass through for knowledge base in this example -> https://github.com/BerriAI/litellm/blob/9df0588c2c816ae4e58db9c0122613009e1ae75b/docs/my-website/docs/pass_through/bedrock.md?plain=1#L136
can this be used for accessing the knowledge base via LibreChat? If yes, is there some example usage I can refer to?
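In case it helps the discussion, here is a rough sketch of what a pass-through call might look like. This is only an assumption about the route shape: the path follows the Bedrock Agent Runtime Retrieve API (`/knowledgebases/{kbId}/retrieve`), and the proxy base URL, route prefix, and knowledge base ID are all placeholders, not confirmed LiteLLM behavior.

```python
import json

PROXY_BASE = "http://0.0.0.0:4000"  # assumed LiteLLM proxy address


def build_retrieve_request(kb_id: str, query: str, num_results: int = 5):
    """Build the URL and JSON body for a knowledge-base Retrieve call.

    Body shape follows the Bedrock Agent Runtime Retrieve API; the
    `/bedrock/...` pass-through prefix is an assumption.
    """
    url = f"{PROXY_BASE}/bedrock/knowledgebases/{kb_id}/retrieve"
    body = {
        "retrievalQuery": {"text": query},
        "retrievalConfiguration": {
            "vectorSearchConfiguration": {"numberOfResults": num_results}
        },
    }
    return url, json.dumps(body)


# Example: "KB123EXAMPLE" is a placeholder knowledge base ID.
url, payload = build_retrieve_request("KB123EXAMPLE", "What is our PTO policy?")
# Send with e.g. requests.post(url, data=payload,
#                              headers={"Authorization": "Bearer sk-..."})
```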
The Feature
For RAG API integration, it would be great to have knowledge base support: we could add the knowledge base ID, the model name we want to use, and AWS creds in the config file, and connect to a knowledge base directly.
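To make the proposal concrete, the config could look something like the sketch below. This is purely hypothetical: the `knowledge_base_id` and `generation_model` keys do not exist in LiteLLM today and are just one way the request could be expressed, following the existing `config.yaml` conventions.

```yaml
model_list:
  - model_name: company-kb            # name clients would call
    litellm_params:
      model: bedrock/knowledge-base   # hypothetical provider route
      knowledge_base_id: KB123EXAMPLE # hypothetical key: Bedrock KB ID
      generation_model: anthropic.claude-3-haiku-20240307-v1:0
      aws_access_key_id: os.environ/AWS_ACCESS_KEY_ID
      aws_secret_access_key: os.environ/AWS_SECRET_ACCESS_KEY
      aws_region_name: us-east-1
```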
Motivation, pitch
RAG integration is difficult to implement for big organisations. AWS offers a solution to this with their knowledge base APIs. It would be a great feature to add to LiteLLM: users could provide the config required to connect to an AWS knowledge base and benefit from it directly, rather than building a separate RAG API of their own.
Twitter / LinkedIn details
No response