Closed krrishdholakia closed 7 months ago
Hey @krrishdholakia, thanks for asking. I also saw your PR and will review it soon. The idea is that we want users to wrap their own LLM modules (APIs, proxies, local LLMs, etc.) and import directly from the API class. This is required in the security domain, since people may want to use their own models. Integrating everything together is also a good idea, especially for the standard modules. I'll test the litellm package first and see how to proceed.
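A minimal sketch of that wrapper idea, assuming a shared base class that user-supplied LLM modules subclass (the names `LLMAPI`, `send_message`, and `MyLocalLLM` here are illustrative, not the project's actual code):

```python
from abc import ABC, abstractmethod

class LLMAPI(ABC):
    """Interface every user-supplied LLM module implements."""

    @abstractmethod
    def send_message(self, prompt: str) -> str:
        """Send a prompt to the backing model and return its reply."""

class MyLocalLLM(LLMAPI):
    """Example: a user wrapping their own private/local model."""

    def send_message(self, prompt: str) -> str:
        # Real code would invoke the user's own model or proxy here;
        # a stub reply stands in for the model call.
        return f"[local-llm] {prompt}"
```

With this shape, the rest of the tool only ever talks to the `LLMAPI` interface, so swapping in a private model is just a matter of importing a different subclass.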
Hey @GreyDGL, sounds good - let me know if you think anything is missing; happy to make modifications (to the PR and the package).
Hey @GreyDGL, interesting approach to abstracting LLM providers by creating separate classes:
`chatgpt_api.py`
`gpt4all_api.py`
Why do it this way, vs. making the completion call inside `__init__.py`?
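For contrast, a rough sketch of the alternative being asked about: a single `completion()` entry point that dispatches by model name instead of separate per-provider class files (all names here are hypothetical stand-ins, and the backend functions are stubs where real API calls would go):

```python
# Stub backends; real code would call the OpenAI API / a local GPT4All model.
def _chatgpt_call(prompt: str) -> str:
    return f"[chatgpt] {prompt}"

def _gpt4all_call(prompt: str) -> str:
    return f"[gpt4all] {prompt}"

# Registry mapping model names to backend callables.
_BACKENDS = {
    "chatgpt": _chatgpt_call,
    "gpt4all": _gpt4all_call,
}

def completion(model: str, prompt: str) -> str:
    """Single entry point: route the prompt to the named backend."""
    try:
        return _BACKENDS[model](prompt)
    except KeyError:
        raise ValueError(f"unknown model: {model}") from None
```

This keeps one import surface for callers, at the cost of a central registry that has to know about every backend, whereas the per-class approach lets users drop in their own module without touching shared code.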