Closed: moodler closed this issue 1 year ago
Have a read of the docs. I think this is useful for being able to easily switch backend AIs, for example to a locally-installed Llama.
There is only one API call to GPT in the entire codebase, and we are not planning on switching LLMs anytime soon. Is this really worth the refactor?
The idea of the backend is to make it easy to switch LLMs on a whim and try them out, e.g. an open-source model running locally or on Hugging Face. That would be much better than being tied (and forcing downstream users to be tied) to OpenAI. See the sketch below for roughly what the swap could look like.
So please try this out.
There's a free LangChain short course here: https://www.deeplearning.ai/short-courses/
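To make this concrete, here's a rough sketch of what the swap could look like using LangChain's LLM wrappers. This is not code from our repo; the prompt, model names, and required API keys (OPENAI_API_KEY / HUGGINGFACEHUB_API_TOKEN) are purely illustrative, and the exact imports depend on the LangChain version:

```python
# Hypothetical sketch only -- prompt and model names are illustrative.
from langchain import LLMChain, PromptTemplate
from langchain.llms import OpenAI, HuggingFaceHub

prompt = PromptTemplate(
    input_variables=["question"],
    template="Answer the following question concisely:\n{question}",
)

# This is the only line that changes when switching backends.
llm = OpenAI(temperature=0)                          # hosted OpenAI (needs OPENAI_API_KEY)
# llm = HuggingFaceHub(repo_id="google/flan-t5-xl")  # hosted open-source model (needs HUGGINGFACEHUB_API_TOKEN)

chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run(question="What is LangChain?"))
```

The rest of the codebase would only see the chain, so trying a locally-installed Llama later should just mean constructing a different `llm` object (newer LangChain releases also ship wrappers for local models).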
See https://github.com/hwchase17/langchain ... it seems to be a great project to build on, as it abstracts away the LLM being used and lets us switch to new ones much more easily. It also has extra features such as conversation memory (see the sketch below), and possibly things like fact-checking (e.g. via Wolfram Alpha).
https://langchain.readthedocs.io/en/latest/
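On the memory point, here's a minimal sketch of LangChain's conversation memory, again with illustrative values only; the import paths assume a recent 0.0.x release (older versions kept the memory classes under langchain.chains.conversation.memory):

```python
# Minimal sketch of LangChain conversation memory (illustrative only).
from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory

llm = OpenAI(temperature=0)  # could be any of the LLM wrappers above
conversation = ConversationChain(llm=llm, memory=ConversationBufferMemory())

print(conversation.predict(input="Hi, my name is Ada."))
print(conversation.predict(input="What is my name?"))  # prior turns come from memory
```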