Closed: juud79 closed this issue 1 year ago
Hello @juud79. Of course you can use your own local LLM.
Here is detailed documentation about how to use a local LLM with RAGchain: Link
In a nutshell, you can use vLLM or LocalAI to serve a local LLM. Both can run an OpenAI-like API server, which means you can use your local LLM through the openai Python library. You simply change openai's api_key and api_base to point at your own server.
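For example, here is a minimal sketch of that pattern, assuming a vLLM OpenAI-compatible server is already running locally on port 8000 (vLLM's default) and serving a KULLM checkpoint; the model name and URL are placeholders for whatever your server actually hosts:

```python
import openai

# Point the openai library at the local server instead of api.openai.com.
# vLLM's OpenAI-compatible server does not validate the key, so any
# placeholder value works here.
openai.api_key = "EMPTY"
openai.api_base = "http://localhost:8000/v1"  # assumed local server address

# Then use the library exactly as you would with OpenAI, passing the name
# of the model your local server is serving (hypothetical example below).
completion = openai.ChatCompletion.create(
    model="nlpai-lab/kullm-polyglot-12.8b-v2",
    messages=[{"role": "user", "content": "Hello from my local LLM!"}],
)
print(completion.choices[0].message.content)
```

Since RAGchain talks to the model through the openai library, everything else in your pipeline stays the same once these two values are changed.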
There is also a Colab example for using a local LLM: Colab
Is there no way to use a local LLM with RAGchain?
Like KULLM or KoAlpaca...
Only OpenAI... lol