Marker-Inc-Korea / RAGchain

Extension of Langchain for RAG. Easy benchmarking, multiple retrievals, reranker, time-aware RAG, and so on...
Apache License 2.0
279 stars · 27 forks

no way to use local llm? #283

Closed — juud79 closed this issue 1 year ago

juud79 commented 1 year ago

no way to use local llm on RAGchain?

like kullm koalpaca...

only open ai... lol

vkehfdl1 commented 1 year ago

Hello @juud79. Of course you can use your own local LLM.

Here is detailed documentation about how to use a local LLM with RAGchain: Link

In a nutshell, you can use vLLM or LocalAI to serve a local LLM. Both can run an OpenAI-compatible API server, which means you can use your local LLM through the `openai` Python library. You simply change openai's `api_key` and `api_base` to point at your own server.
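To illustrate the idea, here is a minimal standard-library sketch of calling such an OpenAI-compatible endpoint directly. The base URL, API key, and model name are placeholders, not values from RAGchain itself (vLLM's default OpenAI-compatible server listens on port 8000 and usually accepts any API key; adjust everything to your own setup):

```python
import json
import urllib.request

def build_chat_request(api_base: str, api_key: str, model: str, prompt: str):
    """Build a request for the OpenAI-compatible /chat/completions endpoint."""
    url = api_base.rstrip("/") + "/chat/completions"
    headers = {
        "Content-Type": "application/json",
        # Local servers typically ignore the key, but the header must be present.
        "Authorization": f"Bearer {api_key}",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(url, data=body, headers=headers)

if __name__ == "__main__":
    # Placeholder values -- point these at your own vLLM/LocalAI server.
    req = build_chat_request(
        "http://localhost:8000/v1",  # assumed local server base URL
        "EMPTY",                     # placeholder key for a local server
        "your-local-model",          # hypothetical model name
        "Hello, introduce yourself briefly.",
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

The same switch works without any custom HTTP code: with the `openai` library you set `openai.api_base` to your server's URL and `openai.api_key` to any placeholder string, and the rest of your code stays unchanged.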

vkehfdl1 commented 1 year ago

There is a Colab example for using a local LLM: colab