crewAIInc / crewAI-tools

MIT License

Allow LM Studio to be usable for RAG instead of just OpenAI #29

Open · ian-andrich opened 6 months ago

ian-andrich commented 6 months ago

Currently the RAG base class composes with the official OpenAI Python package. It would be preferable to depend only on OpenAI's calling convention (the API shape) rather than on OpenAI's hosted service, so that local LLMs (such as Llama 3) served behind an OpenAI-compatible endpoint can be used to produce embeddings.
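
For context, a rough sketch of what relying only on the calling convention looks like: the official openai client can be pointed at any OpenAI-compatible server. The base URL below is LM Studio's default local endpoint, and the API key and embedding model name are placeholders for whatever you serve locally.

```python
from openai import OpenAI

# LM Studio's default local endpoint; local servers ignore the key,
# but the client requires one to be set.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

resp = client.embeddings.create(
    model="nomic-embed-text",  # placeholder: whatever embedding model LM Studio is serving
    input="Retrieval-augmented generation test sentence",
)
print(len(resp.data[0].embedding))  # dimensionality of the returned embedding
```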

joaomdmoura commented 6 months ago

I believe you can now do this using this strategy on the docs: https://docs.crewai.com/tools/PDFSearchTool/#custom-model-and-embeddings
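
For reference, a minimal sketch of that strategy as shown on the linked doc page; the provider and config keys follow the docs, and the model names here are placeholders for whatever you run locally:

```python
from crewai_tools import PDFSearchTool

tool = PDFSearchTool(
    config=dict(
        llm=dict(
            provider="ollama",  # or openai, google, anthropic, ...
            config=dict(model="llama3"),
        ),
        embedder=dict(
            provider="huggingface",  # or openai, ollama, google, ...
            config=dict(model="BAAI/bge-small-en-v1.5"),
        ),
    )
)
```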

landersdev commented 6 months ago

You can also set local .env variables to configure the client for local LLM use.

```
OPENAI_API_BASE=http://localhost:1234/v1
OPENAI_MODEL_NAME=TheBloke/Mistral-7B-Instruct-v0.1-GGUF/mistral-7b-instruct-v0.1.Q2_K.gguf
OPENAI_API_KEY=na
```

This is working for me locally using LM Studio.
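
Here is a minimal sketch of how those variables get used, assuming they live in a local .env file; the agent and task definitions are placeholders, the key point being that no code change is needed because the default OpenAI client picks up OPENAI_API_BASE, OPENAI_MODEL_NAME, and OPENAI_API_KEY from the environment.

```python
from dotenv import load_dotenv
from crewai import Agent, Task, Crew

load_dotenv()  # loads OPENAI_API_BASE, OPENAI_MODEL_NAME, OPENAI_API_KEY

researcher = Agent(
    role="Researcher",
    goal="Answer questions using the locally served model",
    backstory="Runs entirely against the LM Studio endpoint defined in .env",
)

task = Task(
    description="Summarize retrieval-augmented generation in two sentences.",
    expected_output="A two-sentence summary.",
    agent=researcher,
)

crew = Crew(agents=[researcher], tasks=[task])
print(crew.kickoff())
```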