katanaml / sparrow

Data processing with ML, LLM and Vision LLM
https://katanaml.io
GNU General Public License v3.0
3.61k stars 373 forks

How to use the config.yml file? #70

Closed dvignacioglobal closed 2 months ago

dvignacioglobal commented 2 months ago

Hello and good day!

How do you pull the LLM model specified in config.yml?

I have already installed the needed requirements in the .env_llamaindex venv. I want to try using the vprocessor agent and it says in the README that

.env_llamaindex is used for LLM RAG with llamaindex, vllamaindex and vprocessor agents,

Now, I'm at the part of the README that says I need to pull the LLM model from the config.yml file. How do I use the config.yml file?

Thank you and have a great day.

dvignacioglobal commented 2 months ago

Hello again,

I just opened the config.yml file and downloaded the necessary Ollama models by running ollama pull <LLM>.

So to use the llamaindex agent, I had to run: ollama pull adrienbrault/nous-hermes2theta-llama3-8b:q5_K_M, since that is the LLM specified in the config.yml file.
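The step above can be sketched in code: read the model name out of config.yml and build the matching ollama pull command. This is a minimal sketch under assumptions, not sparrow's actual code; in particular, the key name "LLM" and the excerpt of config.yml shown here are hypothetical, so check the real key names in the repo's config.yml.

```python
# Minimal sketch: find the model name in a config.yml-style text and
# build the "ollama pull" command for it. The key name "LLM" is an
# assumption; substitute the key actually used in sparrow's config.yml.
def model_from_config(text: str, key: str = "LLM") -> str:
    """Return the value of `key:` from a flat YAML-like config text."""
    for line in text.splitlines():
        line = line.strip()
        if line.startswith(key + ":"):
            # Split only on the first colon so model tags like
            # ":q5_K_M" in the value survive intact.
            return line.split(":", 1)[1].strip()
    raise KeyError(f"{key} not found in config")

# Hypothetical excerpt of config.yml for illustration.
config_text = """\
LLM: adrienbrault/nous-hermes2theta-llama3-8b:q5_K_M
"""

model = model_from_config(config_text)
print("ollama pull " + model)
```

Running this prints the exact pull command to paste into a terminal, which matches the manual step described above: open config.yml, find the LLM entry, and ollama pull that name.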