Closed dvignacioglobal closed 2 months ago
Hello again,
I just opened the config.yml file and downloaded the necessary Ollama models by running
ollama pull <LLM>
So if I want to use LlamaIndex, I will have to run
ollama pull adrienbrault/nous-hermes2theta-llama3-8b:q5_K_M
since that is the LLM specified in the config.yml file.
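For anyone landing here later: the step above can also be scripted so the model name never has to be copied by hand. This is only a minimal sketch; it assumes config.yml stores the model under a `model:` key (the sample file and key name below are illustrative and may not match this repo's actual config.yml layout).

```shell
# Illustrative config.yml -- the real file in the repo may use different keys.
cat > config.yml <<'EOF'
llm:
  model: adrienbrault/nous-hermes2theta-llama3-8b:q5_K_M
EOF

# Grab the first "model:" value from config.yml.
model=$(grep -m1 'model:' config.yml | awk '{print $2}')
echo "$model"

# Then pull it with Ollama (uncomment once Ollama is installed):
# ollama pull "$model"
```

If the config nests the model name differently, adjust the `grep` pattern accordingly; a YAML-aware tool like `yq` would be more robust than `grep`/`awk`.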
Hello and good day!
How do you pull the LLM model specified in config.yml?
I have already installed the needed requirements in the .env_llamaindex venv. I want to try using the `vprocessor` agent, and now I'm at the part of the README that says I need to pull the LLM model specified in the config.yml file. How do I use the config.yml file?
Thank you and have a great day.