PiotrEsse closed this issue 6 months ago.
Hi there!
Thank you for creating this issue.
You don't actually need to define the LLM anywhere manually; just make sure you have "OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxx" in your environment variables.
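For example, here is a minimal sketch of loading the key from a .env file with python-dotenv; the .env file name and the assert are just conventional choices for illustration, not something this repo requires:
# Minimal sketch: load OPENAI_API_KEY from a .env file (assumes python-dotenv is installed
# and a .env file exists containing: OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxx)
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env and populates os.environ
assert os.getenv("OPENAI_API_KEY"), "OPENAI_API_KEY is not set"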
Please, let me know if you have any other questions.
I've also added more detail about this issue to the README.md file.
Thanks!
Thank you. I am using Ollama as a local LLM provider; that's my question.
Oh okay, this might help then: https://docs.crewai.com/how-to/LLM-Connections/#crewai-agent-overview
In #2, I've added Ollama support. Add these lines at the top of main.py:
# NOTE: to find which model names you have available, run `ollama list` in a terminal
from langchain_openai import ChatOpenAI  # import needed for ChatOpenAI

llm = ChatOpenAI(
    model='llama2',
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="NA"  # Ollama ignores the key, but the client requires a value
)
Then, at the end of each Agent(...) constructor call, add:
allow_delegation=True,
llm=llm,
)
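Put together, a minimal sketch of a complete agent definition looks like this; the role, goal, and backstory strings are placeholders for illustration, not values from this repo:
# Minimal sketch: a CrewAI Agent wired to the local Ollama model defined above
from crewai import Agent

researcher = Agent(
    role="Researcher",  # placeholder role
    goal="Answer questions using the local llama2 model",  # placeholder goal
    backstory="An illustrative backstory.",  # placeholder backstory
    allow_delegation=True,
    llm=llm,  # the ChatOpenAI instance pointing at Ollama
)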
If it doesn't work, make sure you ran `ollama serve` in a terminal. You can also open http://localhost:11434/ in a browser; it should say "Ollama is running" to confirm. Hope it helps.
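If you'd rather do that check from Python, here is a quick sketch using only the standard library (assuming Ollama's default port):
# Minimal sketch: confirm the Ollama server is reachable on its default port
import urllib.request

with urllib.request.urlopen("http://localhost:11434/") as resp:
    print(resp.read().decode())  # should print: Ollama is running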
Hi, where do we define the LLM model? I can't find a place where we can define the LLM.