balanceainetwork closed this issue 4 days ago
The env var should be OPENAI_BASE_URL. Does it still not work with that?
You can also use the LLM class directly: https://docs.crewai.com/core-concepts/LLMs/#using-ollama-local-llms
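For reference, the env-var route suggested above would look something like the sketch below. This is an assumption-laden example: whether ollama/* models actually pick up OPENAI_BASE_URL is the open question in this thread, and the URL assumes a default local Ollama server.

```shell
# Assumption: Ollama is serving locally on its default port (11434).
# Whether crewAI's ollama/* models honor this variable is the issue under discussion.
export OPENAI_BASE_URL="http://localhost:11434"
```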
That's what I am trying to do:
default_llm = LLM(model="ollama/llama3.1", base_url="http://localhost:11434"),
a = Agent(
    role=agent_config['role'],
    goal=agent_config['goal'],
    backstory=agent_config['backstory'],
    llm=default_llm,
    max_iter=2,
    tools=tools,
)
I get the following errors:
llm.py-llm:104 - ERROR: Failed to get supported params: argument of type 'NoneType' is not iterable
llm.py-llm:88 - ERROR: LiteLLM call failed: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=(<crewai.llm.LLM object at 0x128f49eb0>,)
Am I doing something wrong? I have the latest 0.63.5 version installed.
Thank you for any help @joaomdmoura
I see an extra comma at the end of the default_llm line you pasted. Is that in your actual code? It could be the issue.
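For context on why that comma matters: in Python, a trailing comma after an assignment wraps the value in a one-element tuple, which matches the model=(<crewai.llm.LLM object at 0x...>,) shape in the error above. A minimal illustration using a stand-in class (not the real crewai.llm.LLM):

```python
# Stand-in for crewai.llm.LLM, just to demonstrate the tuple effect.
class LLM:
    def __init__(self, model, base_url=None):
        self.model = model
        self.base_url = base_url

# With the trailing comma: default_llm is a 1-tuple containing the LLM, not an LLM.
default_llm = LLM(model="ollama/llama3.1", base_url="http://localhost:11434"),
print(type(default_llm))  # a tuple, not an LLM

# Without the comma: default_llm is the LLM instance itself.
fixed_llm = LLM(model="ollama/llama3.1", base_url="http://localhost:11434")
print(type(fixed_llm))
```

Passing that tuple as llm= would explain why the provider lookup fails with a NoneType/BadRequestError.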
A new version, 0.64.0, is out. I didn't directly address this since I wasn't able to replicate it, but it could be that comma in there.
@joaomdmoura It works now; it was either the comma (my bad) or the new version.
Anyway thank you for your help and keep up the good work :)
No worries!
Description
We are still facing the same issue as described in [BUG] LLM "ollama/llama3.1" does not obey BASE_URL #1337
Steps to Reproduce
See LLM "ollama/llama3.1" does not obey BASE_URL #1337
Expected behavior
The LLM should honor the configured base_url; see LLM "ollama/llama3.1" does not obey BASE_URL #1337
Screenshots/Code snippets
See LLM "ollama/llama3.1" does not obey BASE_URL #1337
Operating System
Ubuntu 20.04
Python Version
3.10
crewAI Version
latest
crewAI Tools Version
0.12.1
Virtual Environment
Venv
Evidence
See LLM "ollama/llama3.1" does not obey BASE_URL #1337
Possible Solution
See LLM "ollama/llama3.1" does not obey BASE_URL #1337
Additional context
See LLM "ollama/llama3.1" does not obey BASE_URL #1337