crewAIInc / crewAI

Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
https://crewai.com
MIT License

[BUG] LLM "ollama/llama3.1" does not obey BASE_URL Reopening #1337 #1354

Closed: balanceainetwork closed this issue 4 days ago

balanceainetwork commented 1 week ago

Description

We are still facing the same issue as explained in [BUG] LLM "ollama/llama3.1" does not obey BASE_URL #1337

Steps to Reproduce

See LLM "ollama/llama3.1" does not obey BASE_URL #1337

Expected behavior

LLM "ollama/llama3.1" does not obey BASE_URL #1337

Screenshots/Code snippets

LLM "ollama/llama3.1" does not obey BASE_URL #1337

Operating System

Ubuntu 20.04

Python Version

3.10

crewAI Version

latest

crewAI Tools Version

0.12.1

Virtual Environment

Venv

Evidence

LLM "ollama/llama3.1" does not obey BASE_URL #1337

Possible Solution

LLM "ollama/llama3.1" does not obey BASE_URL #1337

Additional context

LLM "ollama/llama3.1" does not obey BASE_URL #1337

joaomdmoura commented 1 week ago

The env var should be OPENAI_BASE_URL, does that still not work? You can also use the LLM class directly: https://docs.crewai.com/core-concepts/LLMs/#using-ollama-local-llms
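
For reference, a minimal direct configuration along those lines might look something like this (llama3.1 and port 11434 are just Ollama's defaults, adjust for your setup):

```python
from crewai import LLM

# Point crewAI (via LiteLLM) at a local Ollama server;
# 11434 is Ollama's default port.
ollama_llm = LLM(model="ollama/llama3.1", base_url="http://localhost:11434")
```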

balanceainetwork commented 1 week ago

That's what I am trying to do:

default_llm = LLM(model="ollama/llama3.1", base_url="http://localhost:11434"),

a = Agent(
    role=agent_config['role'],
    goal=agent_config['goal'],
    backstory=agent_config['backstory'],
    llm=default_llm,
    max_iter=2,
    tools=tools,
)

I get the following errors:

llm.py-llm:104 - ERROR: Failed to get supported params: argument of type 'NoneType' is not iterable

lm.py-llm:88 - ERROR: LiteLLM call failed: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=(<crewai.llm.LLM object at 0x128f49eb0>,)

Am I doing something wrong? I have the latest 0.63.5 version installed.

Thank you for any help @joaomdmoura

joaomdmoura commented 1 week ago

I see an extra comma at the end of the default_llm line you pasted. Is that in the actual code? It could be the issue.
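
That trailing comma would make default_llm a one-element tuple instead of an LLM instance, which matches the model=(<crewai.llm.LLM object ...>,) in the error. Something along these lines (with agent_config and tools as in your snippet) should work:

```python
from crewai import LLM, Agent

# No trailing comma here, so default_llm is an LLM instance, not a tuple.
default_llm = LLM(model="ollama/llama3.1", base_url="http://localhost:11434")

a = Agent(
    role=agent_config['role'],
    goal=agent_config['goal'],
    backstory=agent_config['backstory'],
    llm=default_llm,
    max_iter=2,
    tools=tools,
)
```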

joaomdmoura commented 6 days ago

New version 0.64.0 is out. I didn't directly address this since I wasn't able to replicate it, but it could be that comma in there.

balanceainetwork commented 6 days ago

@joaomdmoura it works now; it was either the comma (my bad) or the new version.

Anyway thank you for your help and keep up the good work :)

joaomdmoura commented 4 days ago

No worries!