Closed Yembot31013 closed 3 weeks ago
You can define your own Manager Agent since #474; with that you should be able to specify separate LLMs.
Manager Agent, like, how? I want each agent to have a separate LLM, not a single LLM working for all of them. I know I can assign a manager LLM to my crew when using the hierarchical process, but my concern is: when the manager LLM delegates a task to an agent, will it use the LLM defined on each Agent class, or will it use the manager LLM to answer the question and only use the agent's tools? If no, I suggest making them separable, since I am running into ResourceExhausted
errors while using Gemini, and I want to use a different API key for each of my agents and for the manager LLM separately. If yes, I suggest adding new functionality: when a specific or obvious exception is raised, it should wait a configurable number of seconds before retrying, or exit the whole execution, because some results may be necessary for the quality of the crew's output. This could be made dynamic by adding an argument like error_handler_config: List[dict],
where each dict
will contain:
{
    "exception": list[Exception],  # list of exception classes to catch
    "callback": callable,  # a built-in method that waits a custom/default number of seconds, exits the program, runs a custom function, etc. (whatever you want)
}
something like this. Thanks
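A minimal, hypothetical sketch of how such an error_handler_config could be interpreted. None of these names exist in crewAI; ResourceExhausted here is a stand-in for the Gemini quota error, and run_with_error_handlers only illustrates the proposed dispatch idea:

```python
import time

def wait_and_retry(seconds=8):
    """Built-in style callback: sleep before the next retry."""
    time.sleep(seconds)

def run_with_error_handlers(task_fn, error_handler_config, max_retries=3):
    """Run task_fn, dispatching caught exceptions to the matching callback."""
    for attempt in range(max_retries):
        try:
            return task_fn()
        except Exception as exc:
            handled = False
            for handler in error_handler_config:
                if isinstance(exc, tuple(handler["exception"])):
                    handler["callback"]()  # wait, exit, log, etc.
                    handled = True
                    break
            if not handled:
                raise  # unknown exception: propagate as usual
    raise RuntimeError("max retries exceeded")

# Usage: retry on a stand-in quota error, waiting between attempts.
class ResourceExhausted(Exception):
    pass

calls = []
def flaky_task():
    calls.append(1)
    if len(calls) < 3:
        raise ResourceExhausted("429 quota")
    return "ok"

config = [{"exception": [ResourceExhausted],
           "callback": lambda: wait_and_retry(seconds=0)}]
print(run_with_error_handlers(flaky_task, config))  # -> ok
```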
Every Agent object, including the manager, already has an llm parameter you can set independently.
https://docs.crewai.com/core-concepts/Agents/
agent = Agent(
role='Data Analyst',
goal='Extract actionable insights',
backstory="...",
llm=my_llm, # here
)
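To address the delegation question directly, here is a self-contained sketch (no crewAI imports; FakeLLM, Agent, and Manager are simplified stand-ins, not the real classes) of the per-agent LLM pattern: each agent carries its own llm, and delegation runs on the delegate's model, not the manager's:

```python
class FakeLLM:
    """Stand-in for a real chat model; just echoes which model answered."""
    def __init__(self, name):
        self.name = name

    def invoke(self, prompt):
        return f"[{self.name}] {prompt}"

class Agent:
    def __init__(self, role, llm):
        self.role = role
        self.llm = llm  # each agent owns its model (and thus its API key)

    def execute(self, task):
        return self.llm.invoke(task)

class Manager(Agent):
    def delegate(self, agent, task):
        # Delegation runs on the delegate's own model, not the manager's.
        return agent.execute(task)

analyst = Agent("Data Analyst", FakeLLM("gemini-key-A"))
manager = Manager("Manager", FakeLLM("gemini-key-B"))
print(manager.delegate(analyst, "summarize sales"))
# -> [gemini-key-A] summarize sales
```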
Also you might want to look into asynchronous task execution depending on what you want to achieve
https://docs.crewai.com/core-concepts/Tasks/#asynchronous-execution
Hello everyone, just curious to understand: is it possible to implement an agent without an LLM using crewAI? From the documentation it does not seem so. If I want to create an agent that purely needs to query data without an LLM, is this possible with crewAI?
Kindly correct me if I am missing something.
@settur1409 If you don't provide an LLM, it will use OpenAI gpt-4 by default (subject to the OPENAI_MODEL_NAME env var):
llm: Any = Field(
default_factory=lambda: ChatOpenAI(
model=os.environ.get("OPENAI_MODEL_NAME", "gpt-4")
),
description="Language model that will run the agent.",
)
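The default-model mechanism in that field can be shown with only stdlib stand-ins (StubChatModel replaces ChatOpenAI; AgentConfig is a simplified stand-in for the Agent model): the env var wins, otherwise "gpt-4" is used.

```python
import os
from dataclasses import dataclass, field

@dataclass
class StubChatModel:
    """Stand-in for ChatOpenAI, just to show the default_factory pattern."""
    model: str

def default_llm():
    # Same lookup as the quoted crewAI field: env var wins, else "gpt-4".
    return StubChatModel(model=os.environ.get("OPENAI_MODEL_NAME", "gpt-4"))

@dataclass
class AgentConfig:
    llm: StubChatModel = field(default_factory=default_llm)

os.environ.pop("OPENAI_MODEL_NAME", None)
print(AgentConfig().llm.model)   # -> gpt-4
os.environ["OPENAI_MODEL_NAME"] = "gpt-4o-mini"
print(AgentConfig().llm.model)   # -> gpt-4o-mini
```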
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
This issue was closed because it has been stalled for 5 days with no activity.
I am using the Google Gemini model free plan.
In the context of avoiding errors like:
2024-05-26 18:26:30,636 - 14200 - before_sleep.py-before_sleep:65 - WARNING: Retrying langchain_google_genai.chat_models._chat_with_retry.<locals>._chat_with_retry in 8.0 seconds as it raised ResourceExhausted: 429 Resource has been exhausted (e.g. check quota)..
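In the meantime, the "Retrying ... in 8.0 seconds" behavior in that log can be reproduced at the call site with exponential backoff. A minimal sketch, assuming ResourceExhausted as a stand-in for google.api_core.exceptions.ResourceExhausted (the sleep function is injectable so real code can pass time.sleep):

```python
import random

class ResourceExhausted(Exception):
    """Stand-in for the 429 quota error raised by the Gemini client."""

def with_backoff(fn, retries=5, base=2.0, sleep=lambda s: None):
    delay = base
    for attempt in range(retries):
        try:
            return fn()
        except ResourceExhausted:
            if attempt == retries - 1:
                raise  # out of retries: let the quota error propagate
            sleep(delay + random.uniform(0, 1))  # jitter avoids synced retries
            delay *= 2  # 2s, 4s, 8s, ...

attempts = []
def call_gemini():
    attempts.append(1)
    if len(attempts) < 3:
        raise ResourceExhausted("429 Resource has been exhausted")
    return "answer"

print(with_backoff(call_gemini))  # -> answer
```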
I will suggest that you allow us to isolate the LLM for each Agent and also for the agent_manager when using the hierarchical process. As it stands, this is preventing me from achieving anything with the hierarchical process.