crewAIInc / crewAI

Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
https://crewai.com

CrewAI is not working as per the role, goal and backstory #1048

Closed. HariNuve closed this issue 2 weeks ago.

HariNuve commented 1 month ago

My requirement was to use CrewAI to build an agent able to break down complex input questions into simple sub-questions. I followed the instructions and kept the agent structure specified in your documentation page, but it failed to perform as per my requirements. The failing sample code is as follows:

from crewai import Agent, Task, Crew
from langchain.llms import Ollama
import os
os.environ["OPENAI_API_KEY"] = "NA"

llm = Ollama(
    model = "llama2",
    base_url = "http://localhost:11434")

general_agent = Agent(role = "Sub Question Generator",
                      goal = """Break a complex question in to multiple simple sub-questions""",
                      backstory = """You are an excellent english professor able to modify complex question into multiple simple questions in a way that everyone can understand.Do not Answer corresponding to any of the questions""",
                      allow_delegation = False,
                      verbose = True,
                      llm = llm)

task = Task(description="""what is Multiple Sclerosis and what are the symptoms in young adults""",
            agent = general_agent,
            expected_output="String answer")

crew = Crew(agents=[general_agent], tasks=[task], verbose=2)

result = crew.kickoff()
print(result)

But the following (found in your documentation) worked properly to build a calculator-like agent:

from crewai import Agent, Task, Crew
from langchain.llms import Ollama
import os
os.environ["OPENAI_API_KEY"] = "NA"

llm = Ollama(
    model = "llama2",
    base_url = "http://localhost:11434")

general_agent = Agent(role = "Math Professor",
                      goal = """Provide the solution to the students that are asking mathematical questions and give them the answer.""",
                      backstory = """You are an excellent math professor that likes to solve math questions in a way that everyone can understand your solution""",
                      allow_delegation = False,
                      verbose = True,
                      llm = llm)

task = Task(description="""what is 3 + 5""",
             agent = general_agent,
             expected_output="A numerical answer.")

crew = Crew(
            agents=[general_agent],
            tasks=[task],
            verbose=2
        )

result = crew.kickoff()

print(result)
pravincoder commented 1 month ago

@HariNuve I believe it should be base_url = "http://localhost:11434/v1"

If this doesn't solve the issue, please drop the error after execution.
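
For reference, here is the suggested change as a minimal sketch (same model and imports as the snippets above; untested):

from langchain.llms import Ollama

# Suggested tweak (untested): point base_url at the /v1 path; the rest of the
# agent/task/crew setup stays exactly as in the original snippet above.
llm = Ollama(
    model = "llama2",
    base_url = "http://localhost:11434/v1")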

zinyando commented 1 month ago

@HariNuve what's the output you are getting and what's the format you expect?

HariNuve commented 1 month ago

@pravincoder Changing the URL to http://localhost:11434/v1 didn't work; I experienced an error as follows:


    for chunk in self._stream(
  File "/home/harikrishnan/anaconda3/envs/ml_env/lib/python3.10/site-packages/langchain_community/llms/ollama.py", line 463, in _stream
    for stream_resp in self._create_generate_stream(prompt, stop, **kwargs):
  File "/home/harikrishnan/anaconda3/envs/ml_env/lib/python3.10/site-packages/langchain_community/llms/ollama.py", line 176, in _create_generate_stream
    yield from self._create_stream(
  File "/home/harikrishnan/anaconda3/envs/ml_env/lib/python3.10/site-packages/langchain_community/llms/ollama.py", line 247, in _create_stream
    raise OllamaEndpointNotFoundError(
langchain_community.llms.ollama.OllamaEndpointNotFoundError: Ollama call failed with status code 404. Maybe your model is not found and you should pull the model with `ollama pull llama2`.

At the same time, when I used "http://localhost:11434" I at least got a response.

I have checked the LLMs with `ollama list` and all my packages are up to date, yet the URL modification you suggested didn't work.
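
For anyone hitting the same thing, here is a quick sketch to double-check the server and model outside of CrewAI (it hits the native Ollama API directly, assuming the default port; the model tag may differ on your machine):

import requests

# Probe the native Ollama API (no /v1) to confirm the server is reachable and
# the model tag is actually present before pointing CrewAI at it.
resp = requests.get("http://localhost:11434/api/tags")
resp.raise_for_status()

models = [m["name"] for m in resp.json().get("models", [])]
print("available models:", models)

if not any(name.startswith("llama2") for name in models):
    print("llama2 not found locally, run: ollama pull llama2")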
HariNuve commented 1 month ago

@zinyando I am expecting it to break complex questions down into simple sub-questions. What I am getting is answers to those questions.

zinyando commented 1 month ago

> @zinyando I am expecting it to break complex questions down into simple sub-questions. What I am getting is answers to those questions.

@HariNuve I think your issue is related to your prompts not being clear enough for the LLM. I played around with it and got it working. You can definitely improve the output.

from crewai import Agent, Task, Crew
from langchain.llms import Ollama
import os

os.environ["OPENAI_API_KEY"] = "NA"

llm = Ollama(model="llama3.1", base_url="http://localhost:11434")

general_agent = Agent(
    role="Sub Question Generator",
    goal="""Break a complex question in to multiple simple sub-questions""",
    backstory="""You are an excellent english professor able to modify complex question into multiple simple questions in a way that everyone can understand.Do not Answer corresponding to any of the questions""",
    allow_delegation=False,
    verbose=True,
    llm=llm,
)

question = "What is Multiple Sclerosis and what are the symptoms in young adults"

task = Task(
    description=f"Break the following complex question into multiple simple sub-questions: {question}",
    agent=general_agent,
    expected_output="A breakdown of the question without answering it clearly formatted as markdown.",
)

crew = Crew(agents=[general_agent], tasks=[task], verbose=2)

result = crew.kickoff(
    inputs={"question": question},
)

print(result)

Be sure to set the Ollama model to the one you want to use. An example output I got is below:

# Breakdown of Complex Question
## 1. What is Multiple Sclerosis?
### 1.1. Definition
What is the medical definition of Multiple Sclerosis?
### 1.2. Causation
Is multiple sclerosis caused by a viral infection or an autoimmune disorder?
### 1.3. Prevalence
What is the estimated prevalence of multiple sclerosis in young adults?

## 2. What are the Symptoms in Young Adults?
### 2.1. Common Symptoms
What are the most common symptoms experienced by young adults with multiple sclerosis?
### 2.2. Rare Symptoms
Are there any rare or unusual symptoms that can be associated with multiple sclerosis in young adults?
### 2.3. Age-Specific Symptoms
How do the symptoms of multiple sclerosis differ in young adults compared to older adults?

Just so you know, I think this type of question is better suited to Discord rather than GitHub issues. You can join it here: Join CrewAI Discord