phidatahq / phidata

Build AI Agents with memory, knowledge, tools and reasoning. Chat with them using a beautiful Agent UI.
https://docs.phidata.com
Mozilla Public License 2.0
15.63k stars · 2.15k forks

phidata/cookbook/agents/03_agent_team.py with openai.OpenAIError #1384

Closed · tangd closed this issue 1 day ago

tangd commented 3 weeks ago

I am trying to run the code in phidata/cookbook/agents/03_agent_team.py, and I have changed the model to my local ollama, but there is an error in the code. Please help me fix it.

```python
from phi.agent import Agent
from phi.tools.duckduckgo import DuckDuckGo
from phi.tools.yfinance import YFinanceTools
from phi.model.ollama import Ollama

web_agent = Agent(
    name="Web Agent",
    role="Search the web for information",
    # model=OpenAIChat(id="gpt-4o"),
    model=Ollama(id="llama3.2"),
    tools=[DuckDuckGo()],
    instructions=["Always include sources"],
    show_tool_calls=True,
    markdown=True,
)

finance_agent = Agent(
    name="Finance Agent",
    role="Get financial data",
    # model=OpenAIChat(id="gpt-4o"),
    model=Ollama(id="llama3.2"),
    tools=[YFinanceTools(stock_price=True, analyst_recommendations=True, company_info=True)],
    instructions=["Use tables to display data"],
    show_tool_calls=True,
    markdown=True,
)

agent_team = Agent(
    team=[web_agent, finance_agent],
    instructions=["Always include sources", "Use tables to display data"],
    show_tool_calls=True,
    markdown=True,
)

agent_team.print_response("Summarize analyst recommendations and share the latest news for NVDA", stream=True)
```

error message

```
/Users/duotang/PycharmProjects/phidata/.venv/bin/python /Users/duotang/PycharmProjects/phidata/cookbook/agents/03_agent_team.py
▰▱▱▱▱▱▱ Thinking...
┏━ Message ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Summarize analyst recommendations and share the latest news for NVDA         ┃
┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛
Traceback (most recent call last):
  File "/Users/duotang/PycharmProjects/phidata/cookbook/agents/03_agent_team.py", line 36, in <module>
    agent_team.print_response("Summarize analyst recommendations and share the latest news for NVDA", stream=True)
  File "/Users/duotang/PycharmProjects/phidata/phi/agent/agent.py", line 2598, in print_response
    for resp in self.run(message=message, messages=messages, stream=True, **kwargs):
  File "/Users/duotang/PycharmProjects/phidata/phi/agent/agent.py", line 1603, in _run
    for model_response_chunk in self.model.response_stream(messages=messages_for_model):
  File "/Users/duotang/PycharmProjects/phidata/phi/model/openai/chat.py", line 803, in response_stream
    for response in self.invoke_stream(messages=messages):
  File "/Users/duotang/PycharmProjects/phidata/phi/model/openai/chat.py", line 368, in invoke_stream
    yield from self.get_client().chat.completions.create(
  File "/Users/duotang/PycharmProjects/phidata/phi/model/openai/chat.py", line 182, in get_client
    return OpenAIClient(**_client_params)
  File "/Users/duotang/PycharmProjects/phidata/.venv/lib/python3.10/site-packages/openai/_client.py", line 105, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
```

manthanguptaa commented 3 weeks ago

@tangd this will get resolved when you set model=Ollama(id="llama3.2") on your agent team leader as well. Your team leader will look something like this:

```python
agent_team = Agent(
    model=Ollama(id="llama3.2"),
    team=[web_agent, finance_agent],
    instructions=["Always include sources", "Use tables to display data"],
    show_tool_calls=True,
    markdown=True,
)
```
tangd commented 3 weeks ago

@manthanguptaa Thanks for the quick response. I tried your suggestion, but I received a new error message.

```
Traceback (most recent call last):
  File "/Users/duotang/PycharmProjects/phidata/cookbook/agents/03_agent_team.py", line 37, in <module>
    agent_team.print_response("Summarize analyst recommendations and share the latest news for NVDA", stream=True)
  File "/Users/duotang/PycharmProjects/phidata/phi/agent/agent.py", line 2598, in print_response
    for resp in self.run(message=message, messages=messages, stream=True, **kwargs):
  File "/Users/duotang/PycharmProjects/phidata/phi/agent/agent.py", line 1603, in _run
    for model_response_chunk in self.model.response_stream(messages=messages_for_model):
  File "/Users/duotang/PycharmProjects/phidata/phi/model/ollama/chat.py", line 577, in response_stream
    for response in self.invoke_stream(messages=messages):
  File "/Users/duotang/PycharmProjects/phidata/phi/model/ollama/chat.py", line 224, in invoke_stream
    yield from self.get_client().chat(
  File "/Users/duotang/PycharmProjects/phidata/.venv/lib/python3.10/site-packages/ollama/_client.py", line 85, in _stream
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: json: cannot unmarshal array into Go struct field .tools.function.parameters.properties.type of type string
```

```python
from phi.agent import Agent
from phi.tools.duckduckgo import DuckDuckGo
from phi.tools.yfinance import YFinanceTools
from phi.model.ollama import Ollama

web_agent = Agent(
    name="Web Agent",
    role="Search the web for information",
    # model=OpenAIChat(id="gpt-4o"),
    model=Ollama(id="llama3.2"),
    tools=[DuckDuckGo()],
    instructions=["Always include sources"],
    show_tool_calls=True,
    markdown=True,
)

finance_agent = Agent(
    name="Finance Agent",
    role="Get financial data",
    # model=OpenAIChat(id="gpt-4o"),
    model=Ollama(id="llama3.2"),
    tools=[YFinanceTools(stock_price=True, analyst_recommendations=True, company_info=True)],
    instructions=["Use tables to display data"],
    show_tool_calls=True,
    markdown=True,
)

agent_team = Agent(
    model=Ollama(id="llama3.2"),
    team=[web_agent, finance_agent],
    instructions=["Always include sources", "Use tables to display data"],
    show_tool_calls=True,
    markdown=True,
)

agent_team.print_response("Summarize analyst recommendations and share the latest news for NVDA", stream=True)
```
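[Editor's note] A reading of the error itself, not confirmed by the phidata maintainers: Ollama's chat API unmarshals each tool parameter's `type` into a Go string, so a JSON-schema type array such as `["string", "null"]` (commonly generated for Optional parameters) would trigger exactly this "cannot unmarshal array" error. A hypothetical workaround sketch that collapses such arrays before the schema is sent (`normalize_types` is illustrative, not a phidata API):

```python
def normalize_types(schema: dict) -> dict:
    """Recursively collapse JSON-schema type arrays like ["string", "null"]
    into a single type string, the shape Ollama's API appears to expect."""
    if isinstance(schema, dict):
        t = schema.get("type")
        if isinstance(t, list):
            # Prefer the first concrete (non-"null") entry; default to "string".
            concrete = [x for x in t if x != "null"]
            schema["type"] = concrete[0] if concrete else "string"
        for value in schema.values():
            if isinstance(value, dict):
                normalize_types(value)
            elif isinstance(value, list):
                for item in value:
                    if isinstance(item, dict):
                        normalize_types(item)
    return schema

# Shape that would trigger the unmarshal error: an optional parameter
# whose type is emitted as an array.
tool_schema = {
    "type": "object",
    "properties": {
        "symbol": {"type": "string"},
        "period": {"type": ["string", "null"]},
    },
}
print(normalize_types(tool_schema)["properties"]["period"])
```

If this is indeed the cause, rewriting the schema like this is only a stopgap until the library handles it.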

d8rt8v commented 2 weeks ago

Yeah, same error (`ollama._types.ResponseError: json: cannot unmarshal array into Go struct field .tools.function.parameters.properties.type of type string`) with this code:

```python
from phi.model.ollama import Ollama
from phi.agent import Agent
from phi.tools.duckduckgo import DuckDuckGo
from phi.tools.newspaper4k import Newspaper4k

web_searcher = Agent(
    name="Web Searcher",
    role="Searches the web for information on a topic",
    model=Ollama(id="llama3.2", host='http://192.168.8.210:11434'),
    tools=[DuckDuckGo()],
    add_datetime_to_instructions=True
)

article_reader = Agent(
    name="Article Reader",
    model=Ollama(id="llama3.2", host='http://192.168.8.210:11434'),
    role="Reads articles from URLs.",
    tools=[Newspaper4k()]
)

agent_team = Agent(
    name="Writing Team",
    team=[web_searcher, article_reader],
    model=Ollama(id="llama3.2", host='http://192.168.8.210:11434'),
    instructions=[
        "First, search web for what the user is asking about.",
        "Then, ask the article reader to read the links for the stories to get more information.",
        "Important: you must provide the article reader with the links to read.",
        "Finally, provide a thoughtful and engaging summary.",
    ],
    markdown=True,
    show_tool_calls=True
)

agent_team.print_response("Cybersecurity landscape in 2024")
```

And 400 error in ollama serve


Harry737 commented 2 weeks ago

Hi @tangd, @manthanguptaa

I'm also facing the same issue when using CSV tools with an Ollama model. Have you found any solution?

ronnie-1205 commented 2 weeks ago

hey @tangd, @d8rt8v, @Harry737, I was also facing the same issue while running the following code, so I asked on their Discord server, and one of the staff said that this is a known issue when using Ollama with agent teams and that they are currently working on a fix.

I have asked him whether there are any other free models we can use; I'm waiting for his response and will update you when something happens.



```python
from phi.agent import Agent
from phi.model.ollama import Ollama
from phi.tools.duckduckgo import DuckDuckGo
from pathlib import Path
from phi.tools.newspaper4k import Newspaper4k
from phi.tools.file import FileTools

urls_file = Path(__file__).parent.joinpath("tmp", "urls__{session_id}.md")
urls_file.parent.mkdir(parents=True, exist_ok=True)

searcher = Agent(
    name="Searcher",
    model=Ollama(id="llama3.1:8b"),
    role="Searches the top URLs for a topic",
    instructions=[
        "Given a topic, first generate a list of 3 search terms related to that topic.",
        "For each search term, search the web and analyze the results.",
        "Return the 10 most relevant URLs to the topic.",
        "You are writing for the New York Times, so the quality of the sources is important.",
    ],
    tools=[DuckDuckGo()],
    save_response_to_file=str(urls_file),
    add_datetime_to_instructions=True,
)

writer = Agent(
    name="Writer",
    model=Ollama(id="llama3.1:8b"),
    role="Writes a high-quality article",
    description=(
        "You are a senior writer for the New York Times. Given a topic and a list of URLs, "
        "your goal is to write a high-quality NYT-worthy article on the topic."
    ),
    instructions=[
        f"First read all urls in {urls_file.name} using `get_article_text`.",
        "Then write a high-quality NYT-worthy article on the topic.",
        "The article should be well-structured, informative, engaging and catchy.",
        "Ensure the length is at least as long as a NYT cover story -- at a minimum, 15 paragraphs.",
        "Ensure you provide a nuanced and balanced opinion, quoting facts where possible.",
        "Focus on clarity, coherence, and overall quality.",
        "Never make up facts or plagiarize. Always provide proper attribution.",
        "Remember: you are writing for the New York Times, so the quality of the article is important.",
    ],
    tools=[Newspaper4k(), FileTools(base_dir=urls_file.parent)],
    add_datetime_to_instructions=True,
)

editor = Agent(
    name="Editor",
    model=Ollama(id="llama3.1:8b"),
    team=[searcher, writer],
    description="You are a senior NYT editor. Given a list of topics, your goal is to write a NYT-worthy article about each of them, to be put in an actual newspaper.",
    instructions=[
        "You will be provided with a list of topics for all the different sections of the newspaper.",
        "You have to carefully select all the topics from the list one by one.",
        "Once a topic has been selected, first ask the search journalist to search for the most relevant URLs for that topic.",
        "Then ask the writer to get an engaging draft of the article.",
        "Edit, proofread, and refine the article to ensure it meets the high standards of the New York Times.",
        "The article should be extremely articulate and well written.",
        "Focus on clarity, coherence, and overall quality.",
        "Once the article is done, write the article to a file called `draft.txt` in the same directory as this script.",
        "Once all the articles have been done and saved in draft.txt, take all the articles and reformat them in a newspaper format, dividing the articles into different sections, with various headlines in each section.",
        "Finally, write the above reformatted text into a file called final.txt.",
        "Remember: you are the final gatekeeper before the newspaper is published, so make sure that all the topics have been covered and that the newspaper is perfect.",
    ],
    add_datetime_to_instructions=True,
    markdown=True,
)

editor.print_response("""The list of topics are:
                      Topic 1-) Tech,
                      Topic 2-) Design,
                      Topic 3-) India Politics,
                      Topic 4-) World News,
                      Topic 5-) Science,
                      Topic 6-) Business""", stream=True)
```
manthanguptaa commented 1 day ago

Hey everyone! We released the OllamaTools class to fix this issue with Ollama. Here are the associated cookbooks: https://github.com/phidatahq/phidata/tree/main/cookbook/providers/ollama_tools
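[Editor's note] For anyone landing here later, a minimal sketch of the earlier team example rewritten with the new class. This assumes the `from phi.model.ollama import OllamaTools` import path shown in the linked cookbooks and a local Ollama server with `llama3.2` pulled; check the cookbooks for the authoritative version.

```python
from phi.agent import Agent
from phi.model.ollama import OllamaTools  # assumed import path, per the linked cookbooks
from phi.tools.duckduckgo import DuckDuckGo

web_agent = Agent(
    name="Web Agent",
    role="Search the web for information",
    model=OllamaTools(id="llama3.2"),  # OllamaTools instead of Ollama for tool calling
    tools=[DuckDuckGo()],
    show_tool_calls=True,
    markdown=True,
)

agent_team = Agent(
    model=OllamaTools(id="llama3.2"),  # the team leader needs its own model too
    team=[web_agent],
    show_tool_calls=True,
    markdown=True,
)

agent_team.print_response("Share the latest news for NVDA", stream=True)
```

Note this requires a running Ollama server, so it is not runnable as-is in CI.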