phidatahq / phidata

Build AI Agents with memory, knowledge, tools and reasoning. Chat with them using a beautiful Agent UI.
https://docs.phidata.com

agent-api fails with gemini models #1347

Open · anandanand84 opened this issue 1 week ago

anandanand84 commented 1 week ago

A sample agent with the following configuration fails when the /run endpoint is called from localhost:8000/docs, with the error below.

return Agent(
        name="Gemini Agent",
        agent_id="Gemini-agent",
        session_id=session_id,
        user_id=user_id,
        # The model to use for the agent
        model=Gemini(model_id="gemini-1.5-flash"),
        # Tools available to the agent
        tools=[],
        # A description of the agent that guides its overall behavior
        description="You are a highly advanced AI agent with access to an extensive knowledge base and powerful web-search capabilities.",
        # A list of instructions to follow, each as a separate item in the list
        instructions=[
            "Always search your knowledge base first.\n"
            "  - Search your knowledge base before seeking external information.\n"
            "  - Provide answers based on your existing knowledge whenever possible.",
            "Then search the web if no information is found in your knowledge base.\n"
            "  - If the information is not available in your knowledge base, use `duckduckgo_search` to find relevant information.",
            "Provide concise and relevant answers.\n"
            "  - Keep your responses brief and to the point.\n"
            "  - Focus on delivering the most pertinent information without unnecessary detail.",
            "Ask clarifying questions.\n"
            "  - If a user's request is unclear or incomplete, ask specific questions to obtain the necessary details.\n"
            "  - Ensure you fully understand the inquiry before formulating a response.",
            "Verify the information you provide for accuracy.",
            "Cite reliable sources when referencing external data.",
        ],
        # Format responses as markdown
        markdown=True,
        # Show tool calls in the response
        show_tool_calls=True,
        # Add the current date and time to the instructions
        add_datetime_to_instructions=True,
        # Store agent sessions in the database
        storage=example_agent_storage,
        # Enable reading the chat history from the database
        read_chat_history=True,
        # Store knowledge in a vector database
        # knowledge=example_agent_knowledge,
        # Enable searching the knowledge base
        # search_knowledge=True,
        # Enable monitoring on phidata.app
        monitoring=True,
        # Show debug logs
        debug_mode=debug_mode,
    )
DEBUG    --**-- Logging Agent Run
DEBUG    Could not create Agent run: Object of type RepeatedComposite is not JSON serializable
DEBUG    *********** Agent Run End: 91258c3e-4ce3-46e9-b52e-3809837dd8ed ***********
INFO:     172.19.0.1:47964 - "POST /v1/playground/agent/run HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/usr/local/lib/python3.12/site-packages/uvicorn/protocols/http/httptools_impl.py", line 401, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.12/site-packages/starlette/applications.py", line 113, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 187, in __call__
    raise exc
  File "/usr/local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 165, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.12/site-packages/starlette/middleware/cors.py", line 93, in __call__
    await self.simple_response(scope, receive, send, request_headers=headers)
  File "/usr/local/lib/python3.12/site-packages/starlette/middleware/cors.py", line 144, in simple_response
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 715, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 735, in app
    await route.handle(scope, receive, send)
  File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 288, in handle
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 76, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.12/site-packages/starlette/routing.py", line 73, in app
    response = await f(request)
               ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/fastapi/routing.py", line 301, in app
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/fastapi/routing.py", line 214, in run_endpoint_function
    return await run_in_threadpool(dependant.call, **values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/starlette/concurrency.py", line 39, in run_in_threadpool
    return await anyio.to_thread.run_sync(func, *args)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 2441, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 943, in run
    result = context.run(func, *args)
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/phi/playground/router.py", line 138, in agent_run
    return run_response.model_dump_json()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/pydantic/main.py", line 441, in model_dump_json
    return self.__pydantic_serializer__.to_json(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.PydanticSerializationError: Unable to serialize unknown type: <class 'proto.marshal.collections.repeated.RepeatedComposite'>
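
For context on the failure itself: the agent run completes, but serializing the RunResponse with model_dump_json() fails because the Gemini client returns protobuf containers (RepeatedComposite) that pydantic cannot turn into JSON. The sketch below is not phidata code; OpaqueSequence and RunResponseLike are made-up stand-ins used only to reproduce the same PydanticSerializationError with plain pydantic and to illustrate the kind of conversion a fix would need.

from typing import Any

from pydantic import BaseModel


class OpaqueSequence:
    """Made-up stand-in for proto.marshal.collections.repeated.RepeatedComposite."""

    def __init__(self, items):
        self._items = list(items)

    def __iter__(self):
        return iter(self._items)


class RunResponseLike(BaseModel):
    """Made-up model holding an arbitrary value, as RunResponse does for tool-call args."""

    tool_args: Any = None


resp = RunResponseLike(tool_args=OpaqueSequence(["a", "b"]))

try:
    resp.model_dump_json()  # raises PydanticSerializationError: Unable to serialize unknown type
except Exception as exc:
    print(type(exc).__name__, exc)

# Converting the opaque container to a plain list before serialization avoids the error:
resp.tool_args = list(resp.tool_args)
print(resp.model_dump_json())  # {"tool_args":["a","b"]}

Presumably the real fix belongs in the Gemini response handling, converting protobuf values to plain lists/dicts before they are stored on the RunResponse, but that is only a guess based on the traceback above.
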
manthanguptaa commented 1 week ago

Hey @anandanand84, can you share the complete code? I will try my best to help you with this issue.

anandanand84 commented 5 days ago

@manthanguptaa you can reproduce this by just changing the model to Gemini 1.5 Flash as shown above, after creating a new project using phi ws up --group api. Try the request from the Swagger UI and you will be able to reproduce the error. It doesn't work with any model other than OpenAI.
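
If it helps narrow things down, the failing call can also be reproduced without the Swagger UI by posting directly to the endpoint shown in the traceback. The payload fields below are assumptions based on a default api workspace and may differ between phidata versions, so treat this as a sketch rather than the exact schema:

import requests

# Hypothetical payload; check http://localhost:8000/docs for the exact request fields.
resp = requests.post(
    "http://localhost:8000/v1/playground/agent/run",
    json={"agent_id": "Gemini-agent", "message": "hello", "stream": False},
    timeout=60,
)
print(resp.status_code)  # 500 Internal Server Error with the Gemini model (see traceback above)
print(resp.text)
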

anandanand84 commented 5 days ago

If you want a simpler example, create a file main.py with the following contents:

from phi.agent import Agent  # noqa
#from phi.model.ollama import Ollama
from phi.playground import Playground, serve_playground_app
from phi.storage.agent.sqlite import SqlAgentStorage
from phi.model.google import Gemini
from phi.model.groq import Groq
from phi.model.openai import OpenAIChat
from phi.model.openai.like import OpenAILike
from phi.model.sambanova import Sambanova

example = Agent(
    name="example",
    agent_id="exmaple",
    # model=Sambanova(id="Meta-Llama-3.2-3B-Instruct"), # Not working, doesn't follow tool call and redirect to other agents
    # model=Groq(id="llama3-groq-70b-8192-tool-use-preview"),
    # model=Ollama(id="llama3.1"),
    model=Gemini(id="gemini-1.5-flash"),  # gemini-1.5-pro-exp-0827
    # model = OpenAIChat(id="gpt-4o"),
    # model = OpenAILike(
    #     id="mistral-nemo:latest",
    #     base_url="http://localhost:11434/v1",
    # ),
    description="You are a helpful assistant that can answer questions.",
    instructions=[],
    add_history_to_messages=True,
    stream=False,
    # additional_context=files.read_file('./prompts/router.context.md'),
    storage=SqlAgentStorage(table_name="example_agent", db_file="agents.db"),
    show_tool_calls=True,
    debug_mode=True,
)

app = Playground(agents=[example]).get_app()

if __name__ == "__main__":
    serve_playground_app("main:app", reload=True)

pip install phidata==2.5.3
python main.py

Try it from the playground at https://www.phidata.app/playground?port=7777&endpoint=localhost%253A7777&agent=exmaple

You will get the error.