All-Hands-AI / OpenHands

🙌 OpenHands: Code Less, Make More
https://all-hands.dev
MIT License

[Bug]: Agent crashes when running Gemini models #2764

Closed. Git-on-my-level closed this issue 3 months ago.

Git-on-my-level commented 4 months ago

Is there an existing issue for the same bug?

Describe the bug

The agent crashes when running gemini-1.5-flash with this error:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/vertex_ai.py", line 436, in completion
    import vertexai
ModuleNotFoundError: No module named 'vertexai'
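
The traceback shows that litellm's `vertex_ai` route imports `vertexai` lazily at call time, so the missing optional dependency only surfaces once a completion is actually attempted. A minimal sketch of that failure mode (the `vertex_ai_available` and `completion_guard` helpers are hypothetical names for illustration, not litellm APIs):

```python
import importlib.util

def vertex_ai_available() -> bool:
    """Return True if the optional `vertexai` package is installed."""
    return importlib.util.find_spec("vertexai") is not None

def completion_guard(model: str) -> str:
    # Only the 'vertex_ai/' route needs the google-cloud-aiplatform extra;
    # plain 'gemini/' models should use the Google AI Studio API instead,
    # which is why routing gemini-1.5-flash into vertex_ai is surprising here.
    if model.startswith("vertex_ai/") and not vertex_ai_available():
        raise RuntimeError(
            "vertexai import failed please run `pip install google-cloud-aiplatform`"
        )
    return "ok"
```

Checking `find_spec` up front like this would turn the mid-request crash into an early, actionable error.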

Full stacktrace below

Current OpenDevin version

ghcr.io/opendevin/opendevin:0.7

Installation and Configuration

WORKSPACE_BASE=$(pwd)/workspace
GEMINI_API_KEY="A..."
LLM_MODEL="gemini/gemini-1.5-flash"
docker run -it \
    --pull=always \
    -e SANDBOX_USER_ID=$(id -u) \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -e GEMINI_API_KEY=$GEMINI_API_KEY \
    -e LLM_MODEL=$LLM_MODEL \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name opendevin-app-$(date +%Y%m%d%H%M%S) \
    ghcr.io/opendevin/opendevin:0.7
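
As a possible stopgap (an untested sketch, not an official fix), the missing extra could be installed into the running container's virtualenv; the `/app/.venv` path comes from the traceback, and `opendevin-app-<timestamp>` stands for whatever container name the `docker run` command above generated:

```shell
# Hypothetical workaround: install the optional dependency required by
# litellm's 'vertex_ai/' route inside the already-running container.
# Replace opendevin-app-<timestamp> with the actual container name.
docker exec -it opendevin-app-<timestamp> \
    /app/.venv/bin/pip install google-cloud-aiplatform
```

Note this change is lost when the container is recreated; a proper fix belongs in the image's dependency lockfile.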

Model and Agent

Model: gemini-1.5-flash
Agent: CodeActAgent

Operating System

macOS

Reproduction Steps

  1. Run OpenDevin
  2. Give the agent any command
  3. Get the corresponding error

Logs, Errors, Screenshots, and Additional Context

Full stack

LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/vertex_ai.py", line 436, in completion
    import vertexai
ModuleNotFoundError: No module named 'vertexai'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2000, in completion
    model_response = vertex_ai.completion(
                     ^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/vertex_ai.py", line 438, in completion
    raise VertexAIError(
litellm.llms.vertex_ai.VertexAIError: vertexai import failed please run `pip install google-cloud-aiplatform`. This is required for the 'vertex_ai/' route on LiteLLM

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/opendevin/controller/agent_controller.py", line 138, in _start_step_loop
    await self._step()
  File "/app/opendevin/controller/agent_controller.py", line 316, in _step
    action = self.agent.step(self.state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/agenthub/codeact_agent/codeact_agent.py", line 220, in step
    response = self.llm.completion(
               ^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 336, in wrapped_f
    return copy(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 475, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 376, in iter
    result = action(retry_state)
             ^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 398, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
                                     ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 478, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "/app/opendevin/llm/llm.py", line 209, in wrapper
    resp = completion_unwrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 956, in wrapper
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 851, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2606, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 7540, in exception_type
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 6567, in exception_type
    raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: VertexAIException BadRequestError - vertexai import failed please run `pip install google-cloud-aiplatform`. This is required for the 'vertex_ai/' route on LiteLLM
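
The three chained tracebacks above follow Python's standard re-raise pattern: the original ModuleNotFoundError is converted into a VertexAIError, which litellm's exception_type() then maps to a BadRequestError. A self-contained sketch of that chaining (the classes and functions below are stand-ins mirroring the names in the log, not litellm's real implementations):

```python
# Stand-in exception types; the real classes live in litellm.
class VertexAIError(Exception):
    pass

class BadRequestError(Exception):
    pass

def completion():
    try:
        # Stand-in for `import vertexai` failing when the optional
        # google-cloud-aiplatform extra is not installed.
        import definitely_not_installed_vertexai  # noqa: F401
    except ModuleNotFoundError:
        raise VertexAIError(
            "vertexai import failed please run "
            "`pip install google-cloud-aiplatform`."
        )

def exception_type(e: Exception) -> Exception:
    # Mirrors the role of litellm.utils.exception_type: wrap provider
    # errors into litellm's own exception hierarchy.
    return BadRequestError(f"VertexAIException BadRequestError - {e}")

def run():
    try:
        completion()
    except VertexAIError as e:
        # `from e` preserves the chain, producing the repeated
        # "During handling of the above exception..." blocks in the log.
        raise exception_type(e) from e
```

So the BadRequestError at the bottom of the log is a wrapper; the root cause is still the missing `vertexai` module at the top.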

tobitege commented 4 months ago

Thanks for the report! It looks like the sandbox may have a missing dependency and/or an outdated poetry.lock file. We'll investigate! 👍

cc @Shimada666 @mamoodi one of you can put a tracker on this, please? 🤗