### Is there an existing issue for the same bug?

### Describe the bug

The agent crashes when running `gemini-1.5-flash` with this error:

```
Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/vertex_ai.py", line 436, in completion
    import vertexai
ModuleNotFoundError: No module named 'vertexai'

LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/vertex_ai.py", line 436, in completion
    import vertexai
ModuleNotFoundError: No module named 'vertexai'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2000, in completion
    model_response = vertex_ai.completion(
                     ^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/vertex_ai.py", line 438, in completion
    raise VertexAIError(
litellm.llms.vertex_ai.VertexAIError: vertexai import failed please run `pip install google-cloud-aiplatform`. This is required for the 'vertex_ai/' route on LiteLLM

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/opendevin/controller/agent_controller.py", line 138, in _start_step_loop
    await self._step()
  File "/app/opendevin/controller/agent_controller.py", line 316, in _step
    action = self.agent.step(self.state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/agenthub/codeact_agent/codeact_agent.py", line 220, in step
    response = self.llm.completion(
               ^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 336, in wrapped_f
    return copy(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 475, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 376, in iter
    result = action(retry_state)
             ^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 398, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
                                     ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 478, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "/app/opendevin/llm/llm.py", line 209, in wrapper
    resp = completion_unwrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 956, in wrapper
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 851, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2606, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 7540, in exception_type
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 6567, in exception_type
    raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: VertexAIException BadRequestError - vertexai import failed please run `pip install google-cloud-aiplatform`. This is required for the 'vertex_ai/' route on LiteLLM
```
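The root cause is visible in the first traceback: `import vertexai` fails inside LiteLLM's `vertex_ai` handler, and that `ModuleNotFoundError` is re-raised as the `BadRequestError` at the bottom. A minimal check to confirm this inside the app container (a sketch; it assumes the `/app/.venv` interpreter from the trace is the one in use):

```python
# Sketch: confirm the missing dependency that triggers the crash.
# Assumes this runs with the same /app/.venv interpreter as in the trace.
import importlib.util

if importlib.util.find_spec("vertexai") is None:
    # This is the import that fails at litellm/llms/vertex_ai.py:436.
    # Per the error message itself, the fix is:
    #   pip install google-cloud-aiplatform
    print("vertexai is not installed -- this reproduces the crash")
else:
    print("vertexai is installed -- the vertex_ai/ route should work")
```

Installing `google-cloud-aiplatform` into that environment (in the app's Dockerfile, or via `poetry add google-cloud-aiplatform` if the project manages dependencies with Poetry) should satisfy the import.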
### Current OpenDevin version

### Installation and Configuration

### Model and Agent

Model: `gemini-1.5-flash`
Agent: CodeActAgent

### Operating System

macOS

### Reproduction Steps

### Logs, Errors, Screenshots, and Additional Context

The full stack trace is included under "Describe the bug" above.
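One possible workaround, assuming the intent is to call the hosted Gemini API rather than a Vertex AI project: LiteLLM also exposes Gemini through its `gemini/` model prefix, which (as far as I know) talks to the Google AI Studio API and does not import the `vertexai` SDK, so it sidesteps this dependency. A sketch, with a placeholder API key:

```python
# Workaround sketch: use LiteLLM's gemini/ route (Google AI Studio) instead of
# vertex_ai/, avoiding the optional vertexai SDK dependency entirely.
# The key below is a placeholder, not a real credential.
import os

import litellm

os.environ["GEMINI_API_KEY"] = "<your-google-ai-studio-key>"

response = litellm.completion(
    model="gemini/gemini-1.5-flash",  # note the gemini/ prefix, not vertex_ai/
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```

In OpenDevin this should correspond to setting the model to `gemini/gemini-1.5-flash` in the LLM configuration.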