All-Hands-AI / OpenHands

🙌 OpenHands: Code Less, Make More
https://all-hands.dev
MIT License
31.35k stars 3.62k forks

The command line is working with Gemini AI #716

Closed timucindusunur closed 5 months ago

timucindusunur commented 5 months ago

PYTHONPATH="./" python opendevin/main.py -d ./workspace/ -i 100 -t "Write a bash script that prints hello world"

The command line is working with Gemini AI, but it gives an error during installation. What could be the reason for this?

config.toml

LLM_MODEL="gemini/gemini-pro"
LLM_API_KEY="A....."
WORKSPACE_DIR="./workspace"

foragerr commented 5 months ago

but it gives an error during installation.

What's the error? Please fill in the bug template when creating issues.

timucindusunur commented 5 months ago

(OpenDevin) root@DESKTOP-1TC1K9D:/home/timuc/OpenDevin# make run
Running the app...
/bin/sh: 1: [: unexpected operator
make[1]: Entering directory '/home/timuc/OpenDevin'
Starting backend...
make[1]: Entering directory '/home/timuc/OpenDevin'
Starting frontend...

> opendevin-frontend@0.1.0 start
> vite

VITE v5.2.8 ready in 641 ms

➜  Local:   http://localhost:3001/
➜  Network: use --host to expose
INFO:     Started server process [51464]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:3000 (Press CTRL+C to quit)
INFO:     127.0.0.1:57184 - "GET /litellm-models HTTP/1.1" 200 OK
INFO:     127.0.0.1:57188 - "GET /litellm-agents HTTP/1.1" 200 OK
INFO:     127.0.0.1:57202 - "GET /litellm-models HTTP/1.1" 200 OK
INFO:     127.0.0.1:57208 - "GET /litellm-agents HTTP/1.1" 200 OK
INFO:     ('127.0.0.1', 57220) - "WebSocket /ws" [accepted]
INFO:     connection open

============== STEP 0

PLAN: design and launch a website for a financial consulting firm for me. It should be a detailed and comprehensive piece of work.

INFO: HINT: You're not currently working on any tasks. Your next action MUST be to mark a task as in_progress.

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

timucindusunur commented 5 months ago

Requests made to "http://localhost:3001" hang and never return any results. There is no activity other than the backend output I pasted above.

foragerr commented 5 months ago

I can repro this issue.

steps:

Set Env vars:

LLM_MODEL="gemini/gemini-pro"
LLM_API_KEY="A....."

Run

PYTHONPATH="./" python opendevin/main.py -d ./workspace/ -i 100 -t "Write a bash script that prints hello world"

This works, with output of the form

Running agent MonologueAgent (model: gemini/gemini-pro, directory: ./workspace/) with task: "Write a bash script that prints hello world"

==============
STEP 0

PLAN:
Write a bash script that prints hello world

ACTION:
CmdRunAction(command='echo "hello world"', background=False, action='run')

OBSERVATION:
hello world

==============
STEP 1
...

Then start the server with make run and enter Write a bash script that prints hello world into the prompt. This results in, among other logs:

litellm.exceptions.RateLimitError: VertexAIException - vertexai import failed please run `pip install google-cloud-aiplatform`

I ran pip install google-cloud-aiplatform and restarted the server, and got a different error:

VertexAIException - Your default credentials were not found. To set up Application Default Credentials, see https://cloud.google.com/docs/authentication/external/set-up-adc for more information.
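
The second error is about Application Default Credentials, which Google's client resolves when LiteLLM takes the Vertex AI path. A minimal sketch of one way to check for them, assuming the common key-file setup (the helper name here is mine, not part of any library):

```python
import os

# Hedged sketch: when a bare "gemini-pro" model name sends LiteLLM down the
# Vertex AI path, Google's client looks up Application Default Credentials.
# One common way to supply them is a service-account key file pointed to by
# this env var (gcloud-managed credentials are another; see the set-up-adc
# link in the error message above).
def adc_key_file_configured() -> bool:
    path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS", "")
    return bool(path) and os.path.isfile(path)

print(adc_key_file_configured())
```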

foragerr commented 5 months ago

My best guess is that LLM_MODEL="gemini/gemini-pro", which the command-line invocation uses, is not the same as the gemini-pro set from the UI.
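
That guess is consistent with the errors above: in LiteLLM, the provider prefix in the model string decides which backend is used. The following routing table is purely illustrative (my reading of the behavior, not LiteLLM's actual implementation):

```python
# Illustrative-only routing sketch: a bare "gemini-pro" is treated as a
# Vertex AI model, which needs the google-cloud-aiplatform package and
# Application Default Credentials, while "gemini/gemini-pro" goes through
# the Google AI Studio path that only needs an API key.
def guess_backend(model: str) -> str:
    if model.startswith("gemini/"):
        return "google-ai-studio"   # uses LLM_API_KEY
    if model in {"gemini-pro", "gemini-pro-vision"}:
        return "vertex-ai"          # needs google-cloud-aiplatform + ADC
    return "unknown"

print(guess_backend("gemini-pro"))         # prints: vertex-ai
print(guess_backend("gemini/gemini-pro"))  # prints: google-ai-studio
```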

[Screenshot: 2024-04-04 at 8:10:35 PM]

foragerr commented 5 months ago

https://github.com/OpenDevin/OpenDevin/pull/654 fixes this issue.

Well, merging #654 and then selecting gemini/gemini-pro from the model dropdown, as opposed to gemini-pro, since both entries are available.

foragerr commented 5 months ago

@rbren would you mind merging #654 when you have a chance, please?

jpshack-at-palomar commented 5 months ago

@foragerr You are exactly right about there being an issue with gemini/gemini-pro vs. gemini-pro. See my comments here https://github.com/OpenDevin/OpenDevin/issues/653#issuecomment-2035536785 and following.

foragerr commented 5 months ago

654 is now merged.

@jpshack-at-palomar gemini/gemini-pro should be available from the UI dropdown now.

rbren commented 5 months ago

Going to close this one, but we definitely need some better model management. Thanks all!

dorbanianas commented 5 months ago

So basically you can check the Prerequisites section of the LiteLLM documentation; Gemini Pro is working now 😁