All-Hands-AI / OpenHands

🙌 OpenHands: Code Less, Make More
https://all-hands.dev

[Bug]: chatting with assistant broke #3874

Open gaord opened 3 days ago

gaord commented 3 days ago

Is there an existing issue for the same bug?

Describe the bug

When chatting with the assistant, I always get the following error:

Agent encountered an error while processing the last action. Error: APIError: litellm.APIError: APIError: OpenAIException - 'str' object has no attribute 'model_dump' Please try again.

Current OpenHands version

0.9

Installation and Configuration

As in the quick start guide:
export WORKSPACE_BASE=$(pwd)/workspace

docker run -it --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=ghcr.io/all-hands-ai/runtime:0.9-nikolaik \
    -e SANDBOX_USER_ID=$(id -u) \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name openhands-app-$(date +%Y%m%d%H%M%S) \
    ghcr.io/all-hands-ai/openhands:0.9

Model and Agent

gpt-4 via a proxy, CodeActAgent

Operating System

No response

Reproduction Steps

No response

Logs, Errors, Screenshots, and Additional Context

No response

gaord commented 3 days ago

This can be worked around by setting the Base URL to https://yourhost/v1. With an OpenAI proxy, the /v1 suffix is needed in the Base URL.
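
For reference, the error comes from litellm (which OpenHands calls under the hood, as the error message shows), and an OpenAI-compatible proxy serves the chat route under /v1/chat/completions. Below is a minimal sketch of the equivalent direct call, with placeholder host and key values:

import litellm

# Placeholder proxy host and key, only to illustrate the shape of the call.
PROXY_BASE_URL = "https://yourhost/v1"   # note the /v1 suffix
API_KEY = "sk-..."

response = litellm.completion(
    model="openai/gpt-4",        # "openai/" prefix routes through litellm's OpenAI-compatible provider
    api_base=PROXY_BASE_URL,     # without /v1 the proxy tends to answer with a bare string/HTML error,
                                 # which litellm then fails to parse ('str' has no 'model_dump')
    api_key=API_KEY,
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)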

mamoodi commented 3 days ago

Hi gaord! Just to make sure I understand: are you saying it works if you set the Base URL, or does it still not work?

If you have a proxy setup, the Base URL must be specified.

gaord commented 8 hours ago

It works.

tobitege commented 8 hours ago

@gaord do you by any chance have any more logs from the container with that error message?

mamoodi commented 8 hours ago

If you are running a proxy, you must set a base URL. See docs: https://docs.all-hands.dev/modules/usage/llms/openai-llms#using-an-openai-proxy
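
For anyone else hitting this, one way to sanity-check the proxy before pointing OpenHands at it is to call it directly with the OpenAI client, using the same Base URL (including /v1) that you would enter in the settings. A rough sketch with placeholder values:

from openai import OpenAI

# Placeholder host, key, and model name; substitute your own proxy details.
client = OpenAI(base_url="https://yourhost/v1", api_key="sk-...")

resp = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "hello"}],
)
print(resp.choices[0].message.content)

# If this call already fails or returns a non-JSON body, OpenHands will surface
# a similar litellm APIError, so fix the proxy/Base URL first.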