Open gaord opened 3 days ago
This can be worked around by setting the Base URL to https://yourhost/v1. As with an OpenAI proxy, the /v1 suffix is required in the Base URL.
Hi gaord! Just to make sure I understand: are you saying it works once you set the Base URL, or does it still not work?
If you have a proxy setup, the Base URL must be specified.
it works
@gaord do you have by any chance any more logs from the container with that error message?
If you are running a proxy, you must set a base URL. See docs: https://docs.all-hands.dev/modules/usage/llms/openai-llms#using-an-openai-proxy
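A minimal sketch of what that setup might look like when running OpenHands in Docker. The `LLM_BASE_URL` environment variable and the `https://yourhost/v1` endpoint are assumptions based on the comments above and the linked docs; adjust the image tag and model name to your own setup.

```shell
# Point OpenHands at an OpenAI-compatible proxy.
# Note the trailing /v1 — without it, requests may hit the wrong path
# and the proxy can return a plain string instead of a JSON response.
docker run -it \
  -e LLM_MODEL="gpt-4" \
  -e LLM_API_KEY="your-proxy-key" \
  -e LLM_BASE_URL="https://yourhost/v1" \
  ghcr.io/all-hands-ai/openhands:main
```

The same Base URL can alternatively be entered in the settings UI, as discussed above.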
Is there an existing issue for the same bug?
Describe the bug
When chatting with the assistant, I always get the following error:
Agent encountered an error while processing the last action. Error: APIError: litellm.APIError: APIError: OpenAIException - 'str' object has no attribute 'model_dump' Please try again.
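For context on what this message likely means: `model_dump()` is a Pydantic method, so the error suggests litellm received a plain string (e.g. a raw error body from the proxy) where it expected a structured response object. A tiny illustration (the response text is hypothetical):

```python
# Hypothetical raw reply from a misconfigured proxy: a bare string,
# not the JSON completion object litellm expects to parse.
response = "Internal server error"

try:
    # litellm internally calls .model_dump() on what it assumes is a
    # Pydantic model; on a str this raises AttributeError.
    response.model_dump()
except AttributeError as exc:
    print(exc)  # 'str' object has no attribute 'model_dump'
```

This is consistent with the fix above: with the correct `/v1` Base URL, the proxy returns a proper OpenAI-style JSON response and the error disappears.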
Current OpenHands version
Installation and Configuration
Model and Agent
gpt-4 with proxy, codeactagent
Operating System
No response
Reproduction Steps
No response
Logs, Errors, Screenshots, and Additional Context
No response