Open twhite54 opened 3 weeks ago
Thanks @twhite54! It seems that you're running through Azure. Could you try running the following program, replacing the key with your Azure key?
```python
import json

import litellm

messages = [
    {"role": "system", "content": "Respond in pirate speak."},
    {"role": "user", "content": "What is 2+2?"},
]
response = litellm.completion(
    api_key="sk-XXX",
    model="azure/o1-preview",
    messages=messages,
)
print(json.dumps(response.model_dump(), indent=2))
```
If this fails, then it's probably a problem that should be reported directly to LiteLLM: https://github.com/BerriAI/litellm/
If it works, then we'll need to investigate further, as it may be an error on our side.
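In the meantime, one possible workaround is to fold system prompts into user messages before calling the model, since the 400 error says `o1-preview` rejects the `system` role. This is only a sketch of that idea, not an OpenHands or LiteLLM feature; the helper name is made up for illustration:

```python
def strip_system_role(messages):
    """Remap "system" messages to "user" messages.

    o1-preview reportedly rejects the "system" role with a 400
    unsupported_value error, so we fold system prompts into the
    user turn instead. Hypothetical helper, not part of LiteLLM.
    """
    converted = []
    for m in messages:
        if m["role"] == "system":
            converted.append({"role": "user", "content": m["content"]})
        else:
            converted.append(m)
    return converted


messages = strip_system_role([
    {"role": "system", "content": "Respond in pirate speak."},
    {"role": "user", "content": "What is 2+2?"},
])
# messages now contains only "user" roles and can be passed to
# litellm.completion(...) as in the snippet above.
```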
Is there an existing issue for the same bug?
Describe the bug and reproduction steps
Run anything with o1-preview
There was an unexpected error while running the agent
litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}
OpenHands Installation
Docker command in README
OpenHands Version
openhands:0.12
Operating System
WSL on Windows
Logs, Errors, Screenshots, and Additional Context
Neither Ubuntu nor Docker showed any errors in the logs.
UI Error Message:
There was an unexpected error while running the agent
litellm.BadRequestError: AzureException BadRequestError - Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}