BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug] o1-preview Error: "Unsupported value: 'messages[0].role' does not support 'system' #5671

Closed · carlosp420 closed this issue 1 month ago

carlosp420 commented 1 month ago

What happened?

Running this code:

from litellm import completion

# Placeholder values; any system prompt triggers the error below.
system_message = "You are a helpful assistant."
question = "What is the capital of France?"

messages = [
    {"role": "system", "content": system_message},
    {"role": "user", "content": question},
]
completion(model="openai/o1-preview", messages=messages, temperature=0.0)

generates this error:

BadRequestError: litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "Unsupported value: 'messages[0].role' does not support 'system' with this model.", 'type': 'invalid_request_error', 'param': 'messages[0].role', 'code': 'unsupported_value'}}

This did not happen when using other OpenAI models.
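For contrast, the identical message list succeeds against a non-reasoning model. A minimal sketch (gpt-4o-mini is just an example model name, not the one from the report):

from litellm import completion

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the capital of France?"},
]

# The same payload works with non-o1 OpenAI models.
response = completion(model="openai/gpt-4o-mini", messages=messages, temperature=0.0)
print(response.choices[0].message.content)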

Manouchehri commented 1 month ago

This is not a bug; it's a limitation of OpenAI's o1 models right now.

Manouchehri commented 1 month ago

https://platform.openai.com/docs/guides/reasoning#:~:text=system%20messages%20are%20not%20supported
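Until the gateway handles the translation for you, a common client-side workaround is to fold the system prompt into the first user message before calling the model. A minimal sketch (the merge_system_into_user helper and its merge format are my own convention, not litellm API):

from litellm import completion

def merge_system_into_user(messages):
    # Models that reject the 'system' role can still receive the same
    # instructions prepended to the first user message.
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    if system_parts and rest and rest[0]["role"] == "user":
        rest[0] = {
            "role": "user",
            "content": "\n\n".join(system_parts) + "\n\n" + rest[0]["content"],
        }
    return rest

messages = merge_system_into_user([
    {"role": "system", "content": "Answer concisely."},
    {"role": "user", "content": "Explain what quicksort does."},
])
# temperature is omitted: o1-preview also rejects non-default temperature values.
completion(model="openai/o1-preview", messages=messages)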

ishaan-jaff commented 1 month ago

fixed here: https://github.com/BerriAI/litellm/pull/5666
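Assuming the merged change makes litellm translate system messages for o1 models (the PR's exact mechanism is not shown here), the original reproduction should work once you upgrade past the linked PR (pip install --upgrade litellm); a minimal sketch:

from litellm import completion

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the capital of France?"},
]

# Assumed behavior of the linked fix: litellm rewrites the message list
# so the provider no longer sees an unsupported 'system' role.
response = completion(model="openai/o1-preview", messages=messages)
print(response.choices[0].message.content)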