BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: `stream_options` with fake streaming #5803

Closed. Clad3815 closed this issue 4 days ago.

Clad3815 commented 4 days ago

What happened?

When using fake streaming with o1, we get this error when passing:

stream_options: { "include_usage": true },
stream: true,

Relevant log output

error: {
    message: `litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "The 'stream_options' parameter is only allowed when 'stream' is enabled.", 'type': 'invalid_request_error', 'param': 'stream_options', 'code': None}}\n` +
      'Received Model Group=o1-mini\n' +
      'Available Model Group Fallbacks=None',
    type: null,
    param: null,
    code: '400'
  },
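For reference, a minimal sketch of the failing call shape through the Python SDK might look like the following; the prompt and the exact model name are assumptions for illustration, not taken from the report above.

```python
import litellm

# Sketch of the reported call shape: o1 models don't support real streaming,
# so litellm simulates ("fakes") the stream. stream_options is still forwarded
# to OpenAI, which rejects it with a 400 because the upstream request is not
# actually streamed.
response = litellm.completion(
    model="o1-mini",  # model group from the error log
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,  # client asks for a stream
    stream_options={"include_usage": True},  # triggers the 400 with fake streaming
)

for chunk in response:
    print(chunk)
```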


krrishdholakia commented 4 days ago

Able to repro the issue. Fix pushed here: https://github.com/BerriAI/litellm/commit/2523e83c5f18fb6bbada6e2f7e6b358edb3e0277

Should be live in prod by EOD.
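The linked commit contains the actual change. As a rough illustration of the general approach only (not the exact patch), the fix amounts to stripping `stream_options` (and `stream`) from the parameters sent upstream whenever the provider call is made non-streaming and the stream is faked locally; the helper name below is hypothetical.

```python
def prepare_upstream_params(params: dict, fake_stream: bool) -> dict:
    """Illustrative sketch (not the actual patch): when the stream is faked,
    the upstream request is non-streaming, so OpenAI rejects stream_options.
    Drop it before sending and attach usage to the simulated chunks instead."""
    upstream = dict(params)
    if fake_stream:
        upstream.pop("stream", None)
        upstream.pop("stream_options", None)
    return upstream
```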