Closed Clad3815 closed 4 days ago
What happened?

When using fake streaming with o1 we get this error whenever the request includes:

stream_options: { "include_usage": true }, stream: true

Relevant log output

error: {
  message: `litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "The 'stream_options' parameter is only allowed when 'stream' is enabled.", 'type': 'invalid_request_error', 'param': 'stream_options', 'code': None}}
Received Model Group=o1-mini
Available Model Group Fallbacks=None`,
  type: null,
  param: null,
  code: '400'
}
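For context, a minimal sketch of the request shape that triggers this (names and values are illustrative, taken from the log above). With o1 models LiteLLM "fake streams": the client asks for a stream, but the upstream call is made non-streaming, so forwarding stream_options unchanged makes the OpenAI API reject it, since stream_options is only valid alongside stream=True.

```python
# Illustrative request body sent by the client to the LiteLLM proxy.
# stream=True plus stream_options is valid at the proxy boundary, but
# once the upstream call is rewritten to stream=False, stream_options
# must be dropped too or OpenAI returns the 400 shown above.
request_body = {
    "model": "o1-mini",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,
    "stream_options": {"include_usage": True},
}
```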
Able to repro the issue. Fix pushed here - https://github.com/BerriAI/litellm/commit/2523e83c5f18fb6bbada6e2f7e6b358edb3e0277

Should be live in prod by EOD.
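The gist of the fix can be sketched as a small parameter-sanitizing step (this is an illustrative helper, not the actual code in the linked commit): before making the non-streaming upstream call, drop stream_options if stream is not enabled.

```python
def sanitize_params(params: dict) -> dict:
    """Hypothetical helper: strip stream_options when stream is off.

    When LiteLLM fake-streams (e.g. for o1 models), the upstream call is
    made with stream=False; OpenAI only accepts stream_options together
    with stream=True, so it must be removed to avoid the 400 error.
    """
    cleaned = dict(params)  # don't mutate the caller's dict
    if not cleaned.get("stream"):
        cleaned.pop("stream_options", None)
    return cleaned

# Non-streaming upstream call: stream_options gets dropped.
upstream = sanitize_params({
    "model": "o1-mini",
    "stream": False,
    "stream_options": {"include_usage": True},
})
```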