BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[BUG]: batch_completion() - 'max_workers' Error #5973

Open gauss5930 opened 1 month ago

gauss5930 commented 1 month ago

I was trying to use the `batch_completion()` function with "gpt-4o-mini-2024-07-18". However, the run failed with `Error code: 400 - {'error': {'message': 'Unrecognized request argument supplied: max_workers', 'type': 'invalid_request_error', 'param': None, 'code': None}}`. How can I resolve this problem?
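The error suggests the `max_workers` argument is being forwarded to the OpenAI API instead of being consumed client-side. As a possible workaround until the fix lands, one can parallelize plain `completion()` calls manually so that `max_workers` stays in the client's thread pool and never reaches the request payload. This is a minimal sketch, not litellm's actual implementation; `run_batch` and `call_model` are hypothetical names introduced here:

```python
from concurrent.futures import ThreadPoolExecutor

def run_batch(messages_list, call_model, max_workers=4):
    # Hypothetical workaround: max_workers only configures the local
    # thread pool, so it is never sent to the OpenAI API.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(call_model, messages_list))

# With litellm this might be used as (assumption, not verified here):
# from litellm import completion
# results = run_batch(
#     [[{"role": "user", "content": "Hello"}]] * 3,
#     lambda msgs: completion(model="gpt-4o-mini-2024-07-18", messages=msgs),
# )
```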

josearangos commented 1 month ago

@gauss5930 The problem is solved here; the fix is pending approval.