BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: JSON Mode Fallbacks VertexAI Gemini Failure 500 #5338

Closed: umuthopeyildirim closed this issue 2 months ago

umuthopeyildirim commented 2 months ago

What happened?

We recently enabled structured outputs, but LiteLLM is not correctly converting the OpenAI JSON mode schema to the VertexAI schema. We use LangChain.js's ChatOpenAI to orchestrate all our requests. We're not sure whether this is a feature request or a bug.

I tried to solve the first problem by deleting $schema and additionalProperties from the payload, but that resulted in a 500 error.
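
For reference, here is a minimal sketch of the kind of OpenAI-format tool payload that gets forwarded to Vertex AI. The function name and schema contents are illustrative, reconstructed from the error log below; the $schema and additionalProperties keys inside parameters are the two fields Vertex AI rejects.

// Illustrative request shape only, reconstructed from the error log -- not an exact capture.
const openAIStyleRequest = {
  model: 'gemini-1.5-flash-latest',
  messages: [{ role: 'user', content: 'Suggest follow-up questions.' }],
  tools: [
    {
      type: 'function',
      function: {
        name: 'extract', // hypothetical function name
        parameters: {
          type: 'object',
          properties: {
            suggestions: { type: 'array', items: { type: 'string' } },
          },
          required: ['suggestions'],
          additionalProperties: false, // rejected by Vertex AI
          $schema: 'http://json-schema.org/draft-07/schema#', // rejected by Vertex AI
        },
      },
    },
  ],
}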

This is our stack:

Relevant log output

BEFORE DELETING $schema AND additionalProperties
Message: LLM API call failed: `litellm.BadRequestError: VertexAIException BadRequestError - b'{\n  "error": {\n    "code": 400,\n    "message": "Invalid JSON payload received. Unknown name \\"additionalProperties\\" at \'tools[0].function_declarations[0].parameters\': Cannot find field.\\nInvalid JSON payload received. Unknown name \\"$schema\\" at \'tools[0].function_declarations[0].parameters\': Cannot find field.",\n    "status": "INVALID_ARGUMENT",\n    "details": [\n      {\n        "@type": "type.googleapis.com/google.rpc.BadRequest",\n        "fieldViolations": [\n          {\n            "field": "tools[0].function_declarations[0].parameters",\n            "description": "Invalid JSON payload received. Unknown name \\"additionalProperties\\" at \'tools[0].function_declarations[0].parameters\': Cannot find field."\n          },\n          {\n            "field": "tools[0].function_declarations[0].parameters",\n            "description": "Invalid JSON payload received. Unknown name \\"$schema\\" at \'tools[0].function_declarations[0].parameters\': Cannot find field."\n          }\n        ]\n      }\n    ]\n  }\n}\n'

AFTER DELETING $schema AND additionalProperties
Message: LLM API call failed: `litellm.InternalServerError: VertexAIException InternalServerError - b'{\n  "error": {\n    "code": 500,\n    "message": "An internal error has occurred. Please retry or report in https://developers.generativeai.google/guide/troubleshooting%22,\n    "status": "INTERNAL"\n  }\n}\n'
Model: gemini-1.5-flash-latest`

Twitter / LinkedIn details

https://www.linkedin.com/in/umuthopeyildirim/

krrishdholakia commented 2 months ago

Hi @umuthopeyildirim, which version is this on?

And how do I repro the problem?

For context, here's our test for Vertex AI with JSON schema, which passes: https://github.com/BerriAI/litellm/blob/1765976ce0db8dd434876ed4833f103398dec998/litellm/tests/test_amazing_vertex_completion.py#L1540

umuthopeyildirim commented 2 months ago

Hey @krrishdholakia, we just updated to the latest LiteLLM Docker container image and the issue still persists.

This is the code snippet we are using:

import { z } from 'zod'

// Schema for the structured output we expect from the model.
const SuggestionsSchema = z.object({
    suggestions: z.array(z.string().describe('A suggestion for the user')),
})

// llmModel.chatModel is our ChatOpenAI instance (requests are routed through the LiteLLM proxy).
const suggestionsModel = llmModel.chatModel.withStructuredOutput(SuggestionsSchema)

const stream = await suggestionsModel.stream(convertToLangChainMessages(messages))
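
For context on why this fails: withStructuredOutput serializes the zod schema to JSON Schema before binding it to the request. A minimal sketch of that conversion (assuming the zod-to-json-schema package that LangChain.js uses under the hood) shows where the rejected keys come from.

import { z } from 'zod'
import { zodToJsonSchema } from 'zod-to-json-schema'

const SuggestionsSchema = z.object({
    suggestions: z.array(z.string().describe('A suggestion for the user')),
})

// Prints a draft-07 JSON Schema that includes "additionalProperties": false and
// "$schema": "http://json-schema.org/draft-07/schema#" -- the two keys Vertex AI
// rejects at tools[0].function_declarations[0].parameters.
console.log(JSON.stringify(zodToJsonSchema(SuggestionsSchema), null, 2))
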
krrishdholakia commented 2 months ago

I have a repro. I think this is the same issue as with tool calling, where we recently had to clean up the input.

Working on it
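
For anyone hitting this in the meantime, here is a rough sketch of the kind of input cleanup being described (not LiteLLM's actual implementation): recursively drop the JSON Schema keys that the Vertex AI / Gemini API does not accept in function parameters before the request is sent.

// Rough workaround sketch, not LiteLLM's actual code: recursively strip
// JSON Schema keys that Vertex AI rejects in function declaration parameters.
const UNSUPPORTED_KEYS = new Set(['$schema', 'additionalProperties'])

function cleanSchemaForVertex(value: unknown): unknown {
  if (Array.isArray(value)) {
    return value.map(cleanSchemaForVertex)
  }
  if (value !== null && typeof value === 'object') {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>)
        .filter(([key]) => !UNSUPPORTED_KEYS.has(key))
        .map(([key, v]) => [key, cleanSchemaForVertex(v)]),
    )
  }
  return value
}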

krrishdholakia commented 2 months ago

Relevant issue on LangChain.js about this: https://github.com/langchain-ai/langchainjs/issues/5240

umuthopeyildirim commented 2 months ago

Thank you. By the way, we have a dedicated instance at TogetherAI that supports JSON mode. After we enabled enable_json_schema_validation: True, it started falling back to a supported model, which is great. However, we need to add JSON mode support to our dedicated model on LiteLLM. How can we do that?

krrishdholakia commented 2 months ago

"TogetherAI that supports JSON mode"

Got it, I'll add this on our end.

I can see they support it now: https://docs.together.ai/docs/json-mode

krrishdholakia commented 2 months ago

TogetherAI JSON mode + Vertex AI additionalProperties cleanup implemented, @umuthopeyildirim.

Will be live in today's release
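
Once the release is out, one way to sanity-check JSON mode end to end is to call the proxy's OpenAI-compatible endpoint directly. The base URL, API key, and model alias below are placeholders for your own deployment, so treat this as a sketch rather than a verified repro.

import OpenAI from 'openai'

// Placeholders: point these at your own LiteLLM proxy and model alias.
const client = new OpenAI({
  baseURL: 'http://localhost:4000',
  apiKey: process.env.LITELLM_API_KEY,
})

const completion = await client.chat.completions.create({
  model: 'my-together-dedicated-model', // hypothetical model alias from the proxy config
  messages: [{ role: 'user', content: 'Return three suggestions as a JSON object.' }],
  response_format: { type: 'json_object' }, // JSON mode
})

console.log(completion.choices[0].message.content)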

umuthopeyildirim commented 2 months ago

Hi @krrishdholakia, I can still reproduce the issue after updating to the latest LiteLLM Docker image.