Closed: li-dennis closed this issue 4 months ago.
@li-dennis can you share the request you're sending to the LiteLLM Proxy server so I can repro it?
If you run the proxy with --detailed_debug, you can see the raw request being made by litellm, @li-dennis.
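For reference, starting the proxy that way looks like this (the config path is just a placeholder):

litellm --config /path/to/config.yaml --detailed_debug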
Sorry about the delay. I looked into the actual tool arguments getting passed along to Vertex:
...
tool_choice: "auto",
tools: [
  {
    type: "function",
    function: {
      name: "foo",
      description: "bar",
      parameters: {
        type: "object",
        properties: {
          args: {
            additionalProperties: False,
            properties: {
              x: { title: "X", type_: "NUMBER" },
            },
            required: [
              "x",
            ],
            title: "Args",
            type_: "OBJECT"
          }
        },
        required: ["args"]
      }
    }
  }
]
...
In this case, args is a JSON schema generated via Pydantic, and it includes an additionalProperties: False field. I'd guess the OpenAI/Anthropic APIs are OK with JSON schemas containing additionalProperties, but Vertex is not.
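For context on where that field comes from: in Pydantic v1, forbidding extra fields is what adds additionalProperties: False to the generated schema. A minimal illustrative sketch (this model is hypothetical, not the actual code):

from pydantic import BaseModel, Field

class Args(BaseModel):
    x: float = Field(..., title="X")

    class Config:
        extra = "forbid"  # Pydantic v1: this emits "additionalProperties": False

print(Args.schema())
# {'title': 'Args', 'type': 'object',
#  'properties': {'x': {'title': 'X', 'type': 'number'}},
#  'required': ['x'], 'additionalProperties': False}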
That's helpful for the repro, thanks @li-dennis. Planning to have this fixed by tomorrow.
Trying to monkey-patch a solution as well, and it looks like Vertex AI might only support a subset of JSON schema:
VertexAIException - Protocol message Schema has no "anyOf"
"Vertex AI offers limited support of the OpenAPI schema. The following attributes are supported: type, nullable, required, format, description, properties, items, enum. The following attributes are not supported: default, optional, maximum, oneOf." https://cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling#function-declarations
Also running into other issues, which look like they might just be limitations of Vertex AI itself:
VertexAIException - 400 Unable to submit request because Function Calling is not supported with non-text input. Remove the function declarations or remove inline_data/file_data from contents
@li-dennis I'm unable to repro your issue. Here's what I ran:
# Repro attempt (adapted from litellm's test suite; assumes Pydantic v1,
# i.e. Args.schema() and Config.schema_extra).
import asyncio

import litellm
from pydantic import BaseModel, Field

load_vertex_ai_credentials()  # helper defined in litellm's test suite
litellm.set_verbose = True

messages = [
    {
        "role": "system",
        "content": "Your name is Litellm Bot, you are a helpful assistant",
    },
    # User asks for their name and weather in San Francisco
    {
        "role": "user",
        "content": "Hello, what is your name and can you tell me the weather?",
    },
]

class Args(BaseModel):
    x: float = Field(..., title="X")

    class Config:
        title = "Args"
        schema_extra = {
            "type": "object",
            "properties": {"x": {"type": "number", "title": "X"}},
            "required": ["x"],
            "additionalProperties": False,
        }

# Generate JSON schema
json_schema = Args.schema()
print(f"json_schema: {json_schema}")

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather in a given location",
            "parameters": json_schema,
        },
    }
]

data = {
    "model": "vertex_ai/gemini-1.5-pro-preview-0514",
    "messages": messages,
    "tools": tools,
}

# In the test, sync_mode is parametrized; wrapped here so both branches run.
async def main(sync_mode: bool = True):
    if sync_mode:
        response = litellm.completion(**data)
    else:
        response = await litellm.acompletion(**data)
    print(f"response: {response}")

asyncio.run(main())
I also tried running it with a bool value, e.g.:
class Args(BaseModel):
    x: float = Field(..., title="X")
    is_true: bool  # 👈 KEY CHANGE

    class Config:
        title = "Args"
        schema_extra = {
            "type": "object",
            "properties": {"x": {"type": "number", "title": "X"}},
            "required": ["x"],
            "additionalProperties": False,
        }
The correct JSON schema output should be:
json_schema: {'properties': {'x': {'title': 'X', 'type': 'number'}, 'is_true': {'title': 'Is True', 'type': 'boolean'}}, 'required': ['x', 'is_true'], 'title': 'Args', 'type': 'object'}
Note how is_true = {'title': 'Is True', 'type': 'boolean'}.
That works too!
Closing as I'm unable to repro this. @li-dennis, please bump if you're able to share a consistent repro of your problem with a code snippet we can use for testing.
Hi @li-dennis, can we set up a 1:1 Slack support channel for your team? It will help us resolve this issue faster if you get a consistent repro.
I DM'd you on LinkedIn, @li-dennis. Here's our Discord if that's easier: https://discord.com/invite/wuPM9dRgDw
What happened?
Hi all,
Thanks for the great work on litellm! Apologies if this isn't a bug and just user error, but I've been unsure of what to make of this exception. I've been trying to get Gemini via the Vertex AI API to play nicely with litellm, but I run into this issue pretty quickly:
I'm using the litellm proxy (v1.36.1) with the OpenAI Python SDK (1.30.4).
This is my litellm config:
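(The config block didn't survive in this report. For reference only, a typical Vertex AI entry in a litellm proxy config looks roughly like this; the project and location values are placeholders:)

model_list:
  - model_name: gemini-1.5-pro
    litellm_params:
      model: vertex_ai/gemini-1.5-pro-preview-0514
      vertex_project: my-gcp-project   # placeholder
      vertex_location: us-central1     # placeholder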
I haven't changed my messages in any way, and my requests work just fine with OpenAI/Anthropic models. Perhaps the error is due to missing additionalProperties in the tool call schemas? Or from https://cloud.google.com/vertex-ai/docs/ml-metadata/system-schemas?