Closed jnunderwood closed 3 months ago
@jnunderwood Thank you for reaching out to us.
Do you use the model `gpt-3.5-turbo-0613` or a newer model version with a suffix ending in `-0613` for `function_call`? If you are using the Azure OpenAI service, I would recommend deploying at least version `gpt-3.5-turbo-0613` or `gpt-4-0613` to use the `function_call` feature; for non-Azure OpenAI, I would try a model version with a suffix ending in `-0613`.
Thanks
Hi @jnunderwood. Thank you for opening this issue and giving us the opportunity to assist. To help our team better understand your issue and the details of your scenario, please provide a response to the question asked above or the information requested above. This will help us more accurately address your issue.
> Do you use the model `gpt-3.5-turbo-0613` or a newer model version with a suffix ending in `-0613` for `function_call`?
I'm not trying to use any new features in beta3, I'm just trying to upgrade the library to the latest version. Do I still need to use a newer model with beta3?
> I'm not trying to use any new features in beta3, I'm just trying to upgrade the library to the latest version. Do I still need to use a newer model with beta3?
Thank you for the reply. You don't need to use a newer model with beta.3 for the existing features.
I tried to reproduce it with the code snippet you shared in this sample, but I wasn't able to reproduce it locally. Can you share a more detailed sample?
Here is a longer code snippet. Note that the `chat()` method needs to be called from another class and must be passed a list of chat messages. However, the content of those messages should not matter, e.g., `[{"role": "user", "content": "hello"}]`. Substitute `myazurekey` and `myendpoint` with values you can use.

```java
public class ChatService {
    protected OpenAIClient client;

    public ChatService() {
        this.client = new OpenAIClientBuilder()
                .credential(new AzureKeyCredential("myazurekey"))
                .endpoint("https://myendpoint.openai.azure.com")
                .buildClient();
    }

    public ChatCompletions chat(List<ChatMessage> messages) {
        ChatCompletionsOptions chatOptions = getChatCompletionsOptions(messages);
        ChatCompletions completions = this.client.getChatCompletions("gpt-35-turbo", chatOptions);
        return completions;
    }

    protected ChatCompletionsOptions getChatCompletionsOptions(List<ChatMessage> messages) {
        ChatCompletionsOptions chatOptions = new ChatCompletionsOptions(messages);
        chatOptions.setMaxTokens(512);
        chatOptions.setTemperature(0.1);
        chatOptions.setTopP(1.0);
        chatOptions.setLogitBias(new HashMap<>());
        chatOptions.setN(1);
        chatOptions.setFrequencyPenalty(0.0);
        chatOptions.setPresencePenalty(0.0);
        return chatOptions;
    }
}
```
@jnunderwood Thank you.
I still can't reproduce it locally with the shared code. We can try deleting and re-downloading the beta.3 artifact in your .m2 repository to see if that helps.
I tried using the sample code below in a new application.
```java
public class ChatService {
    protected OpenAIClient client;

    public ChatService() {
        String azureOpenaiKey = Configuration.getGlobalConfiguration().get("AZURE_OPENAI_KEY");
        String endpoint = Configuration.getGlobalConfiguration().get("AZURE_OPENAI_ENDPOINT");
        this.client = new OpenAIClientBuilder()
                .credential(new AzureKeyCredential(azureOpenaiKey))
                .endpoint(endpoint)
                .buildClient();
    }

    public ChatCompletions chat(List<ChatMessage> messages) {
        ChatCompletionsOptions chatOptions = getChatCompletionsOptions(messages);
        ChatCompletions completions = this.client.getChatCompletions("gpt-35-turbo", chatOptions);
        return completions;
    }

    protected ChatCompletionsOptions getChatCompletionsOptions(List<ChatMessage> messages) {
        ChatCompletionsOptions chatOptions = new ChatCompletionsOptions(messages);
        chatOptions.setMaxTokens(512);
        chatOptions.setTemperature(0.1);
        chatOptions.setTopP(1.0);
        chatOptions.setLogitBias(new HashMap<>());
        chatOptions.setN(1);
        chatOptions.setFrequencyPenalty(0.0);
        chatOptions.setPresencePenalty(0.0);
        return chatOptions;
    }
}

public class Beta3 {
    public static void main(String[] args) {
        ChatService chatService = new ChatService();
        List<ChatMessage> chatMessages = new ArrayList<>();
        chatMessages.add(new ChatMessage(ChatRole.USER, "hello"));

        ChatCompletions chatCompletions = chatService.chat(chatMessages);
        System.out.printf("Model ID=%s is created at %s.%n", chatCompletions.getId(), chatCompletions.getCreatedAt());
        for (ChatChoice choice : chatCompletions.getChoices()) {
            ChatMessage message = choice.getMessage();
            System.out.printf("Index: %d, Chat Role: %s.%n", choice.getIndex(), message.getRole());
            System.out.println("Message:");
            System.out.println(message.getContent());
        }
        System.out.println();
        CompletionsUsage usage = chatCompletions.getUsage();
        System.out.printf("Usage: number of prompt token is %d, "
                + "number of completion token is %d, and number of total tokens in request and response is %d.%n",
                usage.getPromptTokens(), usage.getCompletionTokens(), usage.getTotalTokens());
    }
}
```
I managed to get your code sample running. It still fails in the same way for me, which makes me think that we may be using different models. We are using the following:
Here is an example of using `curl` on the command line to call our Azure OpenAI endpoint:

```shell
curl -s "https://MY_ENDPOINT.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-03-15-preview" \
  -H "Content-Type: application/json" \
  -H "api-key: MY_API_KEY" \
  -d '{"messages":[{"role": "system", "content": "You are a helpful assistant.", "function_call": "none"},{"role": "user", "content": "Hello", "function_call": "none"}]}'
```
This is the response:

```json
{
  "error": {
    "message": "Additional properties are not allowed ('function_call' was unexpected) - 'messages.0'",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}
```
As you can see, the response is the same as when using the beta3 release. It appears that beta3 is adding `function_call` to the parameter list even when it is not being used.
So... it looks like there are two options:

1. We update our model to a version that supports `function_call`
2. You update the Azure OpenAI code to not add the `function_call` parameter when it is not set

I will certainly pursue option 1 on my side, but I'm not in charge of our model. Can you look into option 2? It would make upgrades more seamless.
P.S. Thank you very much for your help so far. It is really appreciated!
Ok, so we were able to update our `gpt-35-turbo` model to version `0613`. However, I'm still getting a (slightly different) error:

```
com.azure.core.exception.HttpResponseException: Status code 400, "{
  "error": {
    "message": "None is not of type 'string' - 'messages.0.name'",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}
"
```

It doesn't seem to matter whether I call `chatOptions.setFunctionCall(FunctionCallConfig.NONE)` or not.
If I call the API using `curl` with the following body data, it works:

```json
{
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "assistant", "content": "Please ask a question.", "function_call": {"name": "none", "arguments": ""}},
    {"role": "user", "content": "Hello."}
  ]
}
```
It looks like the `function_call` parameter is only valid for `assistant` roles, according to an earlier error message: "Invalid parameter: only messages with role 'assistant' can have a function call."
> api-version=2023-03-15-preview

We no longer support Azure OpenAI service API version `2023-03-15-preview`. I would recommend using the latest service version instead; backward compatibility is supported. By default, the client always uses the latest version if none is specified when building the client.
For:

> 2. You can update the Azure OpenAI code to not add the function_call parameter when it is not set

Thanks for the suggestion. I will look into it.
Oops, my bad. I retested using `api-version=2023-07-01-preview` and I still get this error when using our new `gpt-35-turbo` model version `0613`:

```
com.azure.core.exception.HttpResponseException: Status code 400, "{
  "error": {
    "message": "None is not of type 'string' - 'messages.0.name'",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}
"
```
> -d '{"messages":[{"role": "system", "content": "You are a helpful assistant.", "function_call": "none"},{"role": "user", "content": "Hello", "function_call": "none"}]}'

You should not add `"function_call"` to each message.
FWIW, if you are interested in using `function_call`, please check out the sample, which is equivalent to the curl below.

```shell
curl --location --request POST 'https://XXXX.openai.azure.com/openai/deployments/gpt-4-0613/chat/completions?api-version=2023-07-01-preview' \
  --header 'Content-Type: application/json' \
  --header 'api-key: XXXXXX' \
  -d '{"messages":[{"role":"user","content":"What should I wear in Boston depending on the weather?"}],"functions":[{"name":"getCurrentWeather","description":"Get the current weather","parameters":{"type":"object","required":["location","unit"],"properties":{"unit":{"type":"string","enum":["celsius","fahrenheit"]},"location":{"description":"The city and state, e.g. San Francisco, CA","type":"string"}}}}],"function_call":"auto"}'
```
For this suggestion:

> You can update the Azure OpenAI code to not add the function_call parameter when it is not set

I checked the source code and the request, and the SDK does not seem to add the `function_call` parameter to the request when it is not set. Do you see different behavior?
The request I captured using the SDK:

```json
{
  "az.sdk.message": "HTTP request",
  "method": "POST",
  "url": "http://localhost:5000/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-07-01-preview",
  "tryCount": "1",
  "Date": "Tue, 25 Jul 2023 20:39:06 GMT",
  "Content-Length": "291",
  "api-key": "REDACTED",
  "x-recording-upstream-base-uri": "REDACTED",
  "x-recording-mode": "REDACTED",
  "Content-Type": "application/json",
  "x-recording-id": "REDACTED",
  "x-ms-client-request-id": "38ac2b96-e201-418f-bf7b-0cf9e896fc63",
  "accept": "application/json",
  "contentLength": 291,
  "body": "{\"messages\":[{\"role\":\"system\",\"content\":\"You are a helpful assistant. You will talk like a pirate.\"},{\"role\":\"user\",\"content\":\"Can you help me?\"},{\"role\":\"assistant\",\"content\":\"Of course, me hearty! What can I do for ye?\"},{\"role\":\"user\",\"content\":\"What's the best way to train a parrot?\"}]}"
}
```
I also tried with the provided curl (without the `function_call` parameter), and it works.

```shell
curl --location --request POST 'https://XXXX.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-07-01-preview' \
  --header 'Content-Type: application/json' \
  --header 'api-key: AAAAAAAA' \
  -d '{"messages":[{"role": "system", "content": "You are a helpful assistant."},{"role": "user", "content": "Hello"}]}'
```
Curl response:

```json
{"id":"chatcmpl-7gJENiY9rQc7slpthWnYibpf5Yf0v","object":"chat.completion","created":1690318055,"model":"gpt-35-turbo","prompt_annotations":[{"prompt_index":0,"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}}}],"choices":[{"index":0,"finish_reason":"stop","message":{"role":"assistant","content":"Hello! How can I assist you today?"},"content_filter_results":{"hate":{"filtered":false,"severity":"safe"},"self_harm":{"filtered":false,"severity":"safe"},"sexual":{"filtered":false,"severity":"safe"},"violence":{"filtered":false,"severity":"safe"}}}],"usage":{"completion_tokens":9,"prompt_tokens":20,"total_tokens":29}}
```
I've looked over the code using the IDEA debugger. I think this is what is being sent to the Azure OpenAI endpoint:

```json
{"messages":[{"role":"system","content":"You are a personal assistant.","name":null,"function_call":null},{"role":"user","content":"hi","name":null,"function_call":null}],"max_tokens":4096,"temperature":0.75,"top_p":1.0,"logit_bias":{},"user":"JUnderwo","n":1,"stop":null,"presence_penalty":0.0,"frequency_penalty":0.0,"stream":null,"model":null,"functions":null,"function_call":null}
```

As you can see, `function_call` has been added to the JSON body for some reason. I'm not sure where this is happening, but possibly during the serialization of `ChatMessage.java`:

```java
@Generated
@JsonProperty(value = "function_call")
private FunctionCall functionCall;
```
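To make the failure mode concrete: the service rejects messages that carry explicit `name`/`function_call` nulls, while omitting unset fields works. Here is a minimal, self-contained sketch of that difference, using only the JDK (this is not the SDK's real serializer, and `NullFieldJsonDemo`/`toJson` are made-up names for illustration):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class NullFieldJsonDemo {
    // Build a flat JSON object from string fields. When includeNulls is false,
    // unset (null) entries are skipped, which is the payload shape the service
    // accepts; when true, nulls are written out, reproducing the rejected shape.
    static String toJson(Map<String, String> fields, boolean includeNulls) {
        return fields.entrySet().stream()
                .filter(e -> includeNulls || e.getValue() != null)
                .map(e -> "\"" + e.getKey() + "\":"
                        + (e.getValue() == null ? "null" : "\"" + e.getValue() + "\""))
                .collect(Collectors.joining(",", "{", "}"));
    }

    public static void main(String[] args) {
        Map<String, String> message = new LinkedHashMap<>();
        message.put("role", "user");
        message.put("content", "hello");
        message.put("name", null);          // optional field, never set
        message.put("function_call", null); // optional field, never set

        // Shape the service accepts: null-valued fields omitted
        System.out.println(toJson(message, false));
        // Shape observed in the debugger above: nulls serialized explicitly
        System.out.println(toJson(message, true));
    }
}
```

The second printed line matches the `"name":null,"function_call":null` pattern seen in the captured request body, which is what triggers the 400 response.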
Question... How are you capturing the HTTP data being sent to the endpoint? Knowing this may help me debug the issue on my side. Thanks.
Hi, I have been working on something that uses Azure OpenAI and Cognitive Search as part of a Microsoft Hackathon and have been running into the same issue.
Specifically, I am getting this error:

```json
{
  "error": {
    "message": "None is not of type 'string' - 'messages.0.name'",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}
```
I don't get it when just using the `com.azure:azure-ai-openai:1.0.0-beta.3` dependency on its own. But something goes wrong as soon as I add another stable SDK; in my case it was `com.azure:azure-search-documents:11.5.9`.
Interesting observation. I include `com.azure:azure-search-documents:11.6.0-beta.7` in my dependencies, in addition to `com.azure:azure-core:1.41.0` and `com.azure:azure-ai-openai:1.0.0-beta.3`.
> Interesting observation. I include `com.azure:azure-search-documents:11.6.0-beta.7` in my dependencies, in addition to `com.azure:azure-core:1.41.0` and `com.azure:azure-ai-openai:1.0.0-beta.3`.
Possibly a typical Java transitive dependency issue.
@jnunderwood @speza
Thank you for providing this information. I can reproduce it now. After investigating, it turns out there is an incompatibility between our azure-core serialization and `azure-search-documents`'s dependency `azure-core-serializer-json-jackson`.
There is a temporary workaround, which is to exclude `azure-core-serializer-json-jackson` if doing so doesn't impact your application:
```xml
<dependencies>
  <dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-ai-openai</artifactId>
    <version>1.0.0-beta.3</version>
  </dependency>
  <dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-search-documents</artifactId>
    <version>11.6.0-beta.7</version>
    <exclusions>
      <exclusion>
        <groupId>com.azure</groupId>
        <artifactId>azure-core-serializer-json-jackson</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
</dependencies>
```
@jnunderwood You can add the method below to the client builder and set the environment variable `AZURE_LOG_LEVEL` to enable logging:

```java
.httpLogOptions(new HttpLogOptions().setLogLevel(HttpLogDetailLevel.BODY_AND_HEADERS))
```
The work-around seems to be working for me. Thank you!
Just wanted to see if this issue will be fixed soon. The "exclusion" workaround still works, but it would be nice to not have to use it. Thank you.
@mssfang I was facing exactly the same problem. The provided workaround worked for me as well. My code looks like:
```java
OpenAIClient client = clientFactory.getOpenAIClient();
List<ChatRequestMessage> chatMessages = new ArrayList<>();
chatMessages.add(new ChatRequestSystemMessage("You are a helpful assistant. You will talk like a pirate."));
chatMessages.add(new ChatRequestUserMessage("Can you help me?"));
chatMessages.add(new ChatRequestAssistantMessage("Of course, me hearty! What can I do for ye?"));
chatMessages.add(new ChatRequestUserMessage("What's the best way to train a parrot?"));

ChatCompletions chatCompletions = client.getChatCompletions(config.getDeploymentId(), getChatCompletionsOptions(chatMessages));
System.out.printf("Model ID=%s is created at %s.%n", chatCompletions.getId(), chatCompletions.getCreatedAt());
for (ChatChoice choice : chatCompletions.getChoices()) {
    ChatResponseMessage message = choice.getMessage();
    System.out.printf("Index: %d, Chat Role: %s.%n", choice.getIndex(), message.getRole());
    System.out.println("Message:");
    System.out.println(message.getContent());
}
System.out.println();
CompletionsUsage usage = chatCompletions.getUsage();
System.out.printf("Usage: number of prompt token is %d, "
        + "number of completion token is %d, and number of total tokens in request and response is %d.%n",
        usage.getPromptTokens(), usage.getCompletionTokens(), usage.getTotalTokens());
```
Maven dependencies (properties):

```xml
<java.version>17</java.version>
<spring-cloud-azure.version>4.9.0</spring-cloud-azure.version>
<azure-search.version>11.6.0-beta.8</azure-search.version>
<azure-openai.version>1.0.0-beta.6</azure-openai.version>
<semantic-kernel.version>0.2.9-alpha</semantic-kernel.version>
<mockito-inline.version>4.5.1</mockito-inline.version>
```

```xml
<dependency>
  <groupId>com.azure</groupId>
  <artifactId>azure-search-documents</artifactId>
  <version>${azure-search.version}</version>
  <exclusions>
    <exclusion>
      <groupId>com.azure</groupId>
      <artifactId>azure-core-serializer-json-jackson</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```
Hi all,
We have a newer version of the Azure OpenAI SDK, 1.0.0-beta.9, which replaces the `jackson-databind` dependency with the `azure-json` package. It should solve this unexpected-property issue.
```xml
<dependencies>
  <dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-ai-openai</artifactId>
    <version>1.0.0-beta.9</version>
  </dependency>
  <dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-search-documents</artifactId>
    <version>11.6.0</version>
  </dependency>
</dependencies>
```
**Describe the bug**
Calling Azure OpenAI `openAIClient.getChatCompletions()` results in an error related to `function_call`.

**Exception or Stack Trace**

**To Reproduce**
In a Java class...

1. Create a `List<ChatMessage> messages` (or receive them from a remote client)
2. Create a `ChatCompletionsOptions` object
3. Call `getChatCompletions()` - this will throw an error

**Code Snippet**

**Expected behavior**
Calls to `openAIClient.getChatCompletions()` should be successful and return a `ChatCompletions` object.

**Screenshots**
N/A

**Setup:**

**Additional context**
This was working in the beta2 release. It is likely related to the inclusion of the new Functions feature in the beta3 release.

**Information Checklist**