I am also having this error.
@felipebutcher @chadananda I'm not affiliated with this project, so take this with a grain of salt, but you need to pass the correct "model" argument. For example, "gpt-3.5-turbo" doesn't support functions, but "gpt-3.5-turbo-0613" does.
I was able to get this to work using gpt-3.5-turbo-0613. But I ran into another requirement that may be mentioned in the OpenAI docs, though I didn't see it: the JSON schema you pass must be an object at the root.
This won't work:
```ts
const schema = {
  type: "array",
  items: {
    type: "object",
    properties: {
      title: { type: "string" },
      description: { type: "string" },
    },
    required: ["title", "description"],
  },
};
```
But this will work:
```ts
const schema = {
  type: "object",
  properties: {
    nodes: {
      type: "array",
      items: {
        type: "object",
        properties: {
          title: { type: "string" },
          description: { type: "string" },
        },
        required: ["title", "description"],
      },
    },
  },
  required: ["nodes"],
};
```
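For illustration, here's a rough sketch of how that object-rooted schema can be wired into a function definition and the array unwrapped from the reply. This assumes the openai@3.x client used elsewhere in this thread, and "list_nodes" is just a placeholder name:

```ts
import { Configuration, OpenAIApi } from "openai";

const openai = new OpenAIApi(new Configuration({ apiKey: process.env.OPENAI_KEY }));

const response = await openai.createChatCompletion({
  model: "gpt-3.5-turbo-0613",
  messages: [
    { role: "user", content: "List three project ideas with a title and description each." },
  ],
  // "list_nodes" is a placeholder; "schema" is the object-rooted schema above.
  functions: [{ name: "list_nodes", description: "Return a list of items", parameters: schema }],
  // Force the model to call this function instead of replying in prose.
  function_call: { name: "list_nodes" },
});

// The arguments come back as a JSON string shaped like the schema; unwrap the array.
const rawArgs = response.data.choices[0].message?.function_call?.arguments ?? "{}";
const { nodes } = JSON.parse(rawArgs);
console.log(nodes);
```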
Oops, this was a server response: function calling is presently supported by only two models. When I switched to gpt-3.5-turbo-0613 it worked fine.
Could we have a complete example?
@etiennea Here you go, this is the example from their announcement page in TypeScript...
```ts
import { Configuration, OpenAIApi } from "openai";

const openai = new OpenAIApi(new Configuration({
  apiKey: process.env.OPENAI_KEY // <-- YOUR KEY
}));

const gptResponse = await openai.createChatCompletion({
  model: "gpt-3.5-turbo-0613",
  messages: [
    {
      role: "user",
      content: "What is the weather like in Boston?"
    }
  ],
  functions: [
    {
      name: "get_current_weather",
      description: "Get the current weather in a given location",
      parameters: {
        type: "object",
        properties: {
          location: {
            type: "string",
            description: "The city and state, e.g. San Francisco, CA"
          },
          unit: {
            type: "string",
            enum: ["celsius", "fahrenheit"]
          }
        },
        required: ["location"]
      }
    }
  ]
});

// finish_reason should be "function_call"
console.log(gptResponse.data.choices[0].finish_reason);
```
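Continuing from gptResponse above, the function name and its JSON-encoded arguments can then be read off the message, roughly like this (field optionality may vary slightly by typings version):

```ts
const message = gptResponse.data.choices[0].message;

if (message?.function_call) {
  // "arguments" is a JSON string matching the "parameters" schema declared above.
  const { name, arguments: rawArgs } = message.function_call;
  const args = JSON.parse(rawArgs ?? "{}");
  console.log(name, args.location, args.unit);
}
```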
Note the model version, as others have said.
I just tested and it looks like it's already available on gpt-4-0613 as well.
It's working for me when the function has parameters, but when I try to send a function without parameters it is not detected.
user_prompt: "Call Santa please"

Function:

```ts
{
  name: 'call_santa_claus',
  description: "This function calls Santa Claus, Don't worry we have the number"
}
```
Also tried with parameters: null
```ts
{
  name: 'call_santa_claus',
  description: "This function calls Santa Claus, Don't worry we have the number",
  parameters: null
}
```
The function call_santa_claus is never called; OpenAI does not tell me to call it. Any ideas?
You do not need to have an actual function, i.e. "function name() { return ... }". However, you do need to provide all the other fields, so "pretend" there is a real function: still supply a function name even though nothing is implemented behind it.
I solved my case like this:

```ts
{
  name: 'call_santa_claus',
  description: "This function calls Santa Claus, Don't worry we have the number",
  parameters: { type: 'object', properties: {} }
}
```
You have to send parameters even when you don't have any. That worked for me.
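For the Node client, the same workaround would look roughly like this (an untested sketch; the prompt and function are just the ones from the comments above):

```ts
import { Configuration, OpenAIApi } from "openai";

const openai = new OpenAIApi(new Configuration({ apiKey: process.env.OPENAI_KEY }));

const res = await openai.createChatCompletion({
  model: "gpt-3.5-turbo-0613",
  messages: [{ role: "user", content: "Call Santa please" }],
  functions: [
    {
      name: "call_santa_claus",
      description: "This function calls Santa Claus, don't worry we have the number",
      // No real inputs, but an empty object schema must still be supplied.
      parameters: { type: "object", properties: {} },
    },
  ],
});

// Should print "call_santa_claus" when the model decides to call the function.
console.log(res.data.choices[0].message?.function_call?.name);
```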
Do you know when this will be fixed on Azure OpenAI services/models? Looks like all 0613 models have this issue and return the same error message.
@remixtedi can you provide sample code for a repro? Are you hitting /engines/gpt-3.5-turbo-0613 or equivalent?
@rattrayalex I'm using the Azure OpenAI service with the Betalgo.OpenAI package. Here is sample code that can be used to test:
```csharp
using OpenAI;
using OpenAI.Managers;
using OpenAI.ObjectModels.RequestModels;

var openAIClient = new OpenAIService(new OpenAiOptions()
{
    ApiKey = "YOUR API KEY",
    DeploymentId = "YOUR DEPLOYMENT ID",
    ResourceName = "YOUR RESOURCE NAME",
    ProviderType = ProviderType.Azure,
    ApiVersion = "2023-06-01-preview",
    DefaultModelId = "gpt-35-turbo-0613"
});

var func = new FunctionDefinitionBuilder("generate_brand_color", "Generates a brand color based on the brand name");
func.AddParameter("HexCode", "string", "Hex code of the brand color");

var chatCompletionCreateRequest = new ChatCompletionCreateRequest
{
    Messages = new ChatMessage[]
    {
        ChatMessage.FromSystem("You are helping a customer to generate brand color"),
        ChatMessage.FromUser("I need a brand color for my brand, brand name is Shambala and it is a clothing brand")
    },
    Model = "gpt-35-turbo-0613",
    Temperature = 0.9f,
    MaxTokens = 150,
    TopP = 1,
    FrequencyPenalty = 0,
    PresencePenalty = 0,
    FunctionCall = new Dictionary<string, string> { { "name", "generate_brand_color" } },
    Functions = new List<FunctionDefinition> { func.Build() }
};

var chatCompletion = await openAIClient.ChatCompletion.CreateCompletion(chatCompletionCreateRequest);
Console.WriteLine(chatCompletion.Choices.First().Message);
```
@remixtedi Azure OpenAI does not yet support functions, see the blog post here: https://techcommunity.microsoft.com/t5/ai-cognitive-services-blog/introducing-new-and-updated-models-to-azure-openai-service/ba-p/3860351
Function calling will be coming soon and will only be available for the new model versions announced today.
Describe the bug
I'm trying to use the functions parameter to help force a JSON response. Is this not supported?
To Reproduce
Call createChatCompletion with a 'functions' array.
Code snippets
Results in the error: