[Closed] sashabaranov closed this issue 1 year ago
Which fields in the `functions` parameter are required, and which are optional? We may need to consider, for example, making a field's type a pointer depending on whether it is optional.
refs: https://openai.com/blog/function-calling-and-other-api-updates
```json
{
  "model": "gpt-3.5-turbo-0613",
  "messages": [
    {
      "role": "user",
      "content": "What is the weather like in Boston?"
    }
  ],
  "functions": [
    {
      "name": "get_current_weather",
      "description": "Get the current weather in a given location",
      "parameters": {
        "type": "object",
        "properties": {
          "location": {
            "type": "string",
            "description": "The city and state, e.g. San Francisco, CA"
          },
          "unit": {
            "type": "string",
            "enum": ["celsius", "fahrenheit"]
          }
        },
        "required": ["location"]
      }
    }
  ]
}
```
The official documentation related to functions
has been updated.
https://platform.openai.com/docs/api-reference/chat/create#chat/create-functions
https://platform.openai.com/docs/guides/gpt/function-calling
"note: the model may generate invalid JSON or hallucinate parameters"
Perhaps we should ensure that we handle this situation properly.
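One way to handle that situation is to decode the model-generated arguments defensively before acting on them. Here is a minimal sketch against the `get_current_weather` example above; `WeatherArgs` and `parseArgs` are hypothetical names, not part of the library.

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// WeatherArgs mirrors the get_current_weather parameters from the example above.
type WeatherArgs struct {
	Location string `json:"location"`
	Unit     string `json:"unit,omitempty"`
}

// parseArgs defensively decodes the model-generated arguments, since the model
// may emit invalid JSON or hallucinate parameters that were never defined.
func parseArgs(raw string) (*WeatherArgs, error) {
	dec := json.NewDecoder(strings.NewReader(raw))
	dec.DisallowUnknownFields() // surface hallucinated parameters as errors
	var args WeatherArgs
	if err := dec.Decode(&args); err != nil {
		return nil, fmt.Errorf("model returned unusable arguments: %w", err)
	}
	if args.Location == "" {
		return nil, fmt.Errorf("required field %q is missing", "location")
	}
	return &args, nil
}

func main() {
	// A hallucinated parameter name is rejected rather than silently dropped.
	if _, err := parseArgs(`{"city": "Boston"}`); err != nil {
		fmt.Println("rejected:", err)
	}
	if args, err := parseArgs(`{"location": "Boston, MA", "unit": "celsius"}`); err == nil {
		fmt.Println("ok:", args.Location)
	}
}
```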
It would be nice to use an external jsonschema library. A challenge with implementing a new one is the range of features to support: for example, JSON Schema supports the 'example' field, and even though it's not shown in the OpenAI docs, I suspect the models would be able to make use of it. That said, there are many competing libraries, and it's probably hard to pick one that everyone would be happy with.
I wonder whether, since this library only needs to properly marshal the request, it warrants a reimplementation of a very limited subset of JSON Schema features: no validation, no reflection, etc., just defining the spec and asserting that inputs conform to it. Another option would be to treat the function definition as a json.RawMessage and make the user explicitly responsible for encoding it.
WRT handling JSON: I wrote an opinionated implementation of generating JSON schemas via reflection and parsing/validating them from strings; see https://github.com/stillmatic/gollum/blob/main/functions_test.go#L133. The implementation itself is quite terse (~50 lines) but makes a fair number of assumptions (e.g. that you have a FunctionInput struct for each function). IMO this logic is very useful but makes too many assumptions for a base library. The test file also shows an implementation of the extended ChatCompletionRequest interface, where I embedded the existing one and just added the new fields. There does not need to be many changes to the core library to support functions; basically none of the existing logic needs to change, just the I/O formats and enums, IMO.
> There does not need to be many changes to the core library to support functions, basically none of the existing logic needs to be changed, just updating the I/O formats and enums IMO
I agree with this. It can be quite simple, leaving the details up to the end user. At the very minimum we could have the JSON schema input be a []byte that the user prepares elsewhere, using whichever library they prefer.
The returned function call could be exposed as a simple interface{} or []byte, leaving it up to the user to unmarshal the JSON themselves and decide how to handle any related errors.
I saw that there was a new release (v1.11.0) with function calls implemented. However, they are not accessible in streaming responses. I couldn't find official OpenAI docs on this, but this seems to address the issue. I think this is an easy addition: as far as I can tell, a *FunctionCall field just needs to be added to ChatCompletionStreamChoiceDelta.
This PR https://github.com/sashabaranov/go-openai/pull/373 feels like a good example of why implementing JSONSchema is tricky. Using json.RawMessage in the call should work, and will be more resilient to all the little edge cases.
I have a tested implementation here, which shows that you can successfully call OpenAI with the bytes array: https://github.com/stillmatic/gollum/commit/b10c270cc853054e0d8172ed1bd94c548f343b63/
@stillmatic @jmacwhyte @sashabaranov Congrats! v1.11.3 has been released with #377. I've learned a lot from our discussions. Thank you!
ChatCompletionRequest.FunctionCall
Per the OpenAI docs, function_call is a string or an object (Optional).
In the second case, we can't pass an object like:
FunctionCall: `{"name": "extracted_offers_data"}`,
since a string-typed field would marshal it into an escaped JSON string instead of a JSON object.
OpenAI returns:
error, status code: 400, message: '$.function_call' is invalid. Please check the API reference: https://platform.openai.com/docs/api-reference.
v1.11.3
Seems to work!
These features have already been released, so I'm closing this issue.
See updates at: https://openai.com/blog/function-calling-and-other-api-updates
Off the top of my head: the `*-0613` models (e.g. `gpt-4-0613`).