Closed: hansjm10 closed this pull request 9 months ago
This looks pretty promising! Thank you for sharing.
I'm just trying it out and found something that looks like a typo to me:
As I'm only adding a single function, I would expect the class to be named "Function". Or am I missing the point of the class?
Either way, thanks again for adding this. I hope this gets merged here, as Microsoft does not support function calls in their new OpenAI package as of right now.
What about chat endpoint requests? And describing function parameters as a JObject is not a good idea.
What about chat endpoint requests?
I figured chat endpoint requests can come later, as this PR largely adds the backend for function support. I can add an implementation now if it's needed, though.
describing function parameters as a JObject is not a good idea.
Do you have any suggestions? I don't like it very much either. The model expects some sort of JSON object and fails if you send JSON serialized as a string.
I was thinking about a way to pass a type instead of the parameters; the Function would take the type information from that type and generate a JSON Schema from it. I saw that Newtonsoft already has classes to generate a JSON Schema but did not have the time to look into it. The nice thing about passing a type is that you could later use that same type to deserialize the function call from ChatGPT. This is not really thought through at the moment, just some brainstorming I did a few hours ago.
I could not do more testing at the moment, but this seems to work fine, at least up to serialization:
Function.cs
using Newtonsoft.Json.Schema;
using Newtonsoft.Json.Schema.Generation;

//...

[JsonProperty("parameters", Required = Required.Default)]
public JSchema Parameters { get; set; }

public Function(string name, string description, Type type)
{
    this.Name = name;
    this.Description = description;
    this.Parameters = new JSchemaGenerator().Generate(type); // <--
}
Test
private class TestClass
{
    public string TestString { get; set; }
    public int TestInt { get; set; }
    public bool TestBool { get; set; }
}

[Test]
public void TestFunctionWithSchema()
{
    var fn = new Function("s", "d", typeof(TestClass));
    var str = JsonConvert.SerializeObject(fn);
}
Unfortunately, it looks like Newtonsoft's JSON Schema package is licensed under AGPL and only supports up to 1,000 requests per hour without a purchased license. However, the idea of easily passing in the parameters schema using a Type is worth exploring. I previously used NJsonSchema for this but didn't like the idea of including outside packages that weren't already being used in the project.
The benefit of using a JObject is that we can easily convert other formats to it. The downside is that there isn't an easy way to natively generate the schema. I thought about including a basic schema generator but felt it was outside the scope of the PR and the project as a whole.
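As a quick illustration of that flexibility (an untested sketch; the weather-style schema content here is made up for the example), the same JObject can be built from a plain object graph or from a raw JSON string:

```csharp
using System;
using System.Collections.Generic;
using Newtonsoft.Json.Linq;

class Demo
{
    static void Main()
    {
        // From a dictionary/anonymous-object graph.
        var fromObject = JObject.FromObject(new
        {
            type = "object",
            properties = new Dictionary<string, object>
            {
                ["location"] = new { type = "string", description = "City name" }
            },
            required = new[] { "location" }
        });

        // From a raw JSON string.
        var fromString = JObject.Parse(
            @"{ ""type"": ""object"",
                ""properties"": { ""location"": { ""type"": ""string"", ""description"": ""City name"" } },
                ""required"": [ ""location"" ] }");

        // Both routes produce structurally identical JObjects.
        Console.WriteLine(JToken.DeepEquals(fromObject, fromString));
    }
}
```

Because everything funnels into a JObject, the library stays agnostic about which schema generator (if any) the caller used upstream.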
I agree on keeping external packages to a minimum. With some AI support I came up with a simple JSON Schema generator that might help here. I just did a first test with it, but as it's getting pretty late here I can't really concentrate very well :D
I moved the code to a gist so I don't blow up the discussion here: https://gist.github.com/tkoenig89/e35d6ffc2979746476893fe00234ab30
If this looks like a way to go, I'm glad to provide a cleaned-up version with a bunch of tests. After some sleep :)
Here is my solution:
public interface IJsonSchema
{
    string Type { get; }
    string? Description { get; set; }
}

public class JsonObjectSchema : IJsonSchema
{
    [JsonPropertyName("type")]
    public string Type { get; } = "object";

    [JsonPropertyName("description")]
    public string? Description { get; set; }

    [JsonPropertyName("properties")]
    public required Dictionary<string, IJsonSchema> Properties { get; set; }

    [JsonPropertyName("required")]
    public List<string>? Required { get; set; }
}

public class JsonArraySchema : IJsonSchema
{
    [JsonPropertyName("type")]
    public string Type { get; } = "array";

    [JsonPropertyName("description")]
    public string? Description { get; set; }

    [JsonPropertyName("items")]
    public required IJsonSchema Items { get; set; }

    [JsonPropertyName("minItems")]
    public int? MinItems { get; set; }

    [JsonPropertyName("maxItems")]
    public int? MaxItems { get; set; }

    [JsonPropertyName("uniqueItems")]
    public bool? UniqueItems { get; set; }
}

public class JsonStringSchema : IJsonSchema
{
    [JsonPropertyName("type")]
    public string Type { get; } = "string";

    [JsonPropertyName("description")]
    public string? Description { get; set; }

    [JsonPropertyName("enum")]
    public List<string>? Enum { get; set; }

    [JsonPropertyName("pattern")]
    public string? Pattern { get; set; }

    [JsonPropertyName("minLength")]
    public int? MinLength { get; set; }

    [JsonPropertyName("maxLength")]
    public int? MaxLength { get; set; }
}

public class JsonNumberSchema : IJsonSchema
{
    [JsonPropertyName("type")]
    public string Type { get; } = "number";

    [JsonPropertyName("description")]
    public string? Description { get; set; }

    [JsonPropertyName("minimum")]
    public double? Minimum { get; set; }

    [JsonPropertyName("maximum")]
    public double? Maximum { get; set; }

    [JsonPropertyName("multipleOf")]
    public double? MultipleOf { get; set; }

    [JsonPropertyName("exclusiveMaximum")]
    public double? ExclusiveMaximum { get; set; }

    [JsonPropertyName("exclusiveMinimum")]
    public double? ExclusiveMinimum { get; set; }
}

public class JsonBooleanSchema : IJsonSchema
{
    [JsonPropertyName("type")]
    public string Type { get; } = "boolean";

    [JsonPropertyName("description")]
    public string? Description { get; set; }
}

public class JsonSchemaConverter : JsonConverter<IJsonSchema>
{
    public override IJsonSchema? Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        throw new NotImplementedException();
    }

    public override void Write(Utf8JsonWriter writer, IJsonSchema value, JsonSerializerOptions options)
    {
        JsonSerializer.Serialize(writer, value, value.GetType(), options);
    }
}
Usage:
var mySchema = new JsonObjectSchema
{
    Properties = new Dictionary<string, IJsonSchema>
    {
        ["name"] = new JsonStringSchema
        {
            Description = "The name of the person",
            Enum = new List<string> { "Alice", "Bob", "Charlie" },
            MinLength = 1,
            MaxLength = 100
        },
        ["age"] = new JsonNumberSchema
        {
            Description = "The age of the person",
            Minimum = 0,
            Maximum = 150,
            MultipleOf = 1
        },
        ["hobbies"] = new JsonArraySchema
        {
            Items = new JsonStringSchema()
            {
                MaxLength = 30
            },
            MinItems = 0,
            MaxItems = 100,
            UniqueItems = true
        },
        ["is_married"] = new JsonBooleanSchema
        {
            Description = "Marital status of the person"
        }
    },
    Required = new List<string> { "name", "age" }
};

var options = new JsonSerializerOptions
{
    WriteIndented = true,
    DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull
};
options.Converters.Add(new JsonSchemaConverter());
var jsonString = JsonSerializer.Serialize(mySchema, options);
Console.WriteLine(jsonString);
Output:
{
  "type": "object",
  "properties": {
    "name": {
      "type": "string",
      "description": "The name of the person",
      "enum": [
        "Alice",
        "Bob",
        "Charlie"
      ],
      "minLength": 1,
      "maxLength": 100
    },
    "age": {
      "type": "number",
      "description": "The age of the person",
      "minimum": 0,
      "maximum": 150,
      "multipleOf": 1
    },
    "hobbies": {
      "type": "array",
      "items": {
        "type": "string",
        "maxLength": 30
      },
      "minItems": 0,
      "maxItems": 100,
      "uniqueItems": true
    },
    "is_married": {
      "type": "boolean",
      "description": "Marital status of the person"
    }
  },
  "required": [
    "name",
    "age"
  ]
}
It can serialize but not deserialize; deserialization is not required here, though.
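If deserialization ever became necessary, the Read side could plausibly dispatch on the "type" discriminator. This is an untested sketch against the classes above:

```csharp
public override IJsonSchema? Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
{
    using var doc = JsonDocument.ParseValue(ref reader);
    var json = doc.RootElement.GetRawText();
    var type = doc.RootElement.GetProperty("type").GetString();

    // Deserializing the concrete classes directly means the converter is not
    // re-entered at the top level; nested IJsonSchema values recurse through it.
    return type switch
    {
        "object"  => JsonSerializer.Deserialize<JsonObjectSchema>(json, options),
        "array"   => JsonSerializer.Deserialize<JsonArraySchema>(json, options),
        "string"  => JsonSerializer.Deserialize<JsonStringSchema>(json, options),
        "number"  => JsonSerializer.Deserialize<JsonNumberSchema>(json, options),
        "boolean" => JsonSerializer.Deserialize<JsonBooleanSchema>(json, options),
        _ => throw new JsonException($"Unknown schema type '{type}'")
    };
}
```

The get-only Type properties are skipped on read, so round-tripping would only restore the writable members.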
I have done some adjustments to the parameters type and setter:
[JsonProperty("parameters", Required = Required.Default)]
public object Parameters
{
    get
    {
        return _parameters;
    }
    set
    {
        try
        {
            if (value is string jsonStringValue)
            {
                _parameters = JObject.Parse(jsonStringValue);
            }
            else if (value is JObject jObjectValue)
            {
                _parameters = jObjectValue;
            }
            else
            {
                var settings = new JsonSerializerSettings
                {
                    NullValueHandling = NullValueHandling.Ignore
                };
                var jsonString = JsonConvert.SerializeObject(value, settings);
                _parameters = JObject.Parse(jsonString);
            }
        }
        catch (JsonException e)
        {
            throw new ArgumentException("Could not convert the provided object into a JSON object. Make sure that the object is serializable and its structure matches the required schema.", e);
        }
    }
}
This avoids implementing our own JSON Schema, as users can provide their own, which is arguably going to be better than anything we would need to implement and support in the future. If the user wants to use a schema library like Newtonsoft or NJsonSchema, or their own implementation, it can handle it; if the user passes in a Dictionary<string, object>, it can handle it; and if the user passes in a string formatted as JSON, it can handle it.
@ClusterM I tested this using your implementation (adjusting it slightly to use Newtonsoft instead of System.Text.Json) and the resulting JsonObjectSchema and Parameters were identical.
If the user doesn't provide the needed schema, they will receive a 400 error from the OpenAI endpoint.
Do we have any objections to this handling?
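For illustration, a hypothetical call site for the object-typed Parameters setter (the get_current_weather schema follows OpenAI's docs; the exact Function constructor shape is an assumption):

```csharp
// Hypothetical usage: any of these inputs ends up as the same JObject internally.
var schema = new Dictionary<string, object>
{
    ["type"] = "object",
    ["properties"] = new Dictionary<string, object>
    {
        ["location"] = new { type = "string", description = "The city, e.g. Boston, MA" },
        ["unit"] = new { type = "string", @enum = new[] { "celsius", "fahrenheit" } }
    },
    ["required"] = new[] { "location" }
};

var fn = new Function("get_current_weather", "Get the current weather", schema);

// A raw JSON string is accepted as well:
fn.Parameters = @"{ ""type"": ""object"", ""properties"": { } }";

// And so is a JObject built by any schema library:
fn.Parameters = JObject.Parse(@"{ ""type"": ""object"", ""properties"": { } }");
```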
Sounds reasonable to me. Using object instead of JObject makes it more accessible and less dependent on Newtonsoft for code consuming this package.
@hansjm10 you might want to use JObject.FromObject(value) instead of serializing to a string and parsing it as a JObject. At least this seemed to be the simplest solution when I tested the original version of the PR.
I can't get this PR to work when streaming the chat completion. Anyone else having issues with this, or am I just doing something wrong?
@ErikDombi I can't get this PR to work when streaming the chat completion. Anyone else having issues with this, or am I just doing something wrong?
Mind sending what you have tried and your response from OpenAI? I am able to use
IAsyncEnumerable<string> asyncEnumerable = conversation.StreamResponseEnumerableFromChatbotAsync();
await foreach (var res in asyncEnumerable)
{
    Console.WriteLine(res);
}
and the returned streamed results are correct
Here is an updated test case that follows the workflow in the example code from OpenAI. See the complete Python example at https://platform.openai.com/docs/guides/gpt/function-calling
public async Task SummarizeFunctionResult()
{
    try
    {
        var api = new OpenAI_API.OpenAIAPI();
        var functionList = new List<Function>
        {
            BuildFunctionForTest()
        };
        var conversation = api.Chat.CreateConversation(new ChatRequest
        {
            Model = Model.ChatGPTTurbo0613,
            Functions = functionList
        });

        conversation.AppendUserInput("What is the weather like in Boston?");
        var response = await conversation.GetResponseFromChatbotAsync();
        // A function call was triggered, so no text content is returned.
        Assert.IsNull(response);

        var functionMessage = new ChatMessage
        {
            Role = ChatMessageRole.Function,
            Name = "get_current_weather",
            Content = "{\"temperature\": \"22\", \"unit\": \"celsius\", \"description\": \"sunny\"}"
        };
        conversation.AppendMessage(functionMessage);
        response = await conversation.GetResponseFromChatbotAsync();
        Assert.AreEqual("The current weather in Boston is sunny with a temperature of 22 degrees Celsius.", response);
    }
    catch (NullReferenceException ex)
    {
        Console.WriteLine($"{ex.Message}\n{ex.StackTrace}");
        Assert.Fail();
    }
}
Is there any example of how to extract the function name and arguments from GetResponseFromChatbotAsync?
conversation.AppendUserInput("What is the weather like in Boston?");
returns null.
Is there any example of how to extract the function name and arguments from GetResponseFromChatbotAsync?
conversation.AppendUserInput("What is the weather like in Boston?");
returns null.
The newest commit adds new chat endpoints. You can use
var response = await conversation.GetFunction_CallResponseAsync()
and access function names and arguments like
var name = response.Name;
var arguments = response.Arguments;
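To go one step further, the Arguments JSON (a string in the OpenAI response) can be deserialized into a matching type; WeatherArgs here is a made-up example type, not part of the PR:

```csharp
// Hypothetical: map the function-call arguments onto a typed object.
class WeatherArgs
{
    [JsonProperty("location")]
    public string Location { get; set; }

    [JsonProperty("unit")]
    public string Unit { get; set; }
}

var call = await conversation.GetFunction_CallResponseAsync();
if (call != null)
{
    var args = JsonConvert.DeserializeObject<WeatherArgs>(call.Arguments);
    Console.WriteLine($"{call.Name} -> {args.Location} ({args.Unit})");
}
```

This echoes the earlier idea in the thread of reusing one type both to generate the schema and to deserialize the call.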
Is there any example of how to extract the function name and arguments from GetResponseFromChatbotAsync?
conversation.AppendUserInput("What is the weather like in Boston?");
returns null. The newest commit adds new chat endpoints. You can use
var response = await GetFunction_CallResponseAsync()
and access function names and arguments like
var name = response.Name; var arguments = response.Arguments;
Thanks for the quick reply.
Open cases:
Example: User: Hello. Bot: Hi, I'm a bot and your assistant. User: What's the weather in Boston? Bot: $function$
CallResponseAsync should return either a message or a function, since you don't always know whether a function needs to be called.
I'm going to take some time later to think about this situation and see how the official documentation handles it. FWIW, you can easily check whether the response is null for a function_call if you're expecting it.
var response = await conversation.GetFunction_CallResponseAsync();

// The user input didn't trigger a function call.
if (response == null)
{
    Console.WriteLine(conversation.Messages.Last().Content);
}
As for streaming, a function is a bit of an odd fit since it is supposed to be for internal use, for accessing another API. If you could come up with a use case, I can look into it more. That being said, I'm not too familiar with the streaming side, if somebody else wants to look into it too and see if there is a good solution.
CallResponseAsync should return either a message or a function, since you don't always know whether a function needs to be called.
I'm going to take some time later to think about this situation and see how the official documentation handles it. FWIW, you can easily check whether the response is null for a function_call if you're expecting it.
var response = await conversation.GetFunction_CallResponseAsync(); // The user input didn't trigger a function call. if (response == null) { Console.WriteLine(conversation.Messages.Last().Content); }
As for streaming, a function is a bit of an odd fit since it is supposed to be for internal use, for accessing another API. If you could come up with a use case, I can look into it more. That being said, I'm not too familiar with the streaming side, if somebody else wants to look into it too and see if there is a good solution.
I did some hacks on my side but would like to see a production-ready PR. Streaming is also working on my side (with hacks).
Use case (chatbot) with streaming:
In both cases there should be a single CallResponseAsync method which returns a function or a message, because you don't know in advance whether a function will be called.
In both cases there should be a single CallResponseAsync method which returns a function or a message, because you don't know in advance whether a function will be called.
These are two different return types. If you are expecting the possibility of a function, you should be checking whether the resulting response is a function anyway. If you want more control over this, you can use CreateChatCompletionAsync, which returns the ChatResult. This might be what you are looking for.
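For example, a caller that needs both cases could branch on the finish reason of the full ChatResult. This is an untested sketch; the Message.FunctionCall member names are assumptions based on the OpenAI response shape, not confirmed from the PR:

```csharp
// Hypothetical sketch: branching on the finish reason via CreateChatCompletionAsync.
var result = await api.Chat.CreateChatCompletionAsync(request);
var choice = result.Choices[0];

if (choice.FinishReason == "function_call")
{
    // The model wants a function executed; inspect the call.
    var call = choice.Message.FunctionCall;
    Console.WriteLine($"Call {call.Name} with {call.Arguments}");
}
else
{
    // Plain assistant message.
    Console.WriteLine(choice.Message.Content);
}
```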
@hansjm10 Can you also add the following method to Conversation.cs? I think this would make the solution more complete.
/// <summary>
/// Creates and appends a <see cref="ChatMessage"/> to the chat history with the Role of <see cref="ChatMessageRole.Function"/>. The function message is a response to a request from the system for output from a predefined function.
/// </summary>
/// <param name="functionName">The name of the function for which the content has been generated as the result</param>
/// <param name="content">The text content (usually JSON)</param>
public void AppendFunctionMessage(string functionName, string content) => AppendMessage(new ChatMessage(ChatMessageRole.Function, content) { Name = functionName });
Function_Call
violates almost all naming rules in C#; why mix _ with PascalCase?
Some feedback on usability of the functions:
I'm not a fan of "GetResponseFromChatbotAsync" returning null when the result has choices. I would like to see another routine added, "GetResultFromChatbotAsync", that returns the result, since the likelihood of using functions will only increase over time, and using the first message of Choice[0] feels hinky, requiring a trip back to the Conversation object to access "MostRecentApiResult".
public async Task<ChatResult> GetResultFromChatbotAsync()
{
    ChatRequest req = new ChatRequest(RequestParameters);
    req.Messages = _Messages.ToList();
    var res = await _endpoint.CreateChatCompletionAsync(req);
    MostRecentApiResult = res;
    return res;
}
Overall, I hope this gets merged sooner rather than later, since it is functional.
Not a fan of "GetResponseFromChatbotAsync" returning null when the action has choices.
@logikonline I hear what you're saying, although there is already a method, CreateChatCompletionAsync, which gives you full access to the response. GetResponseFromChatbotAsync is more of a convenience function for implementing strictly basic chat functionality. A more advanced use case requiring access to functions makes that just another approach for the same functionality.
@brondavies I am not hung up so much on the name, only the functionality; the change is fine by me.
I prefer "GetResultFromChatbotAsync" only because it stays in line with the workflow of the Conversation object.
I would maybe add a convenience enum to the ChatResult called "NextAction", which makes the flow even more developer-friendly.
public enum Action { Stop, EOS, Function, Multiple, Unknown }

public class ChatResult : ApiResultBase
{
    ...
    public Action NextAction { get; set; }
    ...
}

public async Task<ChatResult> GetResultFromChatbotAsync()
{
    ChatRequest req = new ChatRequest(RequestParameters);
    req.Messages = _Messages.ToList();
    var res = await _endpoint.CreateChatCompletionAsync(req);
    MostRecentApiResult = res;
    if (res != null)
    {
        if (res.Choices.Count == 1)
        {
            switch (res.Choices[0].FinishReason.ToLower())
            {
                case "stop":
                    res.NextAction = Action.Stop;
                    break;
                case "eos":
                    res.NextAction = Action.EOS;
                    break;
                case "function_call":
                    res.NextAction = Action.Function;
                    break;
                default:
                    res.NextAction = Action.Unknown;
                    break;
            }
        }
        else if (res.Choices.Count > 1)
        {
            res.NextAction = Action.Multiple;
        }
        else
        {
            res.NextAction = Action.Unknown;
        }
    }
    return res;
}
Then the "Multiple" can allow for different workflows in code but this gives the developer easy segmenting. Otherwise I will subclass and do this myself.
@OkGoDoIt I think this is ready to merge
@OkGoDoIt ^
I got this error running in Unity:
Sender:System.Threading.Tasks.Task`1[System.Threading.Tasks.VoidTaskResult] Exception:System.AggregateException: A Task's exception(s) were not observed either by Waiting on the Task or accessing its Exception property. As a result, the unobserved exception was rethrown by the finalizer thread. (Error creating 'OpenAI_API.ChatFunctions.FunctionCallConverter'.) ---> Newtonsoft.Json.JsonException: Error creating 'OpenAI_API.ChatFunctions.FunctionCallConverter'. ---> Newtonsoft.Json.JsonException: No parameterless constructor defined for 'OpenAI_API.ChatFunctions.FunctionCallConverter'.
at Newtonsoft.Json.Serialization.JsonTypeReflector+<>c__DisplayClass22_0.<GetCreator>b__0 (System.Object[] parameters) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.Serialization.JsonTypeReflector.GetJsonConverter (System.Object attributeProvider) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.Serialization.DefaultContractResolver.SetPropertySettingsFromAttributes (Newtonsoft.Json.Serialization.JsonProperty property, System.Object attributeProvider, System.String name, System.Type declaringType, Newtonsoft.Json.MemberSerialization memberSerialization, System.Boolean& allowNonPublicAccess) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.Serialization.DefaultContractResolver.CreateProperty (System.Reflection.MemberInfo member, Newtonsoft.Json.MemberSerialization memberSerialization) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.Serialization.DefaultContractResolver.CreateProperties (System.Type type, Newtonsoft.Json.MemberSerialization memberSerialization) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.Serialization.DefaultContractResolver.CreateObjectContract (System.Type objectType) [0x00000] in <00000000000000000000000000000000>:0
at System.Collections.Concurrent.ConcurrentDictionary`2[TKey,TValue].GetOrAdd (TKey key, System.Func`2[T,TResult] valueFactory) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.Serialize (Newtonsoft.Json.JsonWriter jsonWriter, System.Object value, System.Type objectType) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.JsonSerializer.SerializeInternal (Newtonsoft.Json.JsonWriter jsonWriter, System.Object value, System.Type objectType) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.JsonConvert.SerializeObjectInternal (System.Object value, System.Type type, Newtonsoft.Json.JsonSerializer jsonSerializer) [0x00000] in <00000000000000000000000000000000>:0
at OpenAI_API.EndpointBase.HttpRequestRaw (System.String url, System.Net.Http.HttpMethod verb, System.Object postData, System.Boolean streaming) [0x00000] in <00000000000000000000000000000000>:0
at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1[TResult].Start[TStateMachine] (TStateMachine& stateMachine) [0x00000] in <00000000000000000000000000000000>:0
at OpenAI_API.EndpointBase.HttpRequestRaw (System.String url, System.Net.Http.HttpMethod verb, System.Object postData, System.Boolean streaming) [0x00000] in <00000000000000000000000000000000>:0
at OpenAI_API.EndpointBase+<HttpStreamingRequest>d__16`1[T].MoveNext () [0x00000] in <00000000000000000000000000000000>:0
at System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[TStateMachine] (TStateMachine& stateMachine) [0x00000] in <00000000000000000000000000000000>:0
at OpenAI_API.EndpointBase+<HttpStreamingRequest>d__16`1[T].System.Collections.Generic.IAsyncEnumerator<T>.MoveNextAsync () [0x00000] in <00000000000000000000000000000000>:0
at OpenAI_API.Chat.ChatEndpoint.StreamCompletionAsync (OpenAI_API.Chat.ChatRequest request, System.Action`2[T1,T2] resultHandler, System.Threading.CancellationToken cancellationToken) [0x00000] in <00000000000000000000000000000000>:0
at System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[TStateMachine] (TStateMachine& stateMachine) [0x00000] in <00000000000000000000000000000000>:0
at OpenAI_API.Chat.ChatEndpoint.StreamCompletionAsync (OpenAI_API.Chat.ChatRequest request, System.Action`2[T1,T2] resultHandler, System.Threading.CancellationToken cancellationToken) [0x00000] in <00000000000000000000000000000000>:0
at Xrai.Util.Gpt.ExecuteGptQuery (System.String prompt, System.Action`2[T1,T2] resultHandler, System.Boolean ephemeral, System.String system_message, System.Threading.CancellationToken cancellationToken) [0x00000] in <00000000000000000000000000000000>:0
at CaptionHUD.FixedUpdate () [0x00000] in <00000000000000000000000000000000>:0
--- End of inner exception stack trace ---
StreamChatEnumerableAsync works in a strange way with functions. Is there any working example?
This is the only way I found to use it:
var conversation = api.Chat.CreateConversation(new ChatRequest
{
    Model = Model.GPT4_0613,
    Functions = functionsList,
    Temperature = 0.10,
    Messages = messages
});

try
{
    string currentFunction = "";
    var functionCalls = new Dictionary<string, string>();
    await foreach (var result in api.Chat.StreamChatEnumerableAsync(conversation.RequestParameters).ConfigureAwait(false))
    {
        if (result.Choices[0] != null && result.Choices[0].FinishReason == "function_call")
        {
            foreach (var call in functionCalls)
            {
                // parse the accumulated JSON and invoke the function manually
            }
        }
        else if (result.Choices[0].Delta.FunctionCall?.Arguments != null)
        {
            // Only the first delta carries the function name
            if (result.Choices[0].Delta.FunctionCall.Name != null)
            {
                currentFunction = result.Choices[0].Delta.FunctionCall.Name;
                functionCalls.Add(currentFunction, "");
            }
            else
            {
                // collect the JSON arguments across deltas
                functionCalls[currentFunction] += result.Choices[0].Delta.FunctionCall.Arguments;
            }
        }
        foreach (var choice in result.Choices.Where(choice => !string.IsNullOrWhiteSpace(choice.Delta?.Content)))
        {
            streamGetter(choice.Delta.Content);
        }
    }
}
catch (Exception ex)
{
    Console.WriteLine(ex.Message);
}
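The accumulation pattern above can be isolated into a small helper. This is only a sketch — the `StreamDelta` record below is a hypothetical stand-in for the library's streamed delta type, not its real API — but it shows the buffering that a higher-level, event-based API could hide from consumers:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

// Hypothetical stand-in for one streamed chunk; not the library's real type.
public record StreamDelta(string? Name, string? Arguments, string? FinishReason);

public static class FunctionCallBuffer
{
    // Buffers name/argument fragments until finish_reason == "function_call",
    // then returns each completed call's full JSON argument string.
    public static Dictionary<string, string> Collect(IEnumerable<StreamDelta> deltas)
    {
        var buffers = new Dictionary<string, StringBuilder>();
        string current = "";
        foreach (var d in deltas)
        {
            if (d.FinishReason == "function_call")
                break;                          // arguments are complete
            if (d.Name != null)                 // only the first fragment carries the name
                buffers[current = d.Name] = new StringBuilder();
            if (d.Arguments != null && current != "")
                buffers[current].Append(d.Arguments);
        }
        var calls = new Dictionary<string, string>();
        foreach (var (name, json) in buffers)
            calls[name] = json.ToString();
        return calls;
    }
}
```

Feeding it fragments such as `{"city":` followed by `"Paris"}` yields one completed entry with the full JSON, ready to deserialize and dispatch.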
Hey everyone, I'm sorry for the delay on my end. I've been swamped with my day job and I'm behind on triaging PRs here. I will do my best to get to this within the next few days. Thanks for your understanding 😅
Efficient implementation of function calling in conjunction with streaming: https://github.com/lofcz/OpenAI-API-dotnet/commit/52b78842d8d223441732f892b29d0d1578b61c0f#diff-8bd7aaf5181bcc44301c5b518a91bee8f7e4591461c1c19dc8a4a55e74354e07R383
@hansjm10 I think this event-based approach and abstracting buffering away from the end-user presents a better experience for API consumers.
@OkGoDoIt can you provide us with an additional update on this?
@hansjm10 I've fixed the issue where function_call couldn't be used properly during re-streaming. Now function_call can be used normally in all scenarios. I've also added example code for using stream transmission; it's located in test/ChatEndpointTest.cs, within the method named SummarizeFunctionStreamResult.
Looks good and I'm ready to merge it but just revert the changes to the proj files.
You could cherrypick the two commits that don't mess with the proj files as well
There's a lot of weird intending issues in a lot of the files as well
I think @ErikDombi means "indenting issues" as in whitespace, and formatting is all over the place 😄
Is there any status on this? Eager to try out this functionality
I have this code, and the token prints empty — can someone help me?
var functionParameters = new
{
    type = "object",
    properties = new
    {
        scary = new
        {
            type = "string",
            description = "A scary version of the response to a user's query"
        },
        joyful = new
        {
            type = "string",
            description = "A joyful version of the response to a user's query"
        }
    },
    required = new[]
    {
        "scary",
        "joyful"
    }
};
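For what it's worth, an anonymous object like the one above serializes to the JSON Schema shape the API expects, which is why passing it directly works while passing pre-serialized JSON as a string fails. A quick way to inspect the payload (using System.Text.Json here purely for illustration; the library itself uses Newtonsoft):

```csharp
using System;
using System.Text.Json;

var functionParameters = new
{
    type = "object",
    properties = new
    {
        scary = new { type = "string", description = "A scary version of the response" }
    },
    required = new[] { "scary" }
};

// The model expects a real JSON object here; sending this same JSON
// re-serialized as a string is what makes the request fail.
string schema = JsonSerializer.Serialize(functionParameters);
Console.WriteLine(schema);
```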
var functions = new List<Function>
{
    new Function(
        "responses",
        "ingest the emotion",
        functionParameters
    )
};

var functionCall = new FunctionCall
{
    Name = "responses",
    // Arguments = "scary"
};

// Send the entire chat to OpenAI to get the next message
await foreach (var token in api.Chat.StreamChatEnumerableAsync(new ChatRequest()
{
    Model = Model.ChatGPTTurbo,
    Temperature = 0.5,
    Messages = messages,
    Functions = functions,
    FunctionCall = functionCall
}))
{
    Console.WriteLine("token ", token);
    .......
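One likely cause of the empty output, unrelated to the API itself: the first parameter of `Console.WriteLine(string, object)` is a composite format string, and `"token "` contains no `{0}` placeholder, so the second argument is silently ignored. A minimal demonstration of the difference:

```csharp
using System;

object token = "hello";
Console.WriteLine("token ", token);    // prints just "token " – the argument is dropped
Console.WriteLine("token {0}", token); // prints "token hello"
Console.WriteLine($"token {token}");   // interpolation, same result
```

Note also that `token` in the snippet above is the streamed result object, so even with a placeholder you likely want a specific property of it (such as the delta content) rather than the whole object.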
Looks like the PR got killed. Can we get it back?
did it get merged?
@EvoTechMike, no, I haven't merged any PR's in a while. I implemented some of this myself but I have not yet implemented the function calling.
Function calling is the only reason I can't use this beautiful package. Please implement it.
@OkGoDoIt If you don't have time to implement or approve features yourself, why not let your contributors do the job? Even if a community implementation doesn't meet your quality standards and gets replaced by your own later on, it would still be much better progress than none. Right now we have a feature gap that grows from day to day. I understand that you have your own personal goals, but a project that has become this popular needs to stay up to date and feature complete. So please reconsider how you manage this project going forward.
In the interim, one of the forks supporting Functions, Assistants, and other new features can be used:
https://github.com/RageAgainstThePixel/OpenAI-DotNet - Supports Unity, .NET Standard & .NET Core; frequently updated; great test coverage.
https://github.com/lofcz/OpenAiNg - .NET Core only; high performance and customizability; supports locally hosted models; responses include the raw HTTP request. A high-level API for using functions and streaming together is included.
(Ordered by ⭐️ at the time of writing)
Thanks
@lofcz Out of curiosity, how did you find these forks? When I go to https://github.com/OkGoDoIt/OpenAI-API-dotnet/forks there aren't any forks with more than 1 star.
edit: d'oh! I see one of them is yours. Still wondering why they don't show up as forks, though.
Both RageAgainstThePixel and I hard-forked some time ago. I maintained a soft fork for about half a year, but then the changes diverged too much from upstream, hence the hard fork.
I suggest archiving this repository if the author no longer maintains it. It is funny that such a feature has not been merged after 8 months.
This commit adds function support as described in the latest June 13th update. It also adds the new models that support functions. This closes #146.