KeyserDSoze / Rystem.OpenAi

.NET wrapper for OpenAI with dependency injection integration, factory integration (you may inject more than one endpoint), and Azure integration (you may swap between the OpenAI endpoint and any Azure endpoint quickly and easily). You can calculate tokens and cost for each request (before sending it) and for each response.
MIT License

Tool_choice error on Chat requests #82

Open xrmisaac opened 2 months ago

xrmisaac commented 2 months ago

Describe the bug

I'm getting an error returned when attempting a chat request.

System.Net.Http.HttpRequestException : { "error": { "message": "Invalid value for 'tool_choice': 'tool_choice' is only allowed when 'tools' are specified.", "type": "invalid_request_error", "param": "tool_choice", "code": null } }

I'm guessing the tools array is being treated as a mandatory object on the request. I can get requests to GPT-4 working by defining an empty function and attaching it with .WithFunction(factory.NullFunction), but given that GPT-3.5 doesn't support functions, I don't know how I'm supposed to make requests to that model.
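Presumably the serialized request ends up containing a tool_choice value without an accompanying tools array, which is exactly what the error message says the API rejects. A sketch of the offending payload shape (field values are illustrative, not taken from the library's actual output):

```json
{
  "model": "gpt-4",
  "messages": [ { "role": "user", "content": "Hello!! How are you?" } ],
  "temperature": 1,
  "tool_choice": "none"
}
```

Removing the tool_choice key (or adding a non-empty tools array) should make the same payload acceptable.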

To Reproduce

Use the example code from https://github.com/KeyserDSoze/Rystem.OpenAi?tab=readme-ov-file#chat

Code snippets

string apiKey = "sk-###############################";

OpenAiService.Instance.AddOpenAi(settings =>
{
    settings.ApiKey = apiKey;
}, "NoDI");

var openAiApi = OpenAiService.Factory.Create("NoDI");
var results = await openAiApi.Chat
        .Request(new ChatMessage { Role = ChatRole.User, Content = "Hello!! How are you?" })
        .WithModel(ChatModelType.Gpt4)
        .WithTemperature(1)
        .ExecuteAsync();

OS

Windows

.Net version

.Net 6.0

Library version

3.3.12

aallfredo commented 2 months ago

I have similar problems. I started using the library yesterday.

MarioMartinPlaza commented 2 months ago

+1, I have the same exception

AleksandrFurmenkovOfficial commented 2 months ago

Use the fix in ChatRequest.cs

using System.Collections.Generic;
using System.Text.Json.Serialization;

namespace Rystem.OpenAi.Chat
{
    /// <summary>
    /// Represents a request to the chat API.
    /// </summary>
    public sealed class ChatRequest : IOpenAiRequest
    {
        [JsonPropertyName("model")]
        public string? ModelId { get; set; }
        [JsonPropertyName("messages")]
        public List<ChatMessage>? Messages { get; set; }
        [JsonPropertyName("temperature")]
        public double? Temperature { get; set; }
        [JsonPropertyName("top_p")]
        public double? TopP { get; set; }
        [JsonPropertyName("stream")]
        public bool Stream { get; internal set; } = false;
        [JsonPropertyName("stop")]
        public object? StopSequence { get; set; }
        [JsonPropertyName("max_tokens")]
        public int? MaxTokens { get; set; }
        [JsonPropertyName("presence_penalty")]
        public double? PresencePenalty { get; set; }
        [JsonPropertyName("frequency_penalty")]
        public double? FrequencyPenalty { get; set; }
        [JsonPropertyName("n")]
        public int? NumberOfChoicesPerPrompt { get; set; }
        [JsonPropertyName("logit_bias")]
        public Dictionary<string, int>? Bias { get; set; }
        [JsonPropertyName("user")]
        public string? User { get; set; }
        /// <summary>
        /// This feature is in Beta. If specified, our system will make a best effort to sample deterministically,
        /// such that repeated requests with the same seed and parameters should return the same result. 
        /// Determinism is not guaranteed, and you should refer to the system_fingerprint response parameter to monitor changes in the backend.
        /// </summary>
        [JsonPropertyName("seed")]
        public int? Seed { get; set; }
        /// <summary>
        /// Controls which (if any) function is called by the model. none means the model will not call a function and instead generates a message.
        /// auto means the model can pick between generating a message or calling a function. 
        /// Specifying a particular function via {"type": "function", "function": {"name": "my_function"}} forces the model to call that function. 
        /// none is the default when no functions are present. auto is the default if functions are present.
        /// </summary>
        [JsonIgnore]
        [JsonPropertyName("tool_choice")]
        public object? ToolChoice { get; set; }
        /// <summary>
        /// A list of tools the model may call. Currently, only functions are supported as a tool. 
        /// Use this to provide a list of functions the model may generate JSON inputs for.
        /// </summary>
        [JsonPropertyName("tools")]
        public List<object>? Tools { get; set; }

        public bool ShouldSerializeToolChoice()
        {
            return Tools != null && Tools.Count > 0;
        }
    }
}
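Note that System.Text.Json does not honor ShouldSerializeXxx methods (that is a Newtonsoft.Json convention), and a bare [JsonIgnore] means tool_choice is never serialized at all, even when tools are set. A sketch of an alternative that omits both fields only when they are null, using System.Text.Json's JsonIgnoreCondition.WhenWritingNull (class trimmed to the relevant properties; this is a suggestion, not the library's actual code):

```csharp
using System.Collections.Generic;
using System.Text.Json.Serialization;

namespace Rystem.OpenAi.Chat
{
    public sealed class ChatRequest
    {
        [JsonPropertyName("model")]
        public string? ModelId { get; set; }

        // Omitted from the payload whenever it is null, so the API never
        // sees "tool_choice" without an accompanying "tools" array.
        [JsonPropertyName("tool_choice")]
        [JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
        public object? ToolChoice { get; set; }

        // Likewise omitted when no tools have been attached to the request.
        [JsonPropertyName("tools")]
        [JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
        public List<object>? Tools { get; set; }
    }
}
```

With this shape, serializing a request that sets neither property produces a payload containing neither tool_choice nor tools, while requests that do populate Tools keep both fields intact.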
AutreMachine commented 2 months ago

Same problem here

nluo1201 commented 1 month ago

If you aren't using function calling or tools/tool_choice, simply decorating both fields with [JsonIgnore] temporarily fixes this error:

[JsonPropertyName("tool_choice")] [JsonIgnore] public object? ToolChoice { get; set; }

[JsonPropertyName("tools")] [JsonIgnore] public List<object>? Tools { get; set; }

NormTheThird commented 1 month ago

@xrmisaac "I can get requests to GPT-4 working by having an empty function defined and attaching it with .WithFunction(factory.NullFunction)" Can you show the code that you used to get this working with GPT-4? Thanks.