HemulGM / DelphiOpenAI

OpenAI API wrapper for Delphi. Use ChatGPT, DALL-E, Whisper and other products.
MIT License

OpenAI.Chat update to use "functions" #18

Closed MaxiDonkey closed 1 year ago

MaxiDonkey commented 1 year ago

Here is a modification of the OpenAI.Chat unit that adds support for function calling with the 0613 models.
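For context, this targets the wire format described in the OpenAI function-calling announcement: when the model decides to call a function, the assistant message comes back with a null `content` and a `function_call` object, and the caller then sends the function's result back as a `role: "function"` message. The values below are illustrative only:

```json
{
  "role": "assistant",
  "content": null,
  "function_call": {
    "name": "get_current_weather",
    "arguments": "{ \"location\": \"Boston, MA\" }"
  }
}
```

The `AssistantFunc` and `Fonction` builders below produce the Delphi-side equivalents of these two message shapes.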

unit OpenAI.Chat;

interface

uses
  System.SysUtils, OpenAI.API.Params, OpenAI.API, System.Classes;

{$SCOPEDENUMS ON}

type
  TMessageRole = (System, User, Assistant, Fonction);

  TMessageRoleHelper = record helper for TMessageRole
    function ToString: string;
    class function FromString(const Value: string): TMessageRole; static;
  end;

  TChatMessageBuild = record
  private
    FRole: TMessageRole;
    FContent: string;
    FFunction_call: string;
    FArguments: string;
    FTag: string;
    FName: string;
  public
    /// <summary>
    /// The role of the author of this message. One of system, user, assistant, or function.
    /// </summary>
    property Role: TMessageRole read FRole write FRole;
    /// <summary>
    /// The contents of the message.
    /// </summary>
    property Content: string read FContent write FContent;
    /// <summary>
    /// The function call of the message.
    /// </summary>
    property function_call: string read FFunction_call write FFunction_call;
    /// <summary>
    /// The arguments of the function called.
    /// </summary>
    property Arguments: string read FArguments write FArguments;
    /// <summary>
    /// The name of the author of this message. May contain a-z, A-Z, 0-9, and underscores, with a maximum length of 64 characters.
    /// </summary>
    property Name: string read FName write FName;
    /// <summary>
    /// Tag - custom field for convenience. Not used in requests!
    /// </summary>
    property Tag: string read FTag write FTag;
    class function Create(Role: TMessageRole; const Content: string; const Name: string = ''): TChatMessageBuild; static;
    class function User(const Content: string; const Name: string = ''): TChatMessageBuild; static;
    class function System(const Content: string; const Name: string = ''): TChatMessageBuild; static;
    class function Assistant(const Content: string; const Name: string = ''): TChatMessageBuild; static;
    class function AssistantFunc(const Function_Name: string; const Arguments: string): TChatMessageBuild; static;
    class function Fonction(const Content: string; const Name: string = ''): TChatMessageBuild; static;
  end;

  TChatFonctionBuild = record
  private
    FName: string;
    FDescription: string;
    FParameters: string;
  public
    /// <summary>
    /// The name of this function. May contain a-z, A-Z, 0-9, and underscores, with a maximum length of 64 characters.
    /// </summary>
    property Name: string read FName write FName;
    /// <summary>
    /// The description of this function.
    /// </summary>
    property Description: string read FDescription write FDescription;
    /// <summary>
    /// The parameters of this function.
    /// </summary>
    property Parameters: string read FParameters write FParameters;
    class function Fonction(const Name: string; const Description: string; const Arguments: string): TChatFonctionBuild; static;
  end;

  TChatParams = class(TJSONParam)
    /// <summary>
    /// ID of the model to use. Currently, gpt-3.5-turbo and gpt-4 are supported.
    /// TODO: define all current models
    /// </summary>
    function Model(const Value: string): TChatParams;
    /// <summary>
    /// The messages to generate chat completions for, in the chat format.
    /// </summary>
    function Messages(const Value: TArray<TChatMessageBuild>): TChatParams; overload;
    /// <summary>
    /// The functions available to the model, in the chat format.
    /// </summary>
    function Fonctions(const Value: TArray<TChatFonctionBuild>): TChatParams;
    /// <summary>
    /// What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random,
    /// while lower values like 0.2 will make it more focused and deterministic.
    /// We generally recommend altering this or top_p but not both.
    /// </summary>
    function Temperature(const Value: Single = 1): TChatParams;
    /// <summary>
    /// An alternative to sampling with temperature, called nucleus sampling, where the model considers the
    /// results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10%
    /// probability mass are considered.
    /// We generally recommend altering this or temperature but not both.
    /// </summary>
    function TopP(const Value: Single = 1): TChatParams;
    /// <summary>
    /// How many chat completion choices to generate for each input message.
    /// </summary>
    function N(const Value: Integer = 1): TChatParams;
    /// <summary>
    /// If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as
    /// data-only server-sent events as they become available, with the stream terminated by a data: [DONE] message.
    /// </summary>
    function Stream(const Value: Boolean = True): TChatParams;
    /// <summary>
    /// Up to 4 sequences where the API will stop generating further tokens.
    /// </summary>
    function Stop(const Value: string): TChatParams; overload;
    /// <summary>
    /// Up to 4 sequences where the API will stop generating further tokens.
    /// </summary>
    function Stop(const Value: TArray<string>): TChatParams; overload;
    /// <summary>
    /// The maximum number of tokens allowed for the generated answer. By default, the number of
    /// tokens the model can return will be (4096 - prompt tokens).
    /// </summary>
    function MaxTokens(const Value: Integer = 16): TChatParams;
    /// <summary>
    /// Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far,
    /// increasing the model's likelihood to talk about new topics.
    /// </summary>
    function PresencePenalty(const Value: Single = 0): TChatParams;
    /// <summary>
    /// Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far,
    /// decreasing the model's likelihood to repeat the same line verbatim.
    /// </summary>
    function FrequencyPenalty(const Value: Single = 0): TChatParams;
    /// <summary>
    /// A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse.
    /// </summary>
    function User(const Value: string): TChatParams;
    constructor Create; override;
  end;

  TChatUsage = class
  private
    FCompletion_tokens: Int64;
    FPrompt_tokens: Int64;
    FTotal_tokens: Int64;
  public
    property CompletionTokens: Int64 read FCompletion_tokens write FCompletion_tokens;
    property PromptTokens: Int64 read FPrompt_tokens write FPrompt_tokens;
    property TotalTokens: Int64 read FTotal_tokens write FTotal_tokens;
  end;

  TFunction_call = class
  private
    FName: string;
    FArguments: string;
  public
    property Name: string read FName write FName;
    property Arguments: string read FArguments write FArguments;
  end;

  TChatMessage = class
  private
    FRole: string;
    FContent: string;
    FFunction_call: TFunction_call;
  public
    property Role: string read FRole write FRole;
    property Content: string read FContent write FContent;
    property Function_call: TFunction_call read FFunction_call write FFunction_call;
    destructor Destroy; override;
  end;

  TChatChoices = class
  private
    FIndex: Int64;
    FMessage: TChatMessage;
    FFinish_reason: string;
    FDelta: TChatMessage;
  public
    property Index: Int64 read FIndex write FIndex;
    property Message: TChatMessage read FMessage write FMessage;
    property Delta: TChatMessage read FDelta write FDelta;
    /// <summary>
    /// The possible values for finish_reason are:
    /// stop: API returned complete model output
    /// length: Incomplete model output due to max_tokens parameter or token limit
    /// content_filter: Omitted content due to a flag from our content filters
    /// function_call: Returned when a function is detected
    /// null: API response still in progress or incomplete
    /// </summary>
    property FinishReason: string read FFinish_reason write FFinish_reason;
    destructor Destroy; override;
  end;

  TChat = class
  private
    FChoices: TArray<TChatChoices>;
    FCreated: Int64;
    FId: string;
    FObject: string;
    FUsage: TChatUsage;
  public
    property Id: string read FId write FId;
    property &Object: string read FObject write FObject;
    property Created: Int64 read FCreated write FCreated;
    property Choices: TArray<TChatChoices> read FChoices write FChoices;
    property Usage: TChatUsage read FUsage write FUsage;
    destructor Destroy; override;
  end;

  TChatEvent = reference to procedure(Chat: TChat; IsDone: Boolean; var Cancel: Boolean);
  /// <summary>
  /// Given a chat conversation, the model will return a chat completion response.
  /// </summary>

  TChatRoute = class(TOpenAIAPIRoute)
  public
    /// <summary>
    /// Creates a completion for the chat message
    /// </summary>
    function Create(ParamProc: TProc<TChatParams>): TChat;
    /// <summary>
    /// Creates a completion for the chat message, streaming partial results as server-sent events
    /// </summary>
    function CreateStream(ParamProc: TProc<TChatParams>; Event: TChatEvent): Boolean;
  end;

implementation

uses
  System.JSON, Rest.Json;

{ TChatRoute }

function TChatRoute.Create(ParamProc: TProc<TChatParams>): TChat;
begin
  Result := API.Post<TChat, TChatParams>('chat/completions', ParamProc);
end;

function TChatRoute.CreateStream(ParamProc: TProc<TChatParams>; Event: TChatEvent): Boolean;
var
  Response: TStringStream;
  RetPos: Integer;
begin
  Response := TStringStream.Create('', TEncoding.UTF8);
  try
    RetPos := 0;
    Result := API.Post<TChatParams>('chat/completions', ParamProc, Response,
      procedure(const Sender: TObject; AContentLength: Int64; AReadCount: Int64; var AAbort: Boolean)
      var
        IsDone: Boolean;
        Data: string;
        Chat: TChat;
        TextBuffer: string;
        Line: string;
        Ret: Integer;
      begin
        TextBuffer := Response.DataString;
        repeat
          Ret := TextBuffer.IndexOf(#10, RetPos);
          if Ret >= 0 then
          begin
            Line := TextBuffer.Substring(RetPos, Ret - RetPos);
            RetPos := Ret + 1;
            if Line.IsEmpty or (Line.StartsWith(#10)) then
              Continue;
            Chat := nil;
            Data := Line.Replace('data: ', '').Trim([' ', #13, #10]);
            IsDone := Data = '[DONE]';
            if not IsDone then
            begin
              try
                Chat := TJson.JsonToObject<TChat>(Data);
              except
                Chat := nil;
              end;
            end;
            try
              Event(Chat, IsDone, AAbort);
            finally
              if Assigned(Chat) then
                Chat.Free;
            end;
          end;
        until Ret < 0;
      end);
  finally
    Response.Free;
  end;
end;

{ TChat }

destructor TChat.Destroy;
begin
  if Assigned(FUsage) then
    FUsage.Free;
  for var Item in FChoices do
    if Assigned(Item) then
      Item.Free;
  inherited;
end;

{ TChatParams }

constructor TChatParams.Create;
begin
  inherited;
//  Model('gpt-3.5-turbo');
  Model('gpt-3.5-turbo-0613');
//  Model('gpt-3.5-turbo-16k');
end;

function TChatParams.Fonctions(
  const Value: TArray<TChatFonctionBuild>): TChatParams;
var
  Item: TChatFonctionBuild;
  JSON: TJSONObject;
  Items: TJSONArray;
begin
  Items := TJSONArray.Create;
  for Item in Value do
  begin
    JSON := TJSONObject.Create;
    JSON.AddPair('name', Item.Name);
    JSON.AddPair('description', Item.Description);
    JSON.AddPair('parameters', TJSONObject.ParseJSONValue(Item.Parameters));
    Items.Add(JSON);
  end;
  Result := TChatParams(Add('functions', Items));
end;

function TChatParams.FrequencyPenalty(const Value: Single): TChatParams;
begin
  Result := TChatParams(Add('frequency_penalty', Value));
end;

function TChatParams.MaxTokens(const Value: Integer): TChatParams;
begin
  Result := TChatParams(Add('max_tokens', Value));
end;

function TChatParams.Model(const Value: string): TChatParams;
begin
  Result := TChatParams(Add('model', Value));
end;

function TChatParams.N(const Value: Integer): TChatParams;
begin
  Result := TChatParams(Add('n', Value));
end;

function TChatParams.PresencePenalty(const Value: Single): TChatParams;
begin
  Result := TChatParams(Add('presence_penalty', Value));
end;

function TChatParams.Messages(const Value: TArray<TChatMessageBuild>): TChatParams;
var
  Item: TChatMessageBuild;
  JSON: TJSONObject;
  Items: TJSONArray;
begin
  Items := TJSONArray.Create;
  for Item in Value do
  begin
    JSON := TJSONObject.Create;
    JSON.AddPair('role', Item.Role.ToString);
    if Item.Content <> 'null' then JSON.AddPair('content', Item.Content)
      else begin
        JSON.AddPair('content', TJSONNull.Create);
        var LFonction := TJSONObject.Create(
          TJSONPair.Create('name', Item.FFunction_call));
        LFonction.AddPair('arguments', Format('{ %s}', [Item.Arguments]));
        JSON.AddPair('function_call', LFonction);
      end;
    if not Item.Name.IsEmpty then
      JSON.AddPair('name', Item.Name);
    Items.Add(JSON);
  end;
  Result := TChatParams(Add('messages', Items));
end;

function TChatParams.Stop(const Value: TArray<string>): TChatParams;
begin
  Result := TChatParams(Add('stop', Value));
end;

function TChatParams.Stop(const Value: string): TChatParams;
begin
  Result := TChatParams(Add('stop', Value));
end;

function TChatParams.Stream(const Value: Boolean): TChatParams;
begin
  Result := TChatParams(Add('stream', Value));
end;

function TChatParams.Temperature(const Value: Single): TChatParams;
begin
  Result := TChatParams(Add('temperature', Value));
end;

function TChatParams.TopP(const Value: Single): TChatParams;
begin
  Result := TChatParams(Add('top_p', Value));
end;

function TChatParams.User(const Value: string): TChatParams;
begin
  Result := TChatParams(Add('user', Value));
end;

{ TChatMessageBuild }

class function TChatMessageBuild.Assistant(const Content: string; const Name: string = ''): TChatMessageBuild;
begin
  Result.FRole := TMessageRole.Assistant;
  Result.FContent := Content;
  Result.FName := Name;
end;

class function TChatMessageBuild.AssistantFunc(const Function_Name: string;
  const Arguments: string): TChatMessageBuild;
begin
  Result.FRole := TMessageRole.Assistant;
  Result.FContent := 'null';
  Result.function_call := Function_Name;
  Result.Arguments := Arguments;
end;

class function TChatMessageBuild.Create(Role: TMessageRole; const Content: string; const Name: string = ''): TChatMessageBuild;
begin
  Result.FRole := Role;
  Result.FContent := Content;
  Result.FName := Name;
end;

class function TChatMessageBuild.Fonction(const Content,
  Name: string): TChatMessageBuild;
begin
  Result.FRole := TMessageRole.Fonction;
  Result.FContent := Content;
  Result.Name := Name;
end;

class function TChatMessageBuild.System(const Content: string; const Name: string = ''): TChatMessageBuild;
begin
  Result.FRole := TMessageRole.System;
  Result.FContent := Content;
  Result.FName := Name;
end;

class function TChatMessageBuild.User(const Content: string; const Name: string = ''): TChatMessageBuild;
begin
  Result.FRole := TMessageRole.User;
  Result.FContent := Content;
  Result.FName := Name;
end;

{ TMessageRoleHelper }

class function TMessageRoleHelper.FromString(const Value: string): TMessageRole;
begin
  if Value = 'system' then
    Exit(TMessageRole.System)
  else if Value = 'user' then
    Exit(TMessageRole.User)
  else if Value = 'assistant' then
    Exit(TMessageRole.Assistant)
  else if Value = 'function' then
    Exit(TMessageRole.Fonction)
  else
    Result := TMessageRole.User;
end;

function TMessageRoleHelper.ToString: string;
begin
  case Self of
    TMessageRole.System:
      Result := 'system';
    TMessageRole.User:
      Result := 'user';
    TMessageRole.Assistant:
      Result := 'assistant';
    TMessageRole.Fonction:
      Result := 'function';
  end;
end;

{ TChatChoices }

destructor TChatChoices.Destroy;
begin
  if Assigned(FMessage) then
    FMessage.Free;
  if Assigned(FDelta) then
    FDelta.Free;
  inherited;
end;

{ TChatFonctionBuild }

class function TChatFonctionBuild.Fonction(const Name: string;
  const Description: string; const Arguments: string): TChatFonctionBuild;
begin
  Result.Name := Name;
  Result.Description := Description;
  Result.Parameters := Arguments;
end;

{ TChatMessage }

destructor TChatMessage.Destroy;
begin
  if Assigned(FFunction_call) then
    FFunction_call.Free;
  inherited;
end;

end.

Usage examples, based on https://openai.com/blog/function-calling-and-other-api-updates:

function Script: string;
begin
  with TStringWriter.Create do
  try
    WriteLine('{');
    WriteLine('  "type": "object",');
    WriteLine('  "properties": {');
    WriteLine('    "location": {');
    WriteLine('      "type": "string",');
    WriteLine('      "description": "The city or region, e.g. Paris, Île-de-France"},');
    WriteLine('    "unit": {');
    WriteLine('      "type": "string",');
    WriteLine('      "enum": ["celsius", "fahrenheit"]}},');
    WriteLine('  "required": ["location"]');
    WriteLine('}');
    Result := ToString;
  finally
    Free;
  end;
end;

1) Standard completion:

var OpenAI: IOpenAI := TOpenAI.Create(KEY);

try
  var Chat := OpenAI.Chat.Create(
    procedure(Params: TChatParams)
    begin
      Params.Messages([
        TChatMessageBuild.System('The current time is ' + TimeToStr(Now) + '.'),
        TChatMessageBuild.User(Memo1.Text)
        // TChatMessageBuild.AssistantFunc('get_current_weather', '"location": "Boston, MA"'),
        // TChatMessageBuild.Fonction('"temperature": "22", "unit": "celsius", "description": "Sunny"', 'get_current_weather')
      ]);
      Params.Fonctions([
        TChatFonctionBuild.Fonction('get_current_weather',
          'Get the current weather in a given location', Script)
      ]);
      Params.MaxTokens(1024);
    end);
  try
    for var Choice in Chat.Choices do
    begin
      try
        Memo1.Lines.Add(Choice.Message.Function_call.Name);
        Memo1.Lines.Add(Choice.Message.Function_call.Arguments);
      except
        // Function_call is nil when the model answered with plain content
      end;
      Memo1.Lines.Add(Choice.Message.Content);
    end;
  finally
    Chat.Free;
  end;
except
  Memo1.Text := 'error';
  raise;
end;

2) Streaming completion:

var
  Function_call: string;
  Arguments: string;
begin
  var OpenAI: IOpenAI := TOpenAI.Create(KEY);

  OpenAI.Chat.CreateStream(
    procedure(Params: TChatParams)
    begin
      Params.Messages([
        TChatMessageBuild.System('The current time is ' + TimeToStr(Now)),
        TChatMessageBuild.User(Memo1.Text)
      ]);
      Params.Fonctions([
        TChatFonctionBuild.Fonction('get_current_weather',
          'Get the current weather in a given location', Script)
      ]);
      Params.MaxTokens(1024);
      Params.Stream;
    end,
    procedure(Chat: TChat; IsDone: Boolean; var Cancel: Boolean)
    begin
      if (not IsDone) and Assigned(Chat) then
      begin
        Memo1.Text := Memo1.Text + Chat.Choices[0].Delta.Content;
        try
          // Accumulate the function name and the streamed argument fragments
          if Function_call = EmptyStr then
            Function_call := Chat.Choices[0].Delta.Function_call.Name;
          Arguments := Arguments + Chat.Choices[0].Delta.Function_call.Arguments;
        except
          // Delta.Function_call is nil for plain content chunks
        end;
        Application.ProcessMessages;
      end
      else if IsDone then
        Memo1.Text := Memo1.Text + #13;
      Sleep(100);
    end);
  if Function_call <> EmptyStr then
    Memo1.Lines.Add(Function_call);
  if Arguments <> EmptyStr then
    Memo1.Lines.Add(Arguments);
end;

HemulGM commented 1 year ago

Very cool, thanks. I'll update tomorrow or the day after tomorrow.

MaxiDonkey commented 1 year ago

Here is a cleaner version of the OpenAI.Chat unit, together with a unit, OpenAI.Chat0613, for easily testing the 0613 models. OpenAI.Chat0613 manages a catalog of functions as well as the internal or external execution of each one. I hope this helps the project.
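OpenAI.Chat0613 itself is not attached to this comment, so the catalog idea can only be sketched. A minimal shape, with all names hypothetical, might look like this:

```pascal
uses
  System.Generics.Collections;

type
  // Hypothetical sketch only; the real OpenAI.Chat0613 API may differ.
  // A handler receives the JSON arguments string produced by the model and
  // returns the result to send back as a role "function" message.
  TFunctionHandler = reference to function(const Arguments: string): string;

  TFunctionCatalog = class
  private
    FItems: TDictionary<string, TFunctionHandler>;
  public
    // Registers a named function and the handler that executes it
    procedure RegisterFunc(const Name: string; const Handler: TFunctionHandler);
    // Looks up the named function and runs it with the model's arguments
    function Execute(const FunctionName, Arguments: string): string;
  end;
```

The point of such a catalog is that the finish_reason = 'function_call' branch can dispatch to a registered handler by name instead of hard-coding each function at the call site.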

unit OpenAI.Chat;

interface

uses
  System.SysUtils, OpenAI.API.Params, OpenAI.API, System.Classes, System.JSON;

{$SCOPEDENUMS ON}

type
  TMessageRole = (System, User, Assistant, Fonction);

  TMessageRoleHelper = record helper for TMessageRole
    function ToString: string;
    class function FromString(const Value: string): TMessageRole; static;
  end;

  /// <summary>
  /// Build function body as text.
  /// </summary>
  TBodyBuild = reference to function: string;

  TChatMessageBuild = record
  private
    FRole: TMessageRole;
    FContent: string;
    FFunction_call: string;
    FArguments: string;
    FTag: string;
    FName: string;
  public
    /// <summary>
    /// The role of the author of this message. One of system, user, assistant, or function.
    /// </summary>
    property Role: TMessageRole read FRole write FRole;
    /// <summary>
    /// The contents of the message.
    /// </summary>
    property Content: string read FContent write FContent;
    /// <summary>
    /// The function call of the message.
    /// </summary>
    property function_call: string read FFunction_call write FFunction_call;
    /// <summary>
    /// The arguments of the function called.
    /// </summary>
    property Arguments: string read FArguments write FArguments;
    /// <summary>
    /// The name of the author of this message. May contain a-z, A-Z, 0-9, and underscores, with a maximum length of 64 characters.
    /// </summary>
    property Name: string read FName write FName;
    /// <summary>
    /// Tag - custom field for convenience. Not used in requests!
    /// </summary>
    property Tag: string read FTag write FTag;
    class function Create(Role: TMessageRole; const Content: string; const Name: string = ''): TChatMessageBuild; static;
    class function User(const Content: string; const Name: string = ''): TChatMessageBuild; static;
    class function System(const Content: string; const Name: string = ''): TChatMessageBuild; overload; static;
    class function System(const Content: TBodyBuild; const Name: string = ''): TChatMessageBuild; overload; static;
    class function Assistant(const Content: string; const Name: string = ''): TChatMessageBuild; static;
    class function AssistantFunc(const Function_Name: string; const Arguments: string): TChatMessageBuild; static;
    class function Fonction(const Content: string; const Name: string): TChatMessageBuild; static;
  end;

  TChatFonctionBuild = record
  private
    FName: string;
    FDescription: string;
    FParameters: string;
  public
    /// <summary>
    /// The name of this function. May contain a-z, A-Z, 0-9, and underscores, with a maximum length of 64 characters.
    /// </summary>
    property Name: string read FName write FName;
    /// <summary>
    /// The description of this function.
    /// </summary>
    property Description: string read FDescription write FDescription;
    /// <summary>
    /// The parameters of this function.
    /// </summary>
    property Parameters: string read FParameters write FParameters;
    class function Fonction(const Name: string; const Description: string;
      const ArgFunc: TBodyBuild): TChatFonctionBuild; overload; static;
    class function Fonction(const Name: string; const Description: string;
      const Arguments: string): TChatFonctionBuild; overload; static;
  end;

  TChatParams = class(TJSONParam)
    /// <summary>
    /// ID of the model to use. Currently, gpt-3.5-turbo and gpt-4 are supported.
    /// TODO: define all current models
    /// </summary>
    function Model(const Value: string): TChatParams;
    /// <summary>
    /// The messages to generate chat completions for, in the chat format.
    /// </summary>
    function Messages(const Value: TArray<TChatMessageBuild>): TChatParams; overload;
    /// <summary>
    /// The functions available to the model, in the chat format.
    /// </summary>
    function Fonctions(const Value: TArray<TChatFonctionBuild>): TChatParams; overload;
    /// <summary>
    /// Use this overload when the request includes several functions and you build the function list yourself.
    /// Use with OpenAI.Chat0613.
    /// </summary>
    function Fonctions(const Value: TJSONArray): TChatParams; overload;
    /// <summary>
    /// What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random,
    /// while lower values like 0.2 will make it more focused and deterministic.
    /// We generally recommend altering this or top_p but not both.
    /// </summary>
    function Temperature(const Value: Single = 1): TChatParams;
    /// <summary>
    /// An alternative to sampling with temperature, called nucleus sampling, where the model considers the
    /// results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10%
    /// probability mass are considered.
    /// We generally recommend altering this or temperature but not both.
    /// </summary>
    function TopP(const Value: Single = 1): TChatParams;
    /// <summary>
    /// How many chat completion choices to generate for each input message.
    /// </summary>
    function N(const Value: Integer = 1): TChatParams;
    /// <summary>
    /// If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as
    /// data-only server-sent events as they become available, with the stream terminated by a data: [DONE] message.
    /// </summary>
    function Stream(const Value: Boolean = True): TChatParams;
    /// <summary>
    /// Up to 4 sequences where the API will stop generating further tokens.
    /// </summary>
    function Stop(const Value: string): TChatParams; overload;
    /// <summary>
    /// Up to 4 sequences where the API will stop generating further tokens.
    /// </summary>
    function Stop(const Value: TArray<string>): TChatParams; overload;
    /// <summary>
    /// The maximum number of tokens allowed for the generated answer. By default, the number of
    /// tokens the model can return will be (4096 - prompt tokens).
    /// </summary>
    function MaxTokens(const Value: Integer = 16): TChatParams;
    /// <summary>
    /// Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far,
    /// increasing the model's likelihood to talk about new topics.
    /// </summary>
    function PresencePenalty(const Value: Single = 0): TChatParams;
    /// <summary>
    /// Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far,
    /// decreasing the model's likelihood to repeat the same line verbatim.
    /// </summary>
    function FrequencyPenalty(const Value: Single = 0): TChatParams;
    /// <summary>
    /// A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse.
    /// </summary>
    function User(const Value: string): TChatParams;
    constructor Create; override;
  end;

  TChatUsage = class
  private
    FCompletion_tokens: Int64;
    FPrompt_tokens: Int64;
    FTotal_tokens: Int64;
  public
    property CompletionTokens: Int64 read FCompletion_tokens write FCompletion_tokens;
    property PromptTokens: Int64 read FPrompt_tokens write FPrompt_tokens;
    property TotalTokens: Int64 read FTotal_tokens write FTotal_tokens;
  end;

  TFunction_call = class
  private
    FName: string;
    FArguments: string;
  public
    property Name: string read FName write FName;
    property Arguments: string read FArguments write FArguments;
  end;

  TChatMessage = class
  private
    FRole: string;
    FContent: string;
    FFunction_call: TFunction_call;
  public
    property Role: string read FRole write FRole;
    property Content: string read FContent write FContent;
    property Function_call: TFunction_call read FFunction_call write FFunction_call;
    destructor Destroy; override;
  end;

  TChatChoices = class
  private
    FIndex: Int64;
    FMessage: TChatMessage;
    FFinish_reason: string;
    FDelta: TChatMessage;
  public
    property Index: Int64 read FIndex write FIndex;
    property Message: TChatMessage read FMessage write FMessage;
    property Delta: TChatMessage read FDelta write FDelta;
    /// <summary>
    /// The possible values for finish_reason are:
    /// stop: API returned complete model output
    /// length: Incomplete model output due to max_tokens parameter or token limit
    /// content_filter: Omitted content due to a flag from our content filters
    /// function_call: Returned when a function is detected
    /// null: API response still in progress or incomplete
    /// </summary>
    property FinishReason: string read FFinish_reason write FFinish_reason;
    destructor Destroy; override;
  end;

  TChat = class
  private
    FChoices: TArray<TChatChoices>;
    FCreated: Int64;
    FId: string;
    FObject: string;
    FUsage: TChatUsage;
  public
    property Id: string read FId write FId;
    property &Object: string read FObject write FObject;
    property Created: Int64 read FCreated write FCreated;
    property Choices: TArray<TChatChoices> read FChoices write FChoices;
    property Usage: TChatUsage read FUsage write FUsage;
    destructor Destroy; override;
  end;

  TChatEvent = reference to procedure(Chat: TChat; IsDone: Boolean; var Cancel: Boolean);
  /// <summary>
  /// Given a chat conversation, the model will return a chat completion response.
  /// </summary>

  TChatRoute = class(TOpenAIAPIRoute)
  public
    /// <summary>
    /// Creates a completion for the chat message
    /// </summary>
    function Create(ParamProc: TProc<TChatParams>): TChat;
    /// <summary>
    /// Creates a completion for the chat message, streaming partial results as server-sent events
    /// </summary>
    function CreateStream(ParamProc: TProc<TChatParams>; Event: TChatEvent): Boolean;
  end;

implementation

uses
  Rest.Json;

{ TChatRoute }

function TChatRoute.Create(ParamProc: TProc<TChatParams>): TChat;
begin
  Result := API.Post<TChat, TChatParams>('chat/completions', ParamProc);
end;

function TChatRoute.CreateStream(ParamProc: TProc<TChatParams>; Event: TChatEvent): Boolean;
var
  Response: TStringStream;
  RetPos: Integer;
begin
  Response := TStringStream.Create('', TEncoding.UTF8);
  try
    RetPos := 0;
    Result := API.Post<TChatParams>('chat/completions', ParamProc, Response,
      procedure(const Sender: TObject; AContentLength: Int64; AReadCount: Int64; var AAbort: Boolean)
      var
        IsDone: Boolean;
        Data: string;
        Chat: TChat;
        TextBuffer: string;
        Line: string;
        Ret: Integer;
      begin
        TextBuffer := Response.DataString;
        repeat
          Ret := TextBuffer.IndexOf(#10, RetPos);
          if Ret >= 0 then
          begin
            Line := TextBuffer.Substring(RetPos, Ret - RetPos);
            RetPos := Ret + 1;
            if Line.IsEmpty or (Line.StartsWith(#10)) then
              Continue;
            Chat := nil;
            Data := Line.Replace('data: ', '').Trim([' ', #13, #10]);
            IsDone := Data = '[DONE]';
            if not IsDone then
            begin
              try
                Chat := TJson.JsonToObject<TChat>(Data);
              except
                Chat := nil;
              end;
            end;
            try
              Event(Chat, IsDone, AAbort);
            finally
              if Assigned(Chat) then
                Chat.Free;
            end;
          end;
        until Ret < 0;
      end);
  finally
    Response.Free;
  end;
end;

{ TChat }

destructor TChat.Destroy;
begin
  if Assigned(FUsage) then
    FUsage.Free;
  for var Item in FChoices do
    if Assigned(Item) then
      Item.Free;
  inherited;
end;

{ TChatParams }

constructor TChatParams.Create;
begin
  inherited;
  Model('gpt-3.5-turbo');
end;

function TChatParams.Fonctions(
  const Value: TArray<TChatFonctionBuild>): TChatParams;
var
  JSON: TJSONObject;
  Items: TJSONArray;
begin
  Items := TJSONArray.Create;
  for var Item in Value do
  begin
    JSON := TJSONObject.Create;
    JSON.AddPair('name', Item.Name);
    JSON.AddPair('description', Item.Description);
    JSON.AddPair('parameters', TJSONObject.ParseJSONValue(Item.Parameters));
    Items.Add(JSON);
  end;
  Result := TChatParams(Add('functions', Items));
end;

function TChatParams.Fonctions(const Value: TJSONArray): TChatParams;
begin
  Result := TChatParams(Add('functions', Value));
end;

function TChatParams.FrequencyPenalty(const Value: Single): TChatParams;
begin
  Result := TChatParams(Add('frequency_penalty', Value));
end;

function TChatParams.MaxTokens(const Value: Integer): TChatParams;
begin
  Result := TChatParams(Add('max_tokens', Value));
end;

function TChatParams.Model(const Value: string): TChatParams;
begin
  Result := TChatParams(Add('model', Value));
end;

function TChatParams.N(const Value: Integer): TChatParams;
begin
  Result := TChatParams(Add('n', Value));
end;

function TChatParams.PresencePenalty(const Value: Single): TChatParams;
begin
  Result := TChatParams(Add('presence_penalty', Value));
end;

function TChatParams.Messages(const Value: TArray<TChatMessageBuild>): TChatParams;
var
  Item: TChatMessageBuild;
  JSON: TJSONObject;
  Items: TJSONArray;
begin
  Items := TJSONArray.Create;
  for Item in Value do
  begin
    JSON := TJSONObject.Create;
    JSON.AddPair('role', Item.Role.ToString);
    if Item.Content <> 'null' then JSON.AddPair('content', Item.Content)
      else begin
        JSON.AddPair('content', TJSONNull.Create);
        var LFonction := TJSONObject.Create(
          TJSONPair.Create('name', Item.FFunction_call));
        LFonction.AddPair('arguments', Item.Arguments);
        JSON.AddPair('function_call', LFonction);
      end;
    if not Item.Name.IsEmpty then
      JSON.AddPair('name', Item.Name);
    Items.Add(JSON);
  end;
  Result := TChatParams(Add('messages', Items));
end;

function TChatParams.Stop(const Value: TArray<string>): TChatParams;
begin
  Result := TChatParams(Add('stop', Value));
end;

function TChatParams.Stop(const Value: string): TChatParams;
begin
  Result := TChatParams(Add('stop', Value));
end;

function TChatParams.Stream(const Value: Boolean): TChatParams;
begin
  Result := TChatParams(Add('stream', Value));
end;

function TChatParams.Temperature(const Value: Single): TChatParams;
begin
  Result := TChatParams(Add('temperature', Value));
end;

function TChatParams.TopP(const Value: Single): TChatParams;
begin
  Result := TChatParams(Add('top_p', Value));
end;

function TChatParams.User(const Value: string): TChatParams;
begin
  Result := TChatParams(Add('user', Value));
end;

{ TChatMessageBuild }

class function TChatMessageBuild.Assistant(const Content: string; const Name: string = ''): TChatMessageBuild;
begin
  Result.FRole := TMessageRole.Assistant;
  Result.FContent := Content;
  Result.FName := Name;
end;

class function TChatMessageBuild.AssistantFunc(const Function_Name: string;
  const Arguments: string): TChatMessageBuild;
begin
  Result.FRole := TMessageRole.Assistant;
  Result.FContent := 'null';
  Result.function_call := Function_Name;
  Result.Arguments := Arguments;
end;

class function TChatMessageBuild.Create(Role: TMessageRole; const Content: string; const Name: string = ''): TChatMessageBuild;
begin
  Result.FRole := Role;
  Result.FContent := Content;
  Result.FName := Name;
end;

class function TChatMessageBuild.Fonction(const Content,
  Name: string): TChatMessageBuild;
begin
  Result.FRole := TMessageRole.Fonction;
  Result.FContent := Content;
  Result.Name := Name;
end;

class function TChatMessageBuild.System(const Content: TBodyBuild;
  const Name: string): TChatMessageBuild;
begin
  Result.FRole := TMessageRole.System;
  Result.FContent := Content();
  Result.FName := Name;
end;

class function TChatMessageBuild.System(const Content: string; const Name: string = ''): TChatMessageBuild;
begin
  Result.FRole := TMessageRole.System;
  Result.FContent := Content;
  Result.FName := Name;
end;

class function TChatMessageBuild.User(const Content: string; const Name: string = ''): TChatMessageBuild;
begin
  Result.FRole := TMessageRole.User;
  Result.FContent := Content;
  Result.FName := Name;
end;

{ TMessageRoleHelper }

class function TMessageRoleHelper.FromString(const Value: string): TMessageRole;
begin
  if Value = 'system' then
    Exit(TMessageRole.System)
  else if Value = 'user' then
    Exit(TMessageRole.User)
  else if Value = 'assistant' then
    Exit(TMessageRole.Assistant)
  else if Value = 'function' then
    Exit(TMessageRole.Fonction)
  else
    Result := TMessageRole.User;
end;

function TMessageRoleHelper.ToString: string;
begin
  case Self of
    TMessageRole.System:
      Result := 'system';
    TMessageRole.User:
      Result := 'user';
    TMessageRole.Assistant:
      Result := 'assistant';
    TMessageRole.Fonction:
      Result := 'function';
  end;
end;

{ TChatChoices }

destructor TChatChoices.Destroy;
begin
  if Assigned(FMessage) then
    FMessage.Free;
  if Assigned(FDelta) then
    FDelta.Free;
  inherited;
end;

{ TChatFonctionBuild }

class function TChatFonctionBuild.Fonction(const Name: string;
  const Description: string; const ArgFunc: TBodyBuild): TChatFonctionBuild;
begin
  Result.Name := Name;
  Result.Description := Description;
  Result.Parameters := ArgFunc();
end;

class function TChatFonctionBuild.Fonction(const Name, Description,
  Arguments: string): TChatFonctionBuild;
begin
  Result.Name := Name;
  Result.Description := Description;
  Result.Parameters := Arguments;
end;

{ TChatMessage }

destructor TChatMessage.Destroy;
begin
  if Assigned(FFunction_call) then
    FFunction_call.Free;
  inherited;
end;

end.

The companion test unit, "OpenAI.Chat0613":

unit OpenAI.Chat0613;

interface

uses
  System.SysUtils, System.Classes,
  OpenAI, OpenAI.API.Params, OpenAI.Chat, System.JSON;

type
  TChatResultData = record
  private
    FContent: string;
    FFunction_call: string;
    FArguments: string;
    FFinishReason: string;
  public
    /// <summary>
    /// Returns True if the "FinishReason" field equals "function_call"
    /// </summary>
    function isFunction_call: Boolean;
    /// <summary>
    /// Value of field "Content" from OpenAI.Chat.Choices[0] response
    /// </summary>
    property Content: string read FContent write FContent;
    /// <summary>
    /// Value of field "Function_call" from OpenAI.Chat.Choices[0] response
    /// </summary>
    property Function_call: string read FFunction_call write FFunction_call;
    /// <summary>
    /// Value of field "Arguments" from OpenAI.Chat.Choices[0] response
    /// </summary>
    property Arguments: string read FArguments write FArguments;
    /// <summary>
    /// Value of field "FinishReason" from OpenAI.Chat.Choices[0] response
    /// </summary>
    property FinishReason: string read FFinishReason write FFinishReason;
  end;

  /// <summary>
  /// Method invoked for the selected internal or external API
  /// </summary>
  TApiFunction = reference to function(ResultData: TChatResultData): string;

  TFunctionItem = record
  private
    FFuncName: string;
    FDescription: string;
    FParameters: string;
    FApiFunction: TApiFunction;
  public
    /// <summary>
    /// Define Name of the function called
    /// </summary>
    property FuncName: string read FFuncName write FFuncName;
    /// <summary>
    /// Define Description of the function called
    /// </summary>
    property Description: string read FDescription write FDescription;
    /// <summary>
    /// Define JSON Parameters of the function called
    /// </summary>
    property Parameters: string read FParameters write FParameters;
    /// <summary>
    /// Define the method for internal or external API
    /// </summary>
    property ApiFunction: TApiFunction read FApiFunction write FApiFunction;
    /// <summary>
    /// Adds a function to the "Functions" container
    /// </summary>
    class function Add(const Name: string; const Description: string;
      const ArgFunc: TBodyBuild; const AApiFunction: TApiFunction): TFunctionItem; static;
  end;

  TFunctions = class(TComponent)
  private
    FItems: TArray<TFunctionItem>;
    FCount: Integer;
    function GetItem(index: integer): TFunctionItem;
    procedure IndexChecked(const index: Integer);
  public
    constructor Create(AOwner: TComponent); override;
    /// <summary>
    /// Adds an array of functions to the container
    /// </summary>
    procedure Add(const Values: TArray<TFunctionItem>);
    /// <summary>
    /// Clears the container
    /// </summary>
    procedure Clear;
    /// <summary>
    /// Deletes the function at position "index" in the container
    /// </summary>
    procedure Delete(index: Integer);
    /// <summary>
    /// Retrieves a function by its name in the container
    /// </summary>
    function FindByName(const SearchName: string; var Item: TFunctionItem): Boolean;
    /// <summary>
    /// The number of functions in the container
    /// </summary>
    property Count: Integer read FCount write FCount;
    /// <summary>
    /// Function at the index "index"
    /// </summary>
    property Items[index: integer]: TFunctionItem read GetItem;
  end;

  /// <summary>
  /// Method used for Streaming display
  /// </summary>
  TStreamDisplayer = reference to procedure(const S: string);

  IChatQuery = interface
    ['{1C1BE1A4-A9B6-4781-A0EA-D9169BB0027A}']
    function GetKey: string;
    function GetMaxToken: Integer;
    function GetModel: string;
    procedure SetKey(const Value: string);
    procedure SetMaxToken(const Value: Integer);
    procedure SetModel(const Value: string);
    /// <summary>
    /// Define the display method when the stream is invoked with the "Response" method
    /// </summary>
    procedure SetStreamDisplayer(const Value: TStreamDisplayer);
    /// <summary>
    /// Defines a query for "OpenAI.Chat", using a function or functions defined by name.
    /// All functions are defined beforehand in the "Functions" container.
    /// Use with models: gpt-3.5-turbo-0613, gpt-3.5-turbo-16k, gpt-4-0613
    /// </summary>
    function Response(const System: TBodyBuild; const User: string;
      const Function_Names: TArray<string> = []; Stream: Boolean = False): string; overload;
    /// <summary>
    /// Defines a query for "OpenAI.Chat"
    /// </summary>
    function Response(const System: TBodyBuild; const User: string;
      Stream: Boolean = False): string; overload;
    /// <summary>
    /// API user key
    /// </summary>
    property Key: string read GetKey write SetKey;
    /// <summary>
    /// The maximum number of tokens allowed for the generated answer. By default, the number of
    /// tokens the model can return will be 4096 - prompt tokens, 16384 with gpt-3.5-turbo-16k model
    /// or 8192 with gpt-4 and gpt-4-0613
    /// </summary>
    property MaxToken: Integer read GetMaxToken write SetMaxToken;
    /// <summary>
    /// ID of the model to use.
    /// </summary>
    property Model: string read GetModel write SetModel;
  end;

/// <summary>
/// Returns interface IChatQuery
/// </summary>
function IOpenAI_Chat: IChatQuery;

var
  Functions: TFunctions;

implementation

uses
  System.SysConst, System.StrUtils;

type
  TChatQuery = class(TInterfacedObject, IChatQuery)
  private
    FKey: string;
    FMaxToken: Integer;
    FModel: string;
    FFunctions: TFunctions;
    FStreamDisplayer: TStreamDisplayer;
    function GetKey: string;
    function GetMaxToken: Integer;
    function GetModel: string;
    procedure SetKey(const Value: string);
    procedure SetMaxToken(const Value: Integer);
    procedure SetModel(const Value: string);
    procedure SetFunctions(const Value: TFunctions);
  protected
    function Request(const System: TBodyBuild; const User: string;
      const Function_Names: TArray<string>): TChatResultData; overload;
    function Request(const System: TBodyBuild; const User: string;
      const DataResult: TChatResultData): string; overload;
    function StreamRequest(const System: TBodyBuild; const User: string;
      const Function_Names: TArray<string>): TChatResultData; overload;
    function StreamRequest(const System: TBodyBuild; const User: string;
      const DataResult: TChatResultData): string; overload;
    function ModelChecked: Boolean;
    procedure UpdateDisplay(const S: string);
    function FunctionBuilder(Function_Names: TArray<string>): TJSONArray;
  public
    constructor Create(const AFunctions: TFunctions);
    procedure SetStreamDisplayer(const Value: TStreamDisplayer);
    function Response(const System: TBodyBuild; const User: string;
      const Function_Names: TArray<string> = []; Stream: Boolean = False): string; overload;
    function Response(const System: TBodyBuild; const User: string;
      Stream: Boolean = False): string; overload;
    property Key: string read GetKey write SetKey;
    property MaxToken: Integer read GetMaxToken write SetMaxToken;
    property Model: string read GetModel write SetModel;
  end;

function IOpenAI_Chat: IChatQuery;
begin
  Result := TChatQuery.Create( Functions );
end;

{ TFunctionItem }

class function TFunctionItem.Add(const Name, Description: string;
  const ArgFunc: TBodyBuild; const AApiFunction: TApiFunction): TFunctionItem;
begin
  Result.FuncName := Name;
  Result.Description := Description;
  Result.Parameters := ArgFunc();
  Result.ApiFunction := AApiFunction;
end;

{ TFunctions }

procedure TFunctions.Add(const Values: TArray<TFunctionItem>);
begin
  for var Item in Values do begin
    Inc(FCount);
    SetLength(FItems, Count);
    FItems[Pred(Count)].FuncName := Item.FuncName;
    FItems[Pred(Count)].Description := Item.Description;
    FItems[Pred(Count)].Parameters := Item.Parameters;
    FItems[Pred(Count)].ApiFunction := Item.ApiFunction;
  end;
end;

procedure TFunctions.Clear;
begin
  FCount := 0;
  SetLength(FItems, FCount);
end;

constructor TFunctions.Create(AOwner: TComponent);
begin
  inherited Create(AOwner);
  FCount := 0;
end;

procedure TFunctions.Delete(index: Integer);
begin
  IndexChecked(index);
  {--- Shift the remaining items down one by one; a raw Move would corrupt
       the reference counts of the managed fields (strings, method references) }
  for var I := index to FCount - 2 do
    FItems[I] := FItems[I + 1];
  Dec(FCount);
  SetLength(FItems, FCount);
end;

function TFunctions.FindByName(const SearchName: string;
  var Item: TFunctionItem): Boolean;
begin
  Result := False;
  for var X in FItems do
   if X.FuncName = SearchName then begin
     Item := X;
     Result := True;
     Break
   end;
end;

function TFunctions.GetItem(index: integer): TFunctionItem;
begin
  IndexChecked(index);
  Result := FItems[index];
end;

procedure TFunctions.IndexChecked(const index: Integer);
begin
  if (index < 0) or (index >= Count) then
    raise Exception.Create(System.SysConst.SVarArrayBounds);
end;

{ TChatQuery }

constructor TChatQuery.Create(const AFunctions: TFunctions);
begin
  inherited Create;
  FFunctions := AFunctions;
  {--- Use compatible model with management of functions }
  FModel := 'gpt-3.5-turbo-0613';
end;

function TChatQuery.FunctionBuilder(Function_Names: TArray<string>): TJSONArray;
var
  Item: TFunctionItem;
  JSON: TJSONObject;
  Items: TJSONArray;

  procedure AddItem; begin
    JSON := TJSONObject.Create;
    JSON.AddPair('name', Item.FuncName);
    JSON.AddPair('description', Item.Description);
    JSON.AddPair('parameters', TJSONObject.ParseJSONValue(Item.Parameters));
    Items.Add(JSON);
  end;

begin
  Items := TJSONArray.Create;
  for var Name in Function_Names do
   if FFunctions.FindByName(Name, Item) then AddItem;
  Result := Items;
end; {FunctionBuilder}

function TChatQuery.GetKey: string;
begin
  Result := FKey;
end;

function TChatQuery.GetMaxToken: Integer;
begin
  Result := FMaxToken;
end;

function TChatQuery.GetModel: string;
begin
  Result := FModel;
end;

function TChatQuery.ModelChecked: Boolean;
begin
  Result := IndexStr(Model, ['gpt-3.5-turbo-0613', 'gpt-3.5-turbo-16k', 'gpt-4-0613']) >= 0;
end;

function TChatQuery.Request(const System: TBodyBuild; const User: string;
  const DataResult: TChatResultData): string;
var
  Item: TFunctionItem;
begin
  var OpenAI: IOpenAI := TOpenAI.Create(Key);
  {--- Evaluate internal or external api }
  var ApiValue := EmptyStr;
  if FFunctions.FindByName(DataResult.Function_call, Item) then
    with Item do
      if Assigned(FApiFunction) then ApiValue := FApiFunction(DataResult);
  {--- invoking openAI's Chat function }
  var Chat := OpenAI.Chat.Create(
      procedure(Params: TChatParams)
      begin
        Params.Model(Model);
        Params.Messages([
          TchatMessageBuild.System(System),
          TchatMessageBuild.User(User),
          TchatMessageBuild.AssistantFunc(DataResult.Function_call, DataResult.Arguments),
          TchatMessageBuild.Fonction(ApiValue, DataResult.Function_call)
        ]);
        if MaxToken > 0 then Params.MaxTokens(MaxToken);
      end);
  {--- Returns the ultimate answer }
  try
    for var Choice in Chat.Choices do
      Result := Choice.Message.Content;
  finally
    Chat.Free;
  end;
end;

function TChatQuery.Response(const System: TBodyBuild; const User: string;
  Stream: Boolean): string;
begin
  Result := Response(System, User, [], Stream);
end;

function TChatQuery.Response(const System: TBodyBuild; const User: string;
  const Function_Names: TArray<string>; Stream: Boolean): string;
begin
  case Stream of
    False : begin
      var DataResult := Request(System, User, Function_Names);
      if DataResult.isFunction_call
        then Result := Request(System, User, DataResult)
        else Result := DataResult.Content;
    end;
    else begin
      var DataResult := StreamRequest(System, User, Function_Names);
      if DataResult.isFunction_call
        then Result := StreamRequest(System, User, DataResult)
        else Result := DataResult.Content;
    end;
  end;
end;

function TChatQuery.Request(const System: TBodyBuild; const User: string;
  const Function_Names: TArray<string>): TChatResultData;
var
  Item: TFunctionItem;
begin
  var OpenAI: IOpenAI := TOpenAI.Create(Key);
  {--- Invoking openAI's Chat function }
  var Chat := OpenAI.Chat.Create(
      procedure(Params: TChatParams)
      begin
        Params.Model(Model);
        Params.Messages([
          TchatMessageBuild.System(System),
          TchatMessageBuild.User(User)
        ]);
        {--- Build the array of used functions }
        if ModelChecked then
          Params.Fonctions( FunctionBuilder(Function_Names) );
        if MaxToken > 0 then Params.MaxTokens(MaxToken);
      end);
  {--- Parsing the response - is there a function that needs to be invoked }
  try
    for var Choice in Chat.Choices do begin
      if ModelChecked then
      try
        Result.Function_call := Choice.Message.Function_call.Name;
        Result.Arguments := Choice.Message.Function_call.Arguments;
      except
      end;
      Result.Content := Choice.Message.Content;
      Result.FinishReason := Choice.FinishReason;
    end;
  finally
    Chat.Free;
  end;
end;

procedure TChatQuery.SetFunctions(const Value: TFunctions);
begin
  FFunctions := Value;
end;

procedure TChatQuery.SetKey(const Value: string);
begin
  FKey := Value;
end;

procedure TChatQuery.SetMaxToken(const Value: Integer);
begin
  FMaxToken := Value;
end;

procedure TChatQuery.SetModel(const Value: string);
begin
  FModel := Value;
end;

procedure TChatQuery.SetStreamDisplayer(const Value: TStreamDisplayer);
begin
  FStreamDisplayer := Value;
end;

function TChatQuery.StreamRequest(const System: TBodyBuild; const User: string;
  const DataResult: TChatResultData): string;
var
  Item: TFunctionItem;
  Content: string;
begin
  var OpenAI: IOpenAI := TOpenAI.Create(Key);
  {--- Evaluate internal or external api }
  var ApiValue := EmptyStr;
  if FFunctions.FindByName(DataResult.Function_call, Item) then
    if Assigned(Item.ApiFunction) then
      ApiValue := Item.ApiFunction(DataResult);
  {--- Invoking openAI's Chat function; the streamed content is accumulated
       so the ultimate answer can be returned }
  OpenAI.Chat.CreateStream(
      procedure(Params: TChatParams)
      begin
        Params.Model(Model);
        Params.Messages([
          TChatMessageBuild.System(System),
          TChatMessageBuild.User(User),
          TChatMessageBuild.AssistantFunc(DataResult.Function_call, DataResult.Arguments),
          TChatMessageBuild.Fonction(ApiValue, DataResult.Function_call)
        ]);
        if MaxToken > 0 then Params.MaxTokens(MaxToken);
        Params.Stream;
      end,
      procedure(Chat: TChat; IsDone: Boolean; var Cancel: Boolean)
      begin
        if (not IsDone) and Assigned(Chat) then
        begin
          Content := Content + Chat.Choices[0].Delta.Content;
          UpdateDisplay(Chat.Choices[0].Delta.Content);
        end
        else if IsDone then
          UpdateDisplay(#13);
      end);
  Result := Content;
end;

function TChatQuery.StreamRequest(const System: TBodyBuild; const User: string;
  const Function_Names: TArray<string>): TChatResultData;
var
  Function_call: string;
  Arguments: string;
  Content: string;
begin
  var OpenAI: IOpenAI := TOpenAI.Create(Key);
  {--- Invoking openAI's Chat function }
  OpenAI.Chat.CreateStream(
      procedure(Params: TChatParams)
      procedure(Params: TChatParams)
      begin
        Params.Model(Model);
        Params.Messages([
          TchatMessageBuild.System(System),
          TchatMessageBuild.User(User)
        ]);
        {--- Build the array of used functions }
        if ModelChecked then
          Params.Fonctions( FunctionBuilder(Function_Names) );
        if MaxToken > 0 then Params.MaxTokens(MaxToken);
        Params.Stream;
      end,
      procedure(Chat: TChat; IsDone: Boolean; var Cancel: Boolean)
      begin
        if (not IsDone) and Assigned(Chat) then begin
          Content := Content + Chat.Choices[0].Delta.Content;
          UpdateDisplay(Chat.Choices[0].Delta.Content);
          try
            if Function_call = EmptyStr then
              Function_call := Chat.Choices[0].Delta.Function_call.Name;
            Arguments := Arguments + Chat.Choices[0].Delta.Function_call.Arguments;
          except
          end;
        end
        else if IsDone then UpdateDisplay(#13);
      end);
  {--- Parsing the response - is there a function that needs to be invoked }
  Result.Content := Content;
  Result.Function_call := Function_call;
  Result.Arguments := Arguments;
  if Function_call <> EmptyStr
    then Result.FinishReason := 'function_call'
    else Result.FinishReason := 'stop';
end;

procedure TChatQuery.UpdateDisplay(const S: string);
begin
  if Assigned(FStreamDisplayer) then FStreamDisplayer(S);
end;

{ TChatResultData }

function TChatResultData.isFunction_call: Boolean;
begin
  Result := FinishReason = 'function_call';
end;

initialization
  Functions := TFunctions.Create(nil);
finalization
  Functions.Free;
end.

Now, to use the 0613 models, populate "Functions" as follows:

Functions.Add([
    TFunctionItem.Add(
      'get_current_weather',                           //function name
      'Get the current weather in a given location',   //function description
      function : string                                //function parameters (JSON schema)
      begin
        with TStringWriter.Create do
        try
          WriteLine('{');
          WriteLine('  "type": "object",');
          WriteLine('  "properties": {');
          WriteLine('    "location": {');
          WriteLine('      "type": "string",');
          WriteLine('      "description": "The city or region, e.g. Paris, Île-de-France"},');
          WriteLine('    "unit": {');
          WriteLine('      "type": "string",');
          WriteLine('      "enum": ["celsius", "fahrenheit"]}},');
          WriteLine('  "required": ["location"]');
          WriteLine('}');
          Result := ToString;
        finally
          Free;
        end;
      end,
      MeteoExecute),                                   //the function Api Code
    TFunctionItem.Add(
      'get_current_phonenumber',                       //function name
      'Get one of my phone numbers',                   //function description
      function : string                                //function parameters (JSON schema)
      begin
        with TStringWriter.Create do
        try
          WriteLine('{');
          WriteLine('  "type": "object",');
          WriteLine('  "properties": {');
          WriteLine('    "phone": {');
          WriteLine('      "type": "string",');
          WriteLine('      "description": "Kind of phone e.g. mobile"}},');
          WriteLine('  "required": ["phone"]');
          WriteLine('}');
          Result := ToString;
        finally
          Free;
        end;
      end,
      PhoneNumberExecute)                              //the function Api Code
    ]);
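
As an aside, the same parameter schema can be built with TJSONObject from System.JSON instead of hand-written strings, which catches malformed JSON at build time. A sketch, not part of the units above (the helper name WeatherParamsJSON is ours):

```pascal
uses
  System.JSON;

{--- Build the "get_current_weather" parameter schema programmatically;
     equivalent to the TStringWriter version above. }
function WeatherParamsJSON: string;
var
  Units: TJSONArray;
  Schema: TJSONObject;
begin
  Units := TJSONArray.Create;
  Units.Add('celsius');
  Units.Add('fahrenheit');
  Schema := TJSONObject.Create
    .AddPair('type', 'object')
    .AddPair('properties', TJSONObject.Create
      .AddPair('location', TJSONObject.Create
        .AddPair('type', 'string')
        .AddPair('description', 'The city or region, e.g. Paris, Île-de-France'))
      .AddPair('unit', TJSONObject.Create
        .AddPair('type', 'string')
        .AddPair('enum', Units)))
    .AddPair('required', TJSONArray.Create(TJSONString.Create('location')));
  try
    Result := Schema.ToJSON;
  finally
    Schema.Free;
  end;
end;
```

The resulting string can be returned from the TBodyBuild anonymous function passed to TFunctionItem.Add, exactly like the TStringWriter version.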

To run the functions locally, add the following code:

function TMainForm.MeteoExecute(ResultData: TChatResultData): string;
var
  City: string;
begin
  var X := TJsonObject.ParseJSONValue(ResultData.Arguments);
  try
    X.TryGetValue<string>('location', City);
    if City = 'Paris'
      then Result := '"temperature": "22", "unit": "celsius", "description": "sunny"'
      else Result := 'unknown weather for ' + City;
  finally
    X.Free;
  end;
end;

function TMainForm.PhoneNumberExecute(ResultData: TChatResultData): string;
var
  Phone: string;
begin
  var X := TJsonObject.ParseJSONValue(ResultData.Arguments);
  try
    X.TryGetValue<string>('phone', Phone);
    if Phone = 'mobile'
      then Result := '"number": "06 11 22 23 44"'
      else
    if Phone = 'home'
      then Result := '"number": "01 99 99 99 99"'
      else Result := 'no phone number for ' + Phone;
  finally
    X.Free;
  end;
end;

If streaming output is desired, add the method in charge of the display:

procedure TMainForm.UpdateDisplay(const S: string);
begin
  Memo1.Text := Memo1.Text + S;
  Application.ProcessMessages;
  Sleep(50);
end;

  1. Call the non-stream version, displaying the result in a memo:
  with IOpenAI_Chat do begin
     Key := MyKey;
     MaxToken := 1024;
     SetStreamDisplayer(UpdateDisplay);
     memo1.Lines.Add(
       Response(
          function : string                                           //System
          begin
            Result := 'The current time is ' + TimeToStr(Now) + '.';
          end,
          Memo1.Text,                                                 //User
          ['get_current_weather', 'get_current_phonenumber']));       //Array of functions names
   end;
  2. For a stream effect:
  with IOpenAI_Chat do begin
     Key := MyKey;
     MaxToken := 1024;
     Model := 'gpt-4-0613';
     SetStreamDisplayer(UpdateDisplay);
       Response(
          function : string                                           //System
          begin
            Result := 'The current time is '+TimeToStr(Now)+'.';
          end,
          Memo1.Text,                                                 //User
          ['get_current_weather', 'get_current_phonenumber'],         //Array of functions names
          True );                                                     //Stream mode
   end;
HemulGM commented 1 year ago

Ooh, I already did it. A little different, tested and built into the client.

MaxiDonkey commented 1 year ago

How does the code behave if several functions are called in your request? A bit like in my last example?

HemulGM commented 1 year ago

I'm still testing in the client. Added only one function for the weather so far (with real data from OWM). I've added an interface that provides a list of functions and their data to pass along with the request. Using interfaces will also allow you to use external libraries with functions directly (like a plugin). Refactored the code to include your changes.

MaxiDonkey commented 1 year ago

You must also be careful when using functions with a model that does not support them, for example gpt-3.5-turbo. I had to check the model before adding the functions to the query.

HemulGM commented 1 year ago

I know, thx :)

MaxiDonkey commented 1 year ago

Don't forget:

- gpt-3.5-turbo-16k (functions can be used), max tokens 16384
- functions work with gpt-4-0613 too, max tokens 8192
- gpt-3.5-turbo-0613 --> max tokens 8192
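
The constraints discussed in this thread can be summed up in a small guard, combining the model check with the token budgets quoted above (a sketch: the helper name is ours, and the limits are as stated in this thread, not verified against the API):

```pascal
uses
  System.StrUtils;

{--- Returns True when the model supports the "functions" parameter;
     MaxTokens reports the budget quoted in this thread. }
function SupportsFunctions(const Model: string; out MaxTokens: Integer): Boolean;
begin
  case IndexStr(Model, ['gpt-3.5-turbo-0613', 'gpt-3.5-turbo-16k', 'gpt-4-0613']) of
    0: MaxTokens := 8192;
    1: MaxTokens := 16384;
    2: MaxTokens := 8192;
  else
    MaxTokens := 0;
  end;
  Result := MaxTokens > 0;
end;
```

With such a guard, Params.Fonctions(...) is only added when SupportsFunctions returns True, mirroring the ModelChecked test in the unit above.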