rodion-m / ChatGPT_API_dotnet

OpenAI, Azure OpenAI and OpenRouter Chat Completions (ChatGPT) .NET integration with DI, persistence and streaming support
MIT License

How to use function calling and model config in your package? #12

Closed: walter-234 closed this issue 1 month ago

walter-234 commented 1 month ago

Hi @rodion-m, thanks for your recent updates, they help me a lot. However, I still have a question. I'm converting a configuration from Python code to C#. The Python developer gave me the config below, but I've only been able to apply part of it. Can you help me check whether I can configure the FUNCTION and the remaining MODEL CONFIG parameters below?

PROMPT:
{hidden} **_Config Done_**

FUNCTION: (Need help)
{
  "type": "object",
  "title": "classify task",
  "required": [
    "index"
  ],
  "properties": {
    "index": {
      "type": "integer",
      "description": "description task of index"
    }
  },
  "description": "description task"
}

MODEL CONFIG: (Need help)
Model: gpt-4o-mini **_Config Done_**
Temperature: 0.2 **_Config Done_**
Top P: 0.1
Presence Penalty: 0
Frequency Penalty: 0
rodion-m commented 1 month ago

Unfortunately, the Function Calling feature is not implemented in this library. For C#, it's implemented in Semantic Kernel and in https://github.com/tryAGI/LangChain
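
For reference, if you end up calling the OpenAI Chat Completions HTTP endpoint directly (outside this library), your schema is sent wrapped in a `tools` entry. Here is a rough sketch using only `HttpClient` and `System.Text.Json`; the tool name `classify_task` and the prompt text are assumptions I made up from your schema's title, not anything this library provides:

```csharp
// Sketch: send the JSON Schema from the question as a tool definition
// straight to the OpenAI Chat Completions API (no library involved).
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;

var apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");
using var http = new HttpClient();
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiKey);

var payload = new
{
    model = "gpt-4o-mini",
    messages = new[] { new { role = "user", content = "Classify this task: ..." } }, // placeholder prompt
    tools = new object[]
    {
        new
        {
            type = "function",
            function = new
            {
                name = "classify_task", // assumed name, derived from the schema title
                description = "description task",
                parameters = new
                {
                    type = "object",
                    required = new[] { "index" },
                    properties = new
                    {
                        index = new { type = "integer", description = "description task of index" }
                    }
                }
            }
        }
    }
};

using var content = new StringContent(
    JsonSerializer.Serialize(payload), Encoding.UTF8, "application/json");
var response = await http.PostAsync("https://api.openai.com/v1/chat/completions", content);
Console.WriteLine(await response.Content.ReadAsStringAsync());
```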

As for MODEL CONFIG, you can modify it as you wish using the requestModifier parameter (in GetChatCompletions, for example).
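
A minimal sketch of that idea, assuming the modifier receives the chat completion request object. The client construction, the message helper, and the property names (`Temperature`, `TopP`, `PresencePenalty`, `FrequencyPenalty`) are assumptions modeled on the standard OpenAI parameters, so please check the library's actual request type; only `GetChatCompletions` and `requestModifier` come from the comment above:

```csharp
using OpenAI.ChatGpt; // namespace assumed; adjust to the actual package

// Assumed client type and constructor.
var client = new OpenAiClient(Environment.GetEnvironmentVariable("OPENAI_API_KEY")!);

string answer = await client.GetChatCompletions(
    new UserMessage("Classify this task: ..."), // assumed message type and placeholder prompt
    model: "gpt-4o-mini",
    requestModifier: request =>
    {
        request.Temperature = 0.2f;      // assumed property name
        request.TopP = 0.1f;             // assumed property name
        request.PresencePenalty = 0f;    // assumed property name
        request.FrequencyPenalty = 0f;   // assumed property name
    });

Console.WriteLine(answer);
```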

walter-234 commented 1 month ago

Thank you so much. Closing now.