kubeagi / arcadia

A diverse, simple, and secure one-stop LLMOps platform
http://www.kubeagi.com/
Apache License 2.0

feat: Integrate with agent/tool agent from go-llm #896

Open nkwangleiGIT opened 5 months ago

nkwangleiGIT commented 5 months ago

Check if any assets we can reuse from https://github.com/natexcvi/go-llm.

Abirdcfly commented 5 months ago

It's a pretty creative project, but only OpenAI's GPT chat completion API is supported, and it uses model-native function calls. The doc is https://platform.openai.com/docs/api-reference/chat/create#chat-create-function_call and one example is:

curl https://api.openai.com/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $OPENAI_API_KEY" \
-d '{
  "model": "gpt-3.5-turbo",
  "messages": [
    {
      "role": "user",
      "content": "What is the weather like in Boston?"
    }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {
              "type": "string",
              "description": "The city and state, e.g. San Francisco, CA"
            },
            "unit": {
              "type": "string",
              "enum": ["celsius", "fahrenheit"]
            }
          },
          "required": ["location"]
        }
      }
    }
  ],
  "tool_choice": "auto"
}'

the output will be an assistant message whose tool_calls field carries the function call instead of text content, roughly like this (abridged from the OpenAI docs):

{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "model": "gpt-3.5-turbo",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": null,
        "tool_calls": [
          {
            "id": "call_abc123",
            "type": "function",
            "function": {
              "name": "get_current_weather",
              "arguments": "{\"location\": \"Boston, MA\"}"
            }
          }
        ]
      },
      "finish_reason": "tool_calls"
    }
  ]
}
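Whichever backend serves this API, the client side has to pull the function name and arguments out of the response's tool_calls field. A minimal Go sketch of that step (the struct types here are a hand-written subset of the OpenAI response schema, not the full type, and extractToolCall is a hypothetical helper):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// toolCall models one entry of message.tool_calls; Arguments is itself a
// JSON-encoded string per the OpenAI schema.
type toolCall struct {
	ID       string `json:"id"`
	Type     string `json:"type"`
	Function struct {
		Name      string `json:"name"`
		Arguments string `json:"arguments"`
	} `json:"function"`
}

// chatResponse keeps only the fields we need from the completion response.
type chatResponse struct {
	Choices []struct {
		Message struct {
			Role      string     `json:"role"`
			Content   *string    `json:"content"`
			ToolCalls []toolCall `json:"tool_calls"`
		} `json:"message"`
		FinishReason string `json:"finish_reason"`
	} `json:"choices"`
}

// extractToolCall returns the function name and decoded arguments of the
// first tool call, or an error if the response carries none.
func extractToolCall(raw []byte) (string, map[string]any, error) {
	var resp chatResponse
	if err := json.Unmarshal(raw, &resp); err != nil {
		return "", nil, err
	}
	if len(resp.Choices) == 0 || len(resp.Choices[0].Message.ToolCalls) == 0 {
		return "", nil, fmt.Errorf("no tool calls in response")
	}
	tc := resp.Choices[0].Message.ToolCalls[0]
	var args map[string]any
	if err := json.Unmarshal([]byte(tc.Function.Arguments), &args); err != nil {
		return "", nil, err
	}
	return tc.Function.Name, args, nil
}

func main() {
	// Hardcoded sample shaped like the response above.
	sample := []byte(`{"choices":[{"message":{"role":"assistant","content":null,
	  "tool_calls":[{"id":"call_abc123","type":"function",
	  "function":{"name":"get_current_weather","arguments":"{\"location\":\"Boston, MA\"}"}}]},
	  "finish_reason":"tool_calls"}]}`)
	name, args, err := extractToolCall(sample)
	if err != nil {
		panic(err)
	}
	fmt.Println(name, args["location"])
}
```

After executing the named function locally, the caller would append its result as a "tool"-role message and ask the model again; that loop is what a backend like FastChat would have to support end to end.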

Per lm-sys/FastChat #2214, FastChat does not support it yet. ZhipuAI's v4 API does support function_call.

Abirdcfly commented 5 months ago

Currently only the ChatGLM3-6B model supports tool invocation; the ChatGLM3-6B-Base and ChatGLM3-6B-32K models do not. https://github.com/THUDM/ChatGLM3/blob/main/tools_using_demo/README.md

Abirdcfly commented 5 months ago

Qwen-Chat takes the same approach we currently do: it uses the ReAct method and writes the tools into the prompt: https://github.com/QwenLM/Qwen/blob/main/examples/react_prompt.md

From their docs: "Based on this principle, we provide Function Calling support in openai_api.py."

If we want FastChat to expose the OpenAI-standard function_call method too, it feels like we could do it the same way.