THUDM / ChatGLM3

ChatGLM3 series: Open Bilingual Chat LLMs | 开源双语对话语言模型
Apache License 2.0

Question about function calls #1036

Closed Smile-L-up closed 7 months ago

Smile-L-up commented 7 months ago

System Info / 系統信息

Windows (the Gradio demo doesn't run; the others seem fine), CUDA 12.3, Python 3.10

Who can help? / 谁可以帮助到您?

No response

Information / 问题信息

Reproduction / 复现过程

1. Run `api_server.py`.
2. Run `openai_api_request.py`, executing only the function call, with a customized `get_current_weather` function.

"""
This script is an example of using the OpenAI API to create various interactions with a ChatGLM3 model.
It includes functions to:

1. Conduct a basic chat session, asking about weather conditions in multiple cities.
2. Initiate a simple chat in Chinese, asking the model to tell a short story.
3. Retrieve and print embeddings for a given text input.

Each function demonstrates a different aspect of the API's capabilities, showcasing how to make requests
and handle responses.
"""

from openai import OpenAI

base_url = "http://127.0.0.1:8000/v1/"
client = OpenAI(api_key="EMPTY", base_url=base_url)

def get_current_weather(location, unit):
    # In this example, return a fixed temperature based on location and unit.
    # Normalize "San Francisco, CA" and "San Francisco" to the same key, since
    # the model may pass either form.
    city = location.split(',')[0].strip()
    if city == 'San Francisco' and unit == 'celsius':
        return '10 C'
    elif city == 'San Francisco' and unit == 'fahrenheit':
        return '50 F'
    elif city == 'Tokyo' and unit == 'celsius':
        return '20 C'
    elif city == 'Tokyo' and unit == 'fahrenheit':
        return '68 F'
    elif city == 'Paris' and unit == 'celsius':
        return '15 C'
    elif city == 'Paris' and unit == 'fahrenheit':
        return '59 F'
    else:
        return 'Location or unit not supported'

def function_chat():
    messages = [{"role": "user", "content": "What's the weather like in San Francisco, Tokyo, and Paris?"}]
    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather in a given location",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "The city and state, e.g. San Francisco, CA",
                        },
                        "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                    },
                    "required": ["location"],
                },
            },
        }
    ]

    response = client.chat.completions.create(
        model="chatglm3-6b",
        messages=messages,
        tools=tools,
        tool_choice="auto",
    )
    if response:
        content = response.choices[0].message.content
        print(content)
    else:
        # The OpenAI client raises on HTTP errors, so there is no
        # status_code attribute here; just report the empty response.
        print("Error: empty response")

def simple_chat(use_stream=True):
    messages = [
        {
            "role": "system",
            "content": "You are ChatGLM3, a large language model trained by Zhipu.AI. Follow the user's "
                       "instructions carefully. Respond using markdown.",
        },
        {
            "role": "user",
            "content": "你好,请你用生动的话语给我讲一个小故事吧"
        }
    ]
    response = client.chat.completions.create(
        model="chatglm3-6b",
        messages=messages,
        stream=use_stream,
        max_tokens=256,
        temperature=0.8,
        presence_penalty=1.1,
        top_p=0.8)
    if response:
        if use_stream:
            for chunk in response:
                # The final streamed chunk may carry no delta content.
                if chunk.choices[0].delta.content:
                    print(chunk.choices[0].delta.content)
        else:
            content = response.choices[0].message.content
            print(content)
    else:
        print("Error: empty response")

def embedding():
    response = client.embeddings.create(
        model="bge-large-zh-1.5",
        input=["你好,给我讲一个故事,大概100字"],
    )
    embeddings = response.data[0].embedding
    print("Embedding complete, dimension:", len(embeddings))

if __name__ == "__main__":
    # simple_chat(use_stream=False)
    # simple_chat(use_stream=True)
    # embedding()
    function_chat()

Output

```
get_current_weather
tool_call(location='San Francisco, CA', unit='celsius')
```
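The output shows that ChatGLM3 returns the function name on one line and a pseudo-Python `tool_call(...)` expression, rather than a structured OpenAI-style `tool_calls` object. A minimal sketch of turning that expression into keyword arguments (the `parse_tool_call` helper is hypothetical, not part of the repo):

```python
import ast

def parse_tool_call(expr: str) -> dict:
    """Turn "tool_call(location='Tokyo', unit='celsius')" into a kwargs dict."""
    call = ast.parse(expr.strip(), mode="eval").body
    if not isinstance(call, ast.Call):
        raise ValueError(f"not a call expression: {expr!r}")
    # literal_eval accepts AST nodes, so each keyword value is parsed safely
    # without executing arbitrary code.
    return {kw.arg: ast.literal_eval(kw.value) for kw in call.keywords}

kwargs = parse_tool_call("tool_call(location='San Francisco, CA', unit='celsius')")
print(kwargs)  # {'location': 'San Francisco, CA', 'unit': 'celsius'}
```

The resulting dict can then be passed straight to `get_current_weather(**kwargs)`.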

Expected behavior / 期待表现

Roughly expected output: The temperature in San Francisco, CA is 10 C.
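To get an answer like that, the tool call has to be executed client-side and its result sent back in a second chat completion request; the script above stops after printing the model's tool call. A hedged sketch of building the follow-up conversation; the `tool` role name follows the OpenAI schema and is an assumption here (some ChatGLM3 server versions expect `observation` instead):

```python
def build_followup(messages, tool_name, tool_result):
    """Append the assistant's tool call and its locally computed result,
    producing a message list ready for a second completions request."""
    return messages + [
        {"role": "assistant", "content": tool_name},
        # Role name is an assumption; adjust to what your server expects.
        {"role": "tool", "content": tool_result},
    ]

messages = [{"role": "user", "content": "What's the weather like in San Francisco?"}]
followup = build_followup(messages, "get_current_weather", "10 C")
print(followup[-1])  # {'role': 'tool', 'content': '10 C'}

# Second request (requires the running api_server.py), sketched only:
# final = client.chat.completions.create(model="chatglm3-6b", messages=followup)
# print(final.choices[0].message.content)
```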

Looking forward to your reply.