MetaGLM / glm-cookbook

Examples and guides for using the GLM APIs
https://open.bigmodel.cn/
Apache License 2.0

How can GLM generate code and then execute it? #18

Closed ichat001 closed 5 months ago

ichat001 commented 5 months ago

Feature request

After reading the article linked below, I would like to build a similar feature where the AI generates code and then executes it: https://github.com/MetaGLM/glm-cookbook/blob/main/demo/glm_csv_data_analysis.ipynb

My code is below. When I run it, the generated code is never executed. Could you help me get this working?


from zhipuai import ZhipuAI
import json

client = ZhipuAI()

def execute_cleaned_code_from_string(code_string: str = ""):
    """Compile and run model-generated Python code, capturing anything it prints."""
    import io
    from contextlib import redirect_stdout

    output_buffer = io.StringIO()
    try:
        code_object = compile(code_string, '<string>', 'exec')
        with redirect_stdout(output_buffer):
            exec(code_object)
        # Return captured stdout, or a success message if the code printed nothing
        return output_buffer.getvalue() if output_buffer.getvalue() else "Code executed successfully!"
    except Exception as e:
        error = "Traceback: an error occurred: " + str(e)
        print(error)
        return error

def extract_function_and_execute(llm_output, messages):
    # Pull the tool call requested by the model and dispatch it to a local function
    name = llm_output.choices[0].message.tool_calls[0].function.name
    params = json.loads(llm_output.choices[0].message.tool_calls[0].function.arguments)
    tool_call_id = llm_output.choices[0].message.tool_calls[0].id
    function_to_call = globals().get(name)
    if not function_to_call:
        raise ValueError(f"Function '{name}' was not found")

    # Record the assistant's tool call, then the tool's result, in the conversation
    messages.append(llm_output.choices[0].message.model_dump())

    messages.append(
        {
            "role": "tool",
            "content": str(function_to_call(**params)),
            "tool_call_id": tool_call_id
        }
    )
    return messages

tools = [
    {
        "type": "function",
        "function": {
            "name": "execute_cleaned_code_from_string",
            "description": "python code execution tool",
            "parameters": {
                "type": "object",
                "properties": {
                    "code_string": {
                        "type": "string",
                        "description": "Python executable code",
                    },
                },
                "required": ["code_string"],
            },
        }
    }
]

sys_prompt = """
Your task: write an article according to the user's request. If the user also wants the article printed, first generate executable Python code that prints the article on the system's default printer.
I will provide you with a tool that executes Python code. You only need to write executable Python code as requested; the tool will run your code and return the result.
Finally, based on the print result, tell the user whether printing succeeded or failed.
"""

question = "Write me a short essay of about 60 characters on the arrival of spring, then print it out."
messages = [
    {
        "role": "system",
        "content": sys_prompt
    },
    {
        "role": "user",
        "content": question
    }
]
response = client.chat.completions.create(
    model='glm-4',
    messages=messages,
    tools=tools,
    top_p=0.1,
    temperature=0.1,
    max_tokens=2000,
)
print(response.choices[0].message.content)
# tool_calls = response.choices[0].message.tool_calls
# code_string = json.loads(tool_calls[0].function.arguments)['code_string']
# print(f"__Python code: {code_string}")  # the generated Python code

extract_function_and_execute(llm_output=response, messages=messages)
print(f"__Messages after the Python code ran: {messages}")  # conversation state after execution

Motivation

To try out AI code generation and execution.

Your contribution

zRzRzRzRzRzRzR commented 5 months ago

All code interpreters fundamentally work by executing the LLM-generated code in an external sandbox and returning the result as the content of the function call. This is an engineering concern and cannot be solved through the API alone.
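For illustration only, a minimal sketch of what such an external sandbox could look like on the application side: running the generated code in a separate Python subprocess with a timeout instead of calling exec() inside the host process. The helper name run_code_in_sandbox is hypothetical and not part of any SDK.

import subprocess
import sys

def run_code_in_sandbox(code_string: str, timeout: int = 10) -> str:
    # Run model-generated code in a separate interpreter process so a crash,
    # hang, or unsafe operation is isolated from the host application.
    try:
        result = subprocess.run(
            [sys.executable, "-c", code_string],
            capture_output=True,
            text=True,
            timeout=timeout,
        )
        return result.stdout if result.returncode == 0 else result.stderr
    except subprocess.TimeoutExpired:
        return "Execution timed out"

The string returned here would then be appended as the tool message, just as the return value of execute_cleaned_code_from_string is in the script above.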