microsoft / autogen

A programming framework for agentic AI 🤖
https://microsoft.github.io/autogen/
Creative Commons Attribution 4.0 International

[Bug] LLM returned a function response, but userProxyAgent didn't exec function #1065

Open lucasjinreal opened 9 months ago

lucasjinreal commented 9 months ago
[screenshot attached]

Very weird: the returned message has role `function`, but the UserProxyAgent just sends the JSON output back to the LLM instead of executing the function.

import autogen

config_list = [
    {
        "model": "qwen-v2",
        "base_url": "http://127.0.0.1:8082/v1",
        "api_key": "NULL",
        "functions": [
            {
                "name": "exchange_rate",
                "description": "for exchange rate",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "base_amount": {
                            "type": "number",  # JSON Schema has no "float" type
                            "description": "amount to calculate",
                        }
                    },
                },
            },
            {
                "name": "search_google",
                "description": "Search results by keyword",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "keyword": {
                            "type": "string",
                            "description": "The keyword that's used to search",
                        }
                    },
                    "required": ["keyword"],
                },
            },
        ],
    },
]

llm_config = {
    "timeout": 600,
    "cache_seed": 42,
    "config_list": config_list,
    "temperature": 0,
}
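One thing worth checking (an assumption on my part, not confirmed by the thread): AutoGen forwards a `functions` key from `llm_config` to the chat-completion request, whereas `config_list` entries normally hold only connection settings. If the schemas sit inside a `config_list` entry, they may never reach the model. A minimal sketch of the relocated layout, reusing the names above:

```python
# Sketch (assumption): keep only connection settings in config_list and
# move the tool schemas into llm_config, where AutoGen reads "functions".
config_list = [
    {"model": "qwen-v2", "base_url": "http://127.0.0.1:8082/v1", "api_key": "NULL"}
]

llm_config = {
    "timeout": 600,
    "cache_seed": 42,
    "config_list": config_list,
    "temperature": 0,
    "functions": [...],  # the exchange_rate / search_google schemas from above
}
```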

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config=llm_config,
    system_message="You are a helpful agent that can select the right tools to fulfill the user's request. Do not respond with anything except the JSON-format response.",
)
# create a UserProxyAgent instance named "user_proxy"
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    # human_input_mode="TERMINATE",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=10,
    is_termination_msg=lambda x: x.get("content", "").rstrip().endswith("TERMINATE"),
    code_execution_config={"work_dir": "web"},
    llm_config=llm_config,
    system_message="""Reply TERMINATE if the task has been solved at full satisfaction.
Otherwise, reply CONTINUE, or the reason why the task is not solved yet.""",
    function_map={
        "exchange_rate": currency_calculator,
        "search_google": search_google_news,
    },
)
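As background for the behavior in the screenshot: the proxy dispatches to `function_map` only when the assistant's reply carries an OpenAI-style `function_call` field; JSON that arrives as ordinary `content` is treated as plain text. A simplified sketch of that dispatch condition (my illustration, not AutoGen's actual code):

```python
# Simplified sketch: a mapped function fires only when the message has a
# "function_call" field in OpenAI's format, not when JSON sits in "content".
def should_execute(message: dict) -> bool:
    call = message.get("function_call")
    return bool(call) and isinstance(call, dict) and "name" in call

openai_style = {
    "role": "assistant",
    "content": None,
    "function_call": {"name": "search_google", "arguments": '{"keyword": "Musk"}'},
}
plain_json = {
    "role": "assistant",
    "content": '{"name": "search_google", "arguments": {"keyword": "Musk"}}',
}

print(should_execute(openai_style))  # True: the proxy would execute the call
print(should_execute(plain_json))    # False: treated as ordinary text
```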

# user_proxy.register_function(function_map=)

print(assistant.llm_config)

user_proxy.initiate_chat(
    assistant,
    message="Summarize today's news headlines about Elon Musk and send me a report in markdown format",
)
rickyloynd-microsoft commented 9 months ago

@kevin666aa

thinkall commented 3 months ago

Looks like a failure because Qwen's function-call format is not fully compatible with OpenAI's.
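If the incompatibility is that the Qwen server returns the tool call as a JSON string in `content` rather than as a `function_call` field, one possible workaround (a hypothetical sketch, not an AutoGen API) is to normalize such replies before the proxy sees them:

```python
import json

# Hypothetical workaround sketch: if the server returns the tool call as a
# JSON string in "content", rewrite it into OpenAI's "function_call" shape
# so the proxy's function_map can fire.
def normalize_reply(message: dict) -> dict:
    content = message.get("content")
    if message.get("function_call") or not isinstance(content, str):
        return message  # already well-formed, or nothing to parse
    try:
        payload = json.loads(content)
    except ValueError:
        return message  # content is ordinary text, leave it alone
    if isinstance(payload, dict) and "name" in payload:
        args = payload.get("arguments", {})
        if not isinstance(args, str):
            args = json.dumps(args)  # OpenAI sends arguments as a JSON string
        return {
            "role": "assistant",
            "content": None,
            "function_call": {"name": payload["name"], "arguments": args},
        }
    return message
```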