microsoft / FLAML

A fast library for AutoML and tuning. Join our Discord: https://discord.gg/Cppx2vSPVP.
https://microsoft.github.io/FLAML/
MIT License

AutoGen - Unable to run my own functions #1201

Closed: SlistInc closed this issue 1 year ago

SlistInc commented 1 year ago

My objective:

Run my own Python function through AutoGen's function calling, using a locally hosted model instead of the OpenAI API.

My code:

from flaml import autogen

llm_config={
    "model": "gpt4",
    #"model": "llama-7B",
    "api_base": "http://localhost:1234/v1",
    "api_type": "open_ai",
    "api_key": "NULL",
    "max_tokens": 1000,
    "request_timeout": 1000,
    "seed": 42,
    "temperature": 0,
    "max_round": 20,
    "use_cache": False,
}
llm_config["functions"] = [
        {
            "name": "create_test_file",
            "description": "Create a file and write the word 'test' in it.",
            "parameters": {
                "type": "object",
                "properties": {
                    "filename": {
                        "type": "string",
                        "description": "A filename.",
                    },
                },
            },
        }
    ]

def create_test_file(filename="output"):
    content = "Test"
    print(f"DEBUG: creating {filename} with content: {content}")
    try:
        # Note: appends ".txt" unconditionally, so a name that already
        # ends in ".txt" becomes "name.txt.txt".
        filename = filename + ".txt"
        with open(filename, "w") as f:  # the with block closes the file
            f.write(content)
    except Exception as err:
        return f"Something went wrong. Test was NOT written to the file. I got this error back: {type(err)=}\n{err=}"
    return f"I confirm the word Test was successfully written to file {filename}."

chatbot = autogen.AssistantAgent(
    name="chatbot",
    system_message="Only use the functions you have been provided with. Do not ask user to perform other actions than executing the functions. Reply TERMINATE when the task is done.",
    llm_config=llm_config,
)

user_proxy = autogen.UserProxyAgent(
    "user_proxy",
    max_consecutive_auto_reply=2,
    human_input_mode="NEVER",
    function_map={"create_test_file": create_test_file},
)

user_proxy.initiate_chat(
    chatbot,
    message="Write the word 'Test' to a file.",
)
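
Side note on the schema above: it declares no "required" parameters, so the model may call create_test_file with no arguments at all, and the Python default ("output") silently kicks in. If the filename should always come from the model, JSON Schema's "required" field can express that; a minimal, hypothetical variant of the assignment above:

# Hypothetical variant of the schema above: "required" forces the model
# to supply "filename" instead of relying on the Python default.
llm_config["functions"] = [
    {
        "name": "create_test_file",
        "description": "Create a file and write the word 'test' in it.",
        "parameters": {
            "type": "object",
            "properties": {
                "filename": {"type": "string", "description": "A filename."},
            },
            "required": ["filename"],
        },
    }
]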

Expected outcome:

Observed outcome:

My question:

yiranwu0 commented 1 year ago

Hello, can you try this out? I modified the llm_config, and I am using the gpt-3.5 model. Remember to put your key in "key_openai.txt", or pass the key in through some other means.

I am not sure you can use local models to achieve similar "function calling" behavior; it depends on the capability of your model. OpenAI specifically fine-tuned its models to support function calls: https://openai.com/blog/function-calling-and-other-api-updates.
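
For reference, this is roughly what a function call looks like in a response from a fine-tuned model (shape of the 2023 OpenAI "functions" API; the function name here is just the one from your example):

# Rough shape of an assistant message that requests a function call
# (OpenAI "functions" API, mid-2023). Note "arguments" arrives as a JSON
# string, not a dict. A model that only emits plain text has no way to
# produce this structured field, which is why the fine-tuning matters.
{
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "create_test_file",
        "arguments": '{"filename": "test_file"}',
    },
}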

from flaml import autogen
from flaml.autogen import oai
import os
os.environ["OPENAI_API_KEY"] = open("key_openai.txt", "r").read().strip()
config_list = oai.config_list_gpt4_gpt35()

# create an AssistantAgent named "assistant"
llm_config={
    "model": "gpt-3.5",
    "use_cache": False,
    "config_list": config_list,  # a list of OpenAI API configurations
    "temperature": 0,  # temperature for sampling
    # "seed": 42,  # no need for seed if use_cache is False, do not set seed and use_cache=False at the same time.
    }

llm_config["functions"] = [
        {
            "name": "create_test_file",
            "description": "Create a file and write the word 'test' in it.",
            "parameters": {
                "type": "object",
                "properties": {
                    "filename": {
                        "type": "string",
                        "description": "A filename.",
                    },
                },
            },
        }
    ]

def create_test_file(filename="output"):
    content = "Test"
    print(f"DEBUG: creating {filename} with content: {content}")
    try:
        # Note: appends ".txt" unconditionally, so a name that already
        # ends in ".txt" becomes "name.txt.txt".
        filename = filename + ".txt"
        with open(filename, "w") as f:  # the with block closes the file
            f.write(content)
    except Exception as err:
        return f"Something went wrong. Test was NOT written to the file. I got this error back: {type(err)=}\n{err=}"
    return f"I confirm the word Test was successfully written to file {filename}."

chatbot = autogen.AssistantAgent(
    name="chatbot",
    system_message="Only use the functions you have been provided with. Do not ask user to perform other actions than executing the functions. Reply TERMINATE when the task is done.",
    llm_config=llm_config,
)

user_proxy = autogen.UserProxyAgent(
    "user_proxy",
    max_consecutive_auto_reply=2,
    human_input_mode="NEVER",
    function_map={"create_test_file": create_test_file},
)

user_proxy.initiate_chat(
    chatbot,
    message="Write the word 'Test' to a file.",
)

Here is the result:

user_proxy (to chatbot):

Write the word 'Test' to a file.

--------------------------------------------------------------------------------
chatbot (to user_proxy):

***** Suggested function Call: create_test_file *****
Arguments: 
{
  "filename": "test_file.txt"
}
*****************************************************

--------------------------------------------------------------------------------

>>>>>>>> EXECUTING FUNCTION create_test_file...
DEBUG: creating test_file.txt with content: Test
user_proxy (to chatbot):

***** Response from calling function "create_test_file" *****
I confirm the word Test was successfully written to file test_file.txt.txt.
*************************************************************

--------------------------------------------------------------------------------
chatbot (to user_proxy):

TERMINATE

--------------------------------------------------------------------------------
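
Note the "test_file.txt.txt" in the function response: the model already supplied a name ending in .txt, and the function appended another suffix. A small guard avoids the doubling; normalize_txt_name below is just a hypothetical helper, not part of AutoGen:

# Hypothetical guard for create_test_file: only append ".txt" when the
# model's argument does not already end with it.
def normalize_txt_name(filename: str) -> str:
    return filename if filename.endswith(".txt") else filename + ".txt"

print(normalize_txt_name("test_file.txt"))  # -> test_file.txt (no doubling)
print(normalize_txt_name("output"))         # -> output.txt
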
SlistInc commented 1 year ago

Thanks for the feedback. Ah, that's the issue then. My setup is built around local models, as I don't want to use OpenAI. I will try some local models optimised for function calling... or some special prompts (see the sketch below). Otherwise, I guess function calling with local agents won't work out of the box.
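
The "special prompts" idea, as a rough sketch (everything here is hypothetical, not an AutoGen API; it reuses the create_test_file function from the code above): describe the function in the system prompt, ask the local model to reply with a single JSON object, and dispatch whatever comes back through the same function_map idea.

import json
import re

# Hypothetical emulation of function calling for models without native
# support. SYSTEM_PROMPT would be sent as the system message to the local
# model (transport not shown); the model is instructed to reply with a
# single JSON object like {"function": "...", "arguments": {...}}.
SYSTEM_PROMPT = (
    "You can call exactly one tool: create_test_file(filename: str). "
    'Reply ONLY with a single JSON object: '
    '{"function": "create_test_file", "arguments": {"filename": "<name>"}}'
)

function_map = {"create_test_file": create_test_file}

def dispatch(reply: str) -> str:
    """Extract the first JSON object from the model's reply and run it."""
    match = re.search(r"\{.*\}", reply, re.DOTALL)
    if match is None:
        return "No function call found in the model's reply."
    call = json.loads(match.group())
    func = function_map.get(call.get("function"))
    if func is None:
        return f"Unknown function: {call.get('function')}"
    return func(**call.get("arguments", {}))

# e.g. with a well-behaved local model:
print(dispatch('{"function": "create_test_file", "arguments": {"filename": "demo"}}'))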

Edit: closed, since this is not a real bug in the library and @kevin666aa gave good feedback.

yiranwu0 commented 1 year ago

You are welcome!