cpacker / MemGPT

Create LLM agents with long-term memory and custom tools 📚🦙
https://memgpt.readme.io
Apache License 2.0

KeyError: 'preset' in memgpt_agent.py (memgpt+autogen) #500

Closed hherpa closed 9 months ago

hherpa commented 9 months ago

I am running autogen with multiple LLMs and memgpt in Google Colaboratory, and I get this error:

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
[<ipython-input-13-2188bc5300b4>](https://localhost:8080/#) in <cell line: 55>()
     71         )
     72     else:
---> 73         coder = create_memgpt_autogen_agent_from_config(
     74             "MemGPT_coder",
     75             llm_config=llm_config_memgpt,

[/content/MemGPT/memgpt/autogen/memgpt_agent.py](https://localhost:8080/#) in create_memgpt_autogen_agent_from_config(name, system_message, is_termination_msg, max_consecutive_auto_reply, human_input_mode, function_map, code_execution_config, llm_config, nonmemgpt_llm_config, default_auto_reply, interface_kwargs)
     55         persona=persona_desc,
     56         human=user_desc,
---> 57         preset=llm_config["preset"],
     58         model=llm_config["model"],
     59         model_wrapper=llm_config["model_wrapper"],

KeyError: 'preset'

Before running my memgpt/autogen script, I run the following commands:

!pip install pyautogen
!git clone https://github.com/cpacker/MemGPT.git
cd MemGPT
!pip install -e .
!export OPENAI_API_BASE=https://comics-funny-relief-influence.trycloudflare.com/  # my wizardcoder API endpoint is here; I generate it each time I launch autogen
!export BACKEND_TYPE=webui
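
As a side note, in Colab each `!` command runs in its own throwaway subshell, so `!export VAR=...` does not persist for later cells. Setting the variables from Python (a sketch using the same values as above) is more reliable:

```python
import os

# Each `!export ...` line runs in a subshell that exits immediately,
# so the variables never reach the notebook's own process. Setting
# them via os.environ persists for the whole session.
os.environ["OPENAI_API_BASE"] = "https://comics-funny-relief-influence.trycloudflare.com/"
os.environ["BACKEND_TYPE"] = "webui"
```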

and here is my autogen/memgpt script:

import os
import autogen
import openai

from MemGPT.memgpt.autogen.memgpt_agent import create_autogen_memgpt_agent, create_memgpt_autogen_agent_from_config

# This config is for autogen agents that are powered by MemGPT
config_list_memgpt = [
    {
        "model": "gpt-4",
    },
]

config_list = [
    {
        "model": "mistral-7b",
        "api_base": "https://stranger-journals-opponent-uw.trycloudflare.com/v1",
        "api_key": "NULL",  # this is a placeholder
        "api_type": "open_ai",
    },
]

USE_MEMGPT = True
USE_AUTOGEN_WORKFLOW = True
DEBUG = False

interface_kwargs = {
    "debug": DEBUG,
    "show_inner_thoughts": DEBUG,
    "show_function_outputs": DEBUG,
}

llm_config = {"config_list": config_list, "seed": 42}
llm_config_memgpt = {"config_list": config_list_memgpt, "seed": 42}

# The user agent
user_proxy = autogen.UserProxyAgent(
    name="User_proxy",
    system_message="A human admin.",
    code_execution_config={"last_n_messages": 2, "work_dir": "groupchat"},
    human_input_mode="TERMINATE",
    default_auto_reply="...",
)

# The agent playing the role of the product manager (PM)
pm = autogen.AssistantAgent(
    name="Product_manager",
    system_message="Creative in software product ideas.",
    llm_config=llm_config,
    default_auto_reply="...",
)

if not USE_MEMGPT:
    coder = autogen.AssistantAgent(
        name="Coder",
        llm_config=llm_config,
    )

else:
    if not USE_AUTOGEN_WORKFLOW:
        coder = create_autogen_memgpt_agent(
            "MemGPT_coder",
            persona_description="I am a 10x engineer, trained in Python. I was the first engineer at Uber "
            "(which I make sure to tell everyone I work with).",
            user_description=f"You are participating in a group chat with a user ({user_proxy.name}) "
            f"and a product manager ({pm.name}).",
            model=config_list_memgpt[0]["model"],
            interface_kwargs=interface_kwargs,
        )
    else:
        coder = create_memgpt_autogen_agent_from_config(
            "MemGPT_coder",
            llm_config=llm_config_memgpt,
            system_message=f"I am a 10x engineer, trained in Python. I was the first engineer at Uber "
            f"(which I make sure to tell everyone I work with).\n"
            f"You are participating in a group chat with a user ({user_proxy.name}) "
            f"and a product manager ({pm.name}).",
            interface_kwargs=interface_kwargs,
        )

groupchat = autogen.GroupChat(agents=[user_proxy, pm, coder], messages=[], max_round=12)
manager = autogen.GroupChatManager(groupchat=groupchat, llm_config=llm_config)

user_proxy.initiate_chat(
    manager,
    message="I want to design an app to make me one million dollars in one month. " "Yes, you heard that right.",
)

It is this code that raises KeyError: 'preset' in the file memgpt_agent.py, specifically in this part:

agent_config = AgentConfig(
        name=name,
        persona=persona_desc,
        human=user_desc,
        preset=llm_config["preset"],
        model=llm_config["model"],
        model_wrapper=llm_config["model_wrapper"],
        model_endpoint_type=llm_config["model_endpoint_type"],
        model_endpoint=llm_config["model_endpoint"],
        context_window=llm_config["context_window"],
    )
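
Since `create_memgpt_autogen_agent_from_config` reads each of these fields straight out of `llm_config` (i.e. the first entry of `config_list_memgpt`), one way to avoid the `KeyError` is to supply every key it looks up. A sketch with placeholder endpoint values; the key names are taken from the traceback, and the preset value can also be imported as `from memgpt.presets.presets import DEFAULT_PRESET`:

```python
# All keys that AgentConfig reads from llm_config in the traceback above.
# "memgpt_chat" is assumed here to match DEFAULT_PRESET in the 0.1.x
# releases, to keep the sketch self-contained.
config_list_memgpt = [
    {
        "model": "gpt-4",
        "preset": "memgpt_chat",
        "model_wrapper": None,
        "model_endpoint_type": "webui",             # webui, ollama, llamacpp, ...
        "model_endpoint": "http://localhost:5000",  # placeholder: your backend URL
        "context_window": 8192,
    },
]

llm_config_memgpt = {"config_list": config_list_memgpt, "seed": 42}
```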

(screenshot attached)

hherpa commented 9 months ago

My Google Colab notebook with memgpt/autogen: link to Google Colaboratory

mclassen commented 9 months ago

Try this: import memgpt.presets.presets as presets

hherpa commented 9 months ago

@mclassen I changed the import of modules:

import os
import autogen
import openai

from memgpt.autogen.memgpt_agent import create_autogen_memgpt_agent, create_memgpt_autogen_agent_from_config
import memgpt.presets.presets as presets

But the error is exactly the same:

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
[<ipython-input-18-6c3d7486ceca>](https://localhost:8080/#) in <cell line: 56>()
     72         )
     73     else:
---> 74         coder = create_memgpt_autogen_agent_from_config(
     75             "MemGPT_coder",
     76             llm_config=llm_config_memgpt,

/usr/local/lib/python3.10/dist-packages/memgpt/autogen/memgpt_agent.py in create_memgpt_autogen_agent_from_config(name, system_message, is_termination_msg, max_consecutive_auto_reply, human_input_mode, function_map, code_execution_config, llm_config, nonmemgpt_llm_config, default_auto_reply, interface_kwargs)

KeyError: 'preset'
hherpa commented 9 months ago

I changed the import a little more:

import os
import autogen
import openai

from memgpt.autogen.memgpt_agent import create_autogen_memgpt_agent, create_memgpt_autogen_agent_from_config
import memgpt.presets.presets as presets
from memgpt.presets.presets import DEFAULT_PRESET

and the error changed, as if the "preset" key could now be found:

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
[<ipython-input-36-29df25bb2622>](https://localhost:8080/#) in <cell line: 59>()
     75         )
     76     else:
---> 77         coder = create_memgpt_autogen_agent_from_config(
     78             "MemGPT_coder",
     79             llm_config=llm_config_memgpt,

/usr/local/lib/python3.10/dist-packages/memgpt/autogen/memgpt_agent.py in create_memgpt_autogen_agent_from_config(name, system_message, is_termination_msg, max_consecutive_auto_reply, human_input_mode, function_map, code_execution_config, llm_config, nonmemgpt_llm_config, default_auto_reply, interface_kwargs)

KeyError: 'model_wrapper'

but if you also add:

from memgpt.presets.presets import model_wrapper

the code will output an error:

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
[<ipython-input-37-b8d2aac7a11c>](https://localhost:8080/#) in <cell line: 9>()
      7 from memgpt.presets.presets import DEFAULT_PRESET
      8 
----> 9 from memgpt.presets.presets import model_wrapper
     10 model_wrapper = None
     11 # This config is for autogen agents that powered by MemGPT

ImportError: cannot import name 'model_wrapper' from 'memgpt.presets.presets' (/usr/local/lib/python3.10/dist-packages/memgpt/presets/presets.py)
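
For what it's worth, importing `DEFAULT_PRESET` (or trying to import `model_wrapper`) cannot fix this: the failing lines in memgpt_agent.py are plain dict lookups such as `llm_config["model_wrapper"]`, and an import in the notebook does not add keys to that dict. A minimal illustration:

```python
# The lookup that fails inside create_memgpt_autogen_agent_from_config
# is an ordinary dict access; no import elsewhere can change its result.
llm_config = {"model": "gpt-4"}  # no "model_wrapper" key

try:
    llm_config["model_wrapper"]
except KeyError as exc:
    print(f"KeyError: {exc}")  # raised regardless of what is imported

# The fix is to put the key into the dict that actually gets passed in:
llm_config["model_wrapper"] = None
```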
hherpa commented 9 months ago

Referring to https://github.com/cpacker/MemGPT/blob/main/memgpt/autogen/examples/agent_groupchat.py, I rewrote my config_list:

config_list = [
    {
        "model": "mistral-7b",
        "api_base": "https://few-ot-cooper-attacks.trycloudflare.com/v1",
        "api_key": "NULL",  # this is a placeholder
        "api_type": "open_ai",
        "model_wrapper": None,
        "model_endpoint_type": "webui",  # can use webui, ollama, llamacpp, etc.
        "model_endpoint": "https://few-ot-cooper-attacks.trycloudflare.com/v1",  # the IP address of your LLM backend
        "context_window": 8192,
    },
]
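
One detail worth checking: in the script, the MemGPT agent is built from `llm_config_memgpt` (i.e. `config_list_memgpt`), so the extra keys need to go into `config_list_memgpt` rather than `config_list`, and the entry above is also still missing `"preset"`. A small hypothetical helper (not part of MemGPT) that fails early with a readable message, using the key names from the traceback:

```python
# Keys that AgentConfig reads from llm_config, per the traceback above.
REQUIRED_KEYS = {"preset", "model", "model_wrapper",
                 "model_endpoint_type", "model_endpoint", "context_window"}

def check_memgpt_config(config_list):
    """Raise early, with the missing key names, instead of a bare KeyError later."""
    missing = REQUIRED_KEYS - set(config_list[0])
    if missing:
        raise KeyError(f"config_list[0] is missing: {sorted(missing)}")

# The entry above, which still lacks "preset":
entry = {
    "model": "mistral-7b",
    "model_wrapper": None,
    "model_endpoint_type": "webui",
    "model_endpoint": "https://few-ot-cooper-attacks.trycloudflare.com/v1",
    "context_window": 8192,
}
try:
    check_memgpt_config([entry])
except KeyError as exc:
    print(exc)  # the message names the missing key
```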

The error has changed (see the attached screenshots).

sarahwooders commented 9 months ago

@cpacker was this fixed with #498?

sarahwooders commented 9 months ago

@hherpa can you share the output of memgpt version?

hherpa commented 9 months ago

> @hherpa can you share the output of memgpt version?

@sarahwooders 0.1.16, but I have updated to the newest version now and the error is definitely still there.