cpacker / MemGPT

Letta (fka MemGPT) is a framework for creating stateful LLM services.
https://letta.com
Apache License 2.0

MemGPT AutoGen with Local LLM Error #720

Closed: naseerfaheem closed this issue 10 months ago

naseerfaheem commented 10 months ago

Describe the bug
When trying to run examples from the autogen directory with local LLMs (WebUI and LM Studio), I get the OpenAI API key error:

File "/home/user1/Documents/DataScience/installs/MemGPT/memgpt/autogen/examples/agent_docs.py", line 148, in <module>
    memgpt_agent = create_memgpt_autogen_agent_from_config(
File "/home/user1/.conda/envs/memgpt/lib/python3.11/site-packages/memgpt/autogen/memgpt_agent.py", line 95, in create_memgpt_autogen_agent_from_config
    autogen_memgpt_agent = create_autogen_memgpt_agent(
File "/home/user1/.conda/envs/memgpt/lib/python3.11/site-packages/memgpt/autogen/memgpt_agent.py", line 184, in create_autogen_memgpt_agent
    autogen_memgpt_agent = MemGPTAgent(
File "/home/user1/.conda/envs/memgpt/lib/python3.11/site-packages/memgpt/autogen/memgpt_agent.py", line 204, in __init__
    super().__init__(name)
File "/home/user1/.conda/envs/memgpt/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 117, in __init__
    self.client = OpenAIWrapper(**self.llm_config)
File "/home/user1/.conda/envs/memgpt/lib/python3.11/site-packages/autogen/oai/client.py", line 83, in __init__
    self._clients = [self._client(extra_kwargs, openai_config)]
File "/home/user1/.conda/envs/memgpt/lib/python3.11/site-packages/autogen/oai/client.py", line 138, in _client
    client = OpenAI(**openai_config)
File "/home/user1/.conda/envs/memgpt/lib/python3.11/site-packages/openai/_client.py", line 92, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

Please describe your setup
OS: Ubuntu 22.04 (desktop)
HW: HP Z8 Fury with 4x RTX 6000 Ada GPUs and 1 TB RAM
Python version: 3.11.3 with Conda (I have also tried 3.10.9)

Screenshots
(screenshot of the error attached)


naseerfaheem commented 10 months ago

I tried creating a new environment and following all the recommended steps, but I still get the same API error. Here is the full traceback:

OpenAIError                               Traceback (most recent call last)
Cell In[13], line 11
      3     coder = autogen.AssistantAgent(
      4         name="Coder",
      5         llm_config=llm_config,
      6     )
      8 else:
      9     # In our example, we swap this AutoGen agent with a MemGPT agent
     10     # This MemGPT agent will have all the benefits of MemGPT, ie persistent memory, etc.
---> 11     coder = create_memgpt_autogen_agent_from_config(
     12         "MemGPT_coder",
     13         llm_config=llm_config_memgpt,
     14         system_message=f"I am a 10x engineer, trained in Python. I was the first engineer at Uber "
     15         f"(which I make sure to tell everyone I work with).\n"
     16         f"You are participating in a group chat with a user ({user_proxy.name}) "
     17         f"and a product manager ({pm.name}).",
     18         interface_kwargs=interface_kwargs,
     19         default_auto_reply="...",  # Set a default auto-reply message here (non-empty auto-reply is required for LM Studio)
     20         skip_verify=False,  # NOTE: you should set this to True if you expect your MemGPT AutoGen agent to call a function other than send_message on the first turn
     21     )
     23 # Initialize the group chat between the user and two LLM agents (PM and coder)
     24 groupchat = autogen.GroupChat(agents=[user_proxy, pm, coder], messages=[], max_round=12)

File ~/Documents/DataScience/installs/MemGPT/memgpt/autogen/memgpt_agent.py:95, in create_memgpt_autogen_agent_from_config(name, system_message, is_termination_msg, max_consecutive_auto_reply, human_input_mode, function_map, code_execution_config, llm_config, nonmemgpt_llm_config, default_auto_reply, interface_kwargs, skip_verify)
     92 if function_map is not None or code_execution_config is not None:
     93     raise NotImplementedError
---> 95 autogen_memgpt_agent = create_autogen_memgpt_agent(
     96     agent_config,
     97     default_auto_reply=default_auto_reply,
     98     is_termination_msg=is_termination_msg,
     99     interface_kwargs=interface_kwargs,
    100     skip_verify=skip_verify,
    101 )
    103 if human_input_mode != "ALWAYS":
    104     coop_agent1 = create_autogen_memgpt_agent(
    105         agent_config,
    106         default_auto_reply=default_auto_reply,
   (...)
    109         skip_verify=skip_verify,
    110     )

File ~/Documents/DataScience/installs/MemGPT/memgpt/autogen/memgpt_agent.py:184, in create_autogen_memgpt_agent(agent_config, skip_verify, interface, interface_kwargs, persistence_manager, persistence_manager_kwargs, default_auto_reply, is_termination_msg)
    172 persistence_manager = LocalStateManager(**persistence_manager_kwargs) if persistence_manager is None else persistence_manager
    174 memgpt_agent = presets.use_preset(
    175     agent_config.preset,
    176     agent_config,
   (...)
    181     persistence_manager,
    182 )
---> 184 autogen_memgpt_agent = MemGPTAgent(
    185     name=agent_config.name,
    186     agent=memgpt_agent,
    187     default_auto_reply=default_auto_reply,
    188     is_termination_msg=is_termination_msg,
    189     skip_verify=skip_verify,
    190 )
    191 return autogen_memgpt_agent

File ~/Documents/DataScience/installs/MemGPT/memgpt/autogen/memgpt_agent.py:204, in MemGPTAgent.init(self, name, agent, skip_verify, concat_other_agent_messages, is_termination_msg, default_auto_reply) 195 def init( 196 self, 197 name: str, (...) 202 default_auto_reply: Optional[Union[str, Dict, None]] = "", 203 ): --> 204 super().init(name) 205 self.agent = agent 206 self.skip_verify = skip_verify

File ~/.conda/envs/memgpt/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py:117, in ConversableAgent.__init__(self, name, system_message, is_termination_msg, max_consecutive_auto_reply, human_input_mode, function_map, code_execution_config, llm_config, default_auto_reply)
    115 if isinstance(llm_config, dict):
    116     self.llm_config.update(llm_config)
---> 117 self.client = OpenAIWrapper(**self.llm_config)
    119 self._code_execution_config: Union[Dict, Literal[False]] = (
    120     {} if code_execution_config is None else code_execution_config
    121 )
    122 self.human_input_mode = human_input_mode

File ~/.conda/envs/memgpt/lib/python3.11/site-packages/autogen/oai/client.py:83, in OpenAIWrapper.__init__(self, config_list, base_config)
     78     self._config_list = [
     79         {**extra_kwargs, **{k: v for k, v in config.items() if k not in self.openai_kwargs}}
     80         for config in config_list
     81     ]
     82 else:
---> 83     self._clients = [self._client(extra_kwargs, openai_config)]
     84     self._config_list = [extra_kwargs]

File ~/.conda/envs/memgpt/lib/python3.11/site-packages/autogen/oai/client.py:138, in OpenAIWrapper._client(self, config, openai_config)
    136 openai_config = {**openai_config, **{k: v for k, v in config.items() if k in self.openai_kwargs}}
    137 self._process_for_azure(openai_config, config)
---> 138 client = OpenAI(**openai_config)
    139 return client

File ~/.conda/envs/memgpt/lib/python3.11/site-packages/openai/_client.py:92, in OpenAI.__init__(self, api_key, organization, base_url, timeout, max_retries, default_headers, default_query, http_client, _strict_response_validation)
     90     api_key = os.environ.get("OPENAI_API_KEY")
     91 if api_key is None:
---> 92     raise OpenAIError(
     93         "The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable"
...
     94     )
     95 self.api_key = api_key
     97 if organization is None:

OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

cpacker commented 10 months ago

This error is happening because AutoGen uses the openai package (even when you're running local LLMs), and the openai package needs OPENAI_API_KEY to be set to something, even if it's just a dummy value.
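The failing check lives in the openai client that AutoGen constructs from your llm_config, so another way to satisfy it is to put a dummy api_key directly in the config_list entry for your local endpoint. A rough sketch for an AutoGen 0.2-style config pointing at LM Studio (the model name and port are placeholders, not something MemGPT requires):

# Hypothetical config for a local LM Studio server; only the presence of a
# non-empty api_key matters, the local backend never validates it.
config_list = [
    {
        "model": "local-model",                   # placeholder name, ignored by LM Studio
        "base_url": "http://localhost:1234/v1",   # LM Studio's OpenAI-compatible endpoint
        "api_key": "null",                        # dummy key to keep the openai client happy
    }
]
llm_config = {"config_list": config_list}

That said, the simplest workaround is to hand the openai package a dummy key through the environment.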

First do:

export OPENAI_API_KEY="null"

Then do:

python agent_docs.py
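If you'd rather not depend on your shell environment, you can also set the placeholder from inside the script, near the top of agent_docs.py before any agents are created. A minimal sketch (the value is arbitrary, since a local backend never checks it):

import os

# Provide a dummy key so the OpenAI client can be constructed; setdefault
# keeps any value the user has already exported.
os.environ.setdefault("OPENAI_API_KEY", "null")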