cpacker / MemGPT

Letta (fka MemGPT) is a framework for creating stateful LLM services.
https://letta.com
Apache License 2.0
12.04k stars · 1.33k forks

Crash on attempt to load agent #261

Closed · danx0r closed this 12 months ago

danx0r commented 12 months ago

**Describe the bug**
Running the webui backend, MemGPT commit 74d4a297d, CLI. The initial session goes well. Attempting to load the agent in a subsequent session fails with:

Aborted.

**To Reproduce**
Steps to reproduce the behavior:

memgpt run --no_verify --debug
? Would you like to select an existing agent? Yes
? Select agent: agent_4
Using existing agent agent_4
State path: /home/ubuntu/.memgpt/agents/agent_4/agent_state
Persistent manager path: /home/ubuntu/.memgpt/agents/agent_4/persistence_manager
Index path: /home/ubuntu/.memgpt/agents/agent_4/persistence_manager/index

Aborted.

**Expected behavior**
Chat with the previously created agent_4.


**Additional context**
I commented out some exception-handling code to get a clearer picture of what is going wrong. Stack trace:

 /home/ubuntu/MemGPT/memgpt/persistence_manager.py:123 in load                                    │
│                                                                                                  │
│   120 │   def load(filename, agent_config: AgentConfig):                                         │
│   121 │   │   """ Load a LocalStateManager from a file. """ ""                                   │
│   122 │   │   with open(filename, "rb") as f:                                                    │
│ ❱ 123 │   │   │   manager = pickle.load(f)                                                       │
│   124 │   │                                                                                      │
│   125 │   │   manager.archival_memory = LocalArchivalMemory(agent_config=agent_config)           │
│   126 │   │   return manager                                                                     │
│                                                                                                  │
│ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │
│ │ agent_config = AgentConfig()                                                                 │ │
│ │            f = <_io.BufferedReader                                                           │ │
│ │                name='/home/ubuntu/.memgpt/agents/agent_4/persistence_manager/2023-11-02_02_… │ │
│ │     filename = '/home/ubuntu/.memgpt/agents/agent_4/persistence_manager/2023-11-02_02_33_05… │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
EOFError: Ran out of input
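
`EOFError: Ran out of input` is pickle's signature for an empty or truncated file: the save above crashed before writing any bytes, so the later load has nothing to read. A minimal sketch reproducing and surfacing this failure (`safe_load` is an illustrative helper, not MemGPT code):

```python
import io
import pickle


def safe_load(fileobj):
    """Unpickle from a file object, turning an empty/truncated file
    into a clear error instead of a bare EOFError."""
    try:
        return pickle.load(fileobj)
    except EOFError as e:
        # pickle.load raises EOFError("Ran out of input") on zero bytes
        raise RuntimeError("pickle file is empty or truncated") from e


# A zero-byte "file" reproduces the exact error from the trace above:
try:
    pickle.load(io.BytesIO(b""))
except EOFError as e:
    print(type(e).__name__, "-", e)  # prints: EOFError - Ran out of input
```

Checking the file size (or catching `EOFError`) before unpickling would at least turn the opaque `Aborted.` into an actionable message.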

**How did you install MemGPT?**

**Your setup (please complete the following information)**

**Local LLM details**

If you are trying to run MemGPT with local LLMs, please provide the following information:

Warning: no wrapper specified for local LLM, using the default wrapper
💭 My first interaction with a human. An exciting moment indeed.
🤖 Hello there! Welcome to your new companion. My name is Sam. How may I assist you today?

Enter your message: Hi! Please address me as danx0r, and remember that for our next session.
🧑 {'message': 'Hi! Please address me as danx0r, and remember that for our next session.', 'time': '2023-11-02 03:06:53 PM PDT-0700'}
Warning: no wrapper specified for local LLM, using the default wrapper
💭 None
⚡🧠 [function] updating memory with core_memory_append
Enter your message: My name is Danx0r
🧑 {'message': 'My name is Danx0r', 'time': '2023-11-02 03:07:12 PM PDT-0700'}
Warning: no wrapper specified for local LLM, using the default wrapper
💭 Updating my memory. I am now aware that Danx0r is another way to refer to Chad.
🤖 Thank you for clarifying, Danx0r. Your preferences are important to me. Is there something specific you'd like to discuss today?
Enter your message:

Cancelled by user

╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮

/home/ubuntu/MemGPT/memgpt/cli/cli.py:158 in run

  155 │   │   configure_azure_support()
  156 │
  157 │   loop = asyncio.get_event_loop()
❱ 158 │   loop.run_until_complete(run_agent_loop(memgpt_agent, first, no_verify, config, strip
  159 │

/usr/lib/python3.10/asyncio/base_events.py:649 in run_until_complete

  646 │   │   if not future.done():
  647 │   │   │   raise RuntimeError('Event loop stopped before Future completed.')
  648 │   │
❱ 649 │   │   return future.result()
  650 │
  651 │   def stop(self):
  652 │   │   """Stop running the event loop.

  locals: future = <Task finished name='Task-115' coro=<run_agent_loop() done, defined at
  /home/ubuntu/MemGPT/memgpt/main.py:370> exception=TypeError("'NoneType' object is not callable")>

/home/ubuntu/MemGPT/memgpt/main.py:438 in run_agent_loop

  435 │   │   │   │   else:
  436 │   │   │   │   │   # updated agent save functions
  437 │   │   │   │   │   if user_input.lower() == "/exit":
❱ 438 │   │   │   │   │   │   memgpt_agent.save()
  439 │   │   │   │   │   │   break
  440 │   │   │   │   │   elif user_input.lower() == "/save" or user_input.lower() == "/savech

  locals: user_input = '/exit', no_verify = True, counter = 3,
  memgpt_agent = <memgpt.agent.AgentAsync object at 0x7f5dd1b157b0>

/home/ubuntu/MemGPT/memgpt/agent.py:359 in save

  356 │   │   # save the persistence manager too
  357 │   │   filename = f"{timestamp}.persistence.pickle"
  358 │   │   os.makedirs(self.config.save_persistence_manager_dir(), exist_ok=True)
❱ 359 │   │   self.persistence_manager.save(os.path.join(self.config.save_persistence…
  360 │
  361 │   @classmethod
  362 │   def load_agent(cls, interface, agent_config: AgentConfig):

  locals: agent_name = 'agent_1', timestamp = '2023-11-02_03_07_25_PM_PDT-0700',
  filename = '2023-11-02_03_07_25_PM_PDT-0700.persistence.pickle'

/home/ubuntu/MemGPT/memgpt/persistence_manager.py:133 in save

  130 │   │   │   # TODO: fix this hacky solution to pickle the retriever
  131 │   │   │   self.archival_memory.save()
  132 │   │   │   self.archival_memory = None
❱ 133 │   │   │   pickle.dump(self, fh, protocol=pickle.HIGHEST_PROTOCOL)
  134 │   │   │
  135 │   │   │   # re-load archival (TODO: dont do this)
  136 │   │   │   self.archival_memory = LocalArchivalMemory(agent_config=self.agent_config)

  locals: self = <memgpt.persistence_manager.LocalStateManager object at 0x7f5dd1af71f0>,
  fh = <_io.BufferedWriter name='/home/ubuntu/.memgpt/agents/agent_1/persistence_manager/2023-11-02_03_07_2…

╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
TypeError: 'NoneType' object is not callable
ubuntu@ip-172-31-40-1 ~/MemGPT (main) [1]>

ubuntu@ip-172-31-40-1 ~/MemGPT (main)> memgpt run --no_verify
? Would you like to select an existing agent? Yes
? Select agent: agent_1
Using existing agent agent_1

Aborted.
ubuntu@ip-172-31-40-1 ~/MemGPT (main) [1]>

sarahwooders commented 12 months ago

Are you sure the state of agent_4 was properly saved? Did you run /save in the CLI during the previous chat, or maybe exit out during the save? It looks like the pickle file wasn't saved properly.
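
This failure mode (a save that dies mid-write, leaving a zero-byte or truncated pickle that then breaks every subsequent load) is commonly guarded against with an atomic write: dump to a temporary file in the same directory, then rename it into place. A sketch of that pattern, not MemGPT's actual code; `atomic_pickle_dump` is a hypothetical helper:

```python
import os
import pickle
import tempfile


def atomic_pickle_dump(obj, path):
    """Pickle `obj` to `path` atomically: if the dump itself raises
    (as in the traceback above), the destination file is never touched,
    so an earlier good save is not clobbered by an empty one."""
    dirname = os.path.dirname(path) or "."
    fd, tmp = tempfile.mkstemp(dir=dirname, suffix=".tmp")
    try:
        with os.fdopen(fd, "wb") as fh:
            pickle.dump(obj, fh, protocol=pickle.HIGHEST_PROTOCOL)
        os.replace(tmp, path)  # atomic rename on POSIX filesystems
    except BaseException:
        os.remove(tmp)  # clean up the partial temp file
        raise
```

With this pattern, the CLI would still crash on the `TypeError`, but the previous session's pickle would remain loadable instead of being replaced by an unreadable file.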

danx0r commented 12 months ago

OK -- I typed /save and got this result:

/home/ubuntu/MemGPT/memgpt/persistence_manager.py:133 in save                                    │
│                                                                                                  │
│   130 │   │   │   # TODO: fix this hacky solution to pickle the retriever                        │
│   131 │   │   │   self.archival_memory.save()                                                    │
│   132 │   │   │   self.archival_memory = None                                                    │
│ ❱ 133 │   │   │   pickle.dump(self, fh, protocol=pickle.HIGHEST_PROTOCOL)                        │
│   134 │   │   │                                                                                  │
│   135 │   │   │   # re-load archival (TODO: dont do this)                                        │
│   136 │   │   │   self.archival_memory = LocalArchivalMemory(agent_config=self.agent_config)     │
│                                                                                                  │
│ ╭─────────────────────────────────────────── locals ───────────────────────────────────────────╮ │
│ │       fh = <_io.BufferedWriter                                                               │ │
│ │            name='/home/ubuntu/.memgpt/agents/agent_1/persistence_manager/2023-11-02_03_15_2… │ │
│ │ filename = '/home/ubuntu/.memgpt/agents/agent_1/persistence_manager/2023-11-02_03_15_26_PM_… │ │
│ │     self = <memgpt.persistence_manager.LocalStateManager object at 0x7f65edf93220>           │ │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────╯ │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
TypeError: 'NoneType' object is not callable

danx0r commented 12 months ago

The offending attributes in self that break pickle are:

self.messages
self.all_messages
self.recall_memory
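
For anyone hitting a similar pickle failure: attributes like these can be found by probing each one individually rather than pickling the whole object at once. A rough sketch of that debugging technique (`unpicklable_attrs` and `Holder` are illustrative only, not MemGPT code):

```python
import pickle


def unpicklable_attrs(obj):
    """Return the names of instance attributes that pickle cannot
    serialize, by attempting to pickle each one in isolation."""
    bad = []
    for name, value in vars(obj).items():
        try:
            pickle.dumps(value)
        except Exception:
            bad.append(name)
    return bad


class Holder:
    def __init__(self):
        self.ok = [1, 2, 3]          # plain data pickles fine
        self.bad = lambda x: x       # lambdas cannot be pickled


print(unpicklable_attrs(Holder()))   # prints: ['bad']
```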

sarahwooders commented 12 months ago

Would you be able to try #226 to see if it fixes the issue?

danx0r commented 12 months ago

Running that PR locally gives me this error:

Exception: Got back an empty response string from http://127.0.0.1:5000/

BTW, I was not able to merge that branch into main (lots of merge conflicts).

danx0r commented 12 months ago

Correction: it returns `TypeError: 'NoneType' object is not callable` as before (running PR #226 at commit 89cf9761).

danx0r commented 12 months ago

Fixed in https://github.com/cpacker/MemGPT/pull/300