cpacker / MemGPT

Create LLM agents with long-term memory and custom tools 📚🦙
https://memgpt.readme.io
Apache License 2.0
11.27k stars 1.23k forks

All the data is gone on the next startup #227

Closed SkySlider closed 9 months ago

SkySlider commented 10 months ago

On my first run of MemGPT I chose the default persona Sam and the basic human Chad, then updated it with my data through conversation; the bot made several writes to memory. But on the next startup I ran the suggested command `memgpt run` instead of just `memgpt`, and all the data is gone.

I checked the `.memgpt` folder (which sits in the default Users folder when running cmd without admin rights; I'm on Windows). Some folders inside were dated with the time (20:23) of the initial conversation, while others had the time of the next startup. MemGPT prompted the initial setup again (OpenAI key, persona pick, etc.) without ever mentioning the past conversation, and afterwards Sam didn't remember me.

(screenshot attached)

The archival/humans/personas folders are empty, agents contains only the folder "agent_1" created on the second startup, and the saved state only has files from the second startup. There's no trace of the first conversation except the date on the folder.
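Roughly, the layout looks like this (folder names as described above; the exact structure may vary by MemGPT version):

```
%USERPROFILE%\.memgpt\
├── agents\
│   └── agent_1\      <- only agent folder, created on the second startup
├── archival\         <- empty
├── humans\           <- empty
├── personas\         <- empty
└── (saved state)     <- only contains files from the second startup
```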

Did it store the data elsewhere? I really want to bring back the old conversation. And yes, I'm certain the initial conversation took place with cmd in the Users\"User" folder.

It also concerns me whether running `memgpt` as written in README.md, without the "run" after it, saves the data at all. It tells me that I'm running the legacy run command (Legacy CLI), but there's no mention of my data not being saved.

ProjCRys commented 10 months ago

Currently, if you want to access your previous discussions, you have to manually /save and /load the JSON of the previous discussion. However, I also have problems loading mine, since it crashes or keeps saying it ran out of input. I asked about this in a Q&A in the Discussions, so I'm still waiting for clarification on how MemGPT is supposed to handle this.
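In other words, the manual workflow is just the two slash commands inside a running legacy CLI session (a rough sketch; exact prompts and arguments may vary by version):

```
# inside a running legacy `memgpt` session
/save     # checkpoint the current conversation/agent state to JSON
# ...exit, then start memgpt again later...
/load     # restore a previously saved checkpoint (may take a path argument depending on version)
```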

SkySlider commented 10 months ago

> Currently, if you want to access your previous discussions, you have to manually /save and /load the JSON of the previous discussion. However, I also have problems loading mine, since it crashes or keeps saying it ran out of input. I asked about this in a Q&A in the Discussions, so I'm still waiting for clarification on how MemGPT is supposed to handle this.

When starting with `memgpt run` it does save the JSONs by itself, and on the next startup it asks if I would like to select an existing agent. That's how it works for me on Win10.

ProjCRys commented 10 months ago

>> Currently, if you want to access your previous discussions, you have to manually /save and /load the JSON of the previous discussion. However, I also have problems loading mine, since it crashes or keeps saying it ran out of input. I asked about this in a Q&A in the Discussions, so I'm still waiting for clarification on how MemGPT is supposed to handle this.

> When starting with `memgpt run` it does save the JSONs by itself and asks if I would like to select an existing agent on the next startup. At least that's how it works for me on Win10.

I tried doing that but this is what I got:

```
? Would you like to select an existing agent? No
Creating new agent...
Created new agent agent_5.
Hit enter to begin (will request first MemGPT message)

Warning: no wrapper specified for local LLM, using the default wrapper
step() failed
user_message = None
error = Failed to decode JSON from LLM output:
{ {"status": "OK", "message": null, "time": "2023-11-01 06:05:36 AM PDT-0700"}}
Failed to decode JSON from LLM output:
{ {"status": "OK", "message": null, "time": "2023-11-01 06:05:36 AM PDT-0700"}}
An exception ocurred when running agent.step():
Traceback (most recent call last):
  File "D:\AI\ChatBots\MemGPT_Setup\venv\lib\site-packages\memgpt\local_llm\llm_chat_completion_wrappers\airoboros.py", line 392, in output_to_chat_completion_response
    function_json_output = json.loads(raw_llm_output)
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\json\__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\json\decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\json\decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 3 (char 2)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\AI\ChatBots\MemGPT_Setup\venv\lib\site-packages\memgpt\local_llm\llm_chat_completion_wrappers\airoboros.py", line 395, in output_to_chat_completion_response
    function_json_output = json.loads(raw_llm_output + "\n}")
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\json\__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\json\decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\lib\json\decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 3 (char 2)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\AI\ChatBots\MemGPT_Setup\venv\lib\site-packages\memgpt\main.py", line 526, in run_agent_loop
    ) = await memgpt_agent.step(user_message, first_message=False, skip_verify=no_verify)
  File "D:\AI\ChatBots\MemGPT_Setup\venv\lib\site-packages\memgpt\agent.py", line 1084, in step
    raise e
  File "D:\AI\ChatBots\MemGPT_Setup\venv\lib\site-packages\memgpt\agent.py", line 1020, in step
    response = await get_ai_reply_async(model=self.model, message_sequence=input_message_sequence, functions=self.functions)
  File "D:\AI\ChatBots\MemGPT_Setup\venv\lib\site-packages\memgpt\agent.py", line 160, in get_ai_reply_async
    raise e
  File "D:\AI\ChatBots\MemGPT_Setup\venv\lib\site-packages\memgpt\agent.py", line 141, in get_ai_reply_async
    response = await acreate(
  File "D:\AI\ChatBots\MemGPT_Setup\venv\lib\site-packages\memgpt\openai_tools.py", line 115, in wrapper
    raise e
  File "D:\AI\ChatBots\MemGPT_Setup\venv\lib\site-packages\memgpt\openai_tools.py", line 95, in wrapper
    return await func(*args, **kwargs)
  File "D:\AI\ChatBots\MemGPT_Setup\venv\lib\site-packages\memgpt\openai_tools.py", line 124, in acompletions_with_backoff
    return get_chat_completion(**kwargs)
  File "D:\AI\ChatBots\MemGPT_Setup\venv\lib\site-packages\memgpt\local_llm\chat_completion_proxy.py", line 62, in get_chat_completion
    chat_completion_result = llm_wrapper.output_to_chat_completion_response(result)
  File "D:\AI\ChatBots\MemGPT_Setup\venv\lib\site-packages\memgpt\local_llm\llm_chat_completion_wrappers\airoboros.py", line 397, in output_to_chat_completion_response
    raise Exception(f"Failed to decode JSON from LLM output:\n{raw_llm_output}")
Exception: Failed to decode JSON from LLM output:
{ {"status": "OK", "message": null, "time": "2023-11-01 06:05:36 AM PDT-0700"}}
? Retry agent.step()? (Y/n)
```
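The JSON error itself is easy to see in isolation: the local model returned an object wrapped in an extra opening brace, so both decode attempts in the airoboros wrapper (the raw string at line 392, then the raw string plus a closing brace at line 395) fail. A minimal reproduction outside MemGPT, using only the string from the traceback above:

```python
import json

# The exact string the local model returned, copied from the traceback above:
raw_llm_output = '{ {"status": "OK", "message": null, "time": "2023-11-01 06:05:36 AM PDT-0700"}}'

# Attempt 1 (airoboros.py line 392): parse the output as-is.
try:
    json.loads(raw_llm_output)
except json.JSONDecodeError as e:
    # Expecting property name enclosed in double quotes: line 1 column 3 (char 2)
    print(e)

# Attempt 2 (airoboros.py line 395): append a closing "}" and retry -- still invalid,
# because the problem is the stray *opening* brace, not a missing closing one.
try:
    json.loads(raw_llm_output + "\n}")
except json.JSONDecodeError as e:
    print(e)

# The same payload without the extra outer braces -- presumably what the wrapper
# expected the model to emit -- parses fine:
print(json.loads('{"status": "OK", "message": null, "time": "2023-11-01 06:05:36 AM PDT-0700"}'))
```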


The bat file I made to run it in a venv:

```bat
@echo off

:: Define the paths to Python and the venv
set PYTHON_EXECUTABLE=C:\Users\ADMIN\AppData\Local\Programs\Python\Python310\python.exe
set VENV_DIR=D:\AI\ChatBots\MemGPT_Setup\venv

:: Create the virtual environment
"%PYTHON_EXECUTABLE%" -m venv "%VENV_DIR%"

:: Check if the virtual environment creation was successful
if %errorlevel% neq 0 (
    echo An error occurred while creating the virtual environment. Press any key to exit.
    pause >nul
    exit /b %errorlevel%
)

:: Activate the virtual environment
call "%VENV_DIR%\Scripts\activate"

:: Install pymemgpt using pip
pip install pymemgpt
pip install transformers
pip install torch

:: Check if the installation was successful
if %errorlevel% neq 0 (
    echo An error occurred while installing pymemgpt. Press any key to exit.
    pause >nul
    exit /b %errorlevel%
)

set OPENAI_API_BASE=http://localhost:1234
set BACKEND_TYPE=lmstudio
cls

:: Run memgpt (replace this with your specific memgpt command)
memgpt run

:: Check if memgpt encountered an error
if %errorlevel% neq 0 (
    echo An error occurred while running memgpt. Press any key to exit.
    pause >nul
    exit /b %errorlevel%
)

:: Deactivate the virtual environment
deactivate

:: Pause to allow the user to review the output
echo Press any key to exit.
pause >nul
```

sarahwooders commented 10 months ago

> I tried doing that but this is what I got: […]
>
> `Exception: Failed to decode JSON from LLM output: { {"status": "OK", "message": null, "time": "2023-11-01 06:05:36 AM PDT-0700"}}`

This looks like an error communicating with the LLM, @cpacker, not a data-loading issue.

cpacker commented 9 months ago

Closing because this should be resolved in the latest version(s). Please create a new issue, or reopen this one, if you're still having issues.