cpacker / MemGPT

Create LLM agents with long-term memory and custom tools 📚🦙
https://memgpt.readme.io
Apache License 2.0
11.32k stars 1.23k forks

Migrate to using Env Vars everywhere #1371

Open lenaxia opened 3 months ago

lenaxia commented 3 months ago

Describe the bug
When mounting a config file into a Kubernetes pod at /root/.memgpt/config, the program exits because it cannot write to the config file.

ubuntu@terraform:~/workspace/home-ops-prod/cluster/apps/home/localai/memgpt(⎈|prod:home)$ kcl memgpt-776b89b584-gh5km -c main
Starting MEMGPT server...
server :: loading configuration from '/root/.memgpt/config'
Traceback (most recent call last):
  File "/app/.venv/bin/uvicorn", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/app/.venv/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.11/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
 ...
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/memgpt/server/rest_api/server.py", line 46, in <module>
    server: SyncServer = SyncServer(default_interface=interface)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/memgpt/server/server.py", line 222, in __init__
    self.config.save()
  File "/memgpt/config.py", line 272, in save
    with open(self.config_path, "w", encoding="utf-8") as f:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
OSError: [Errno 30] Read-only file system: '/root/.memgpt/config'
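One possible workaround (a sketch only, not the fix the maintainers settled on; the function name and fallback path here are hypothetical, modeled on the `config.save()` call in the traceback) is to make the save step tolerate a read-only mount instead of crashing the server:

```python
import os

def save_config(config_text: str, config_path: str = "/root/.memgpt/config") -> str:
    """Try to persist the config; if the path is on a read-only mount
    (OSError errno 30, as in the traceback above), fall back to a
    writable directory instead of exiting. Returns the path written."""
    try:
        with open(config_path, "w", encoding="utf-8") as f:
            f.write(config_text)
        return config_path
    except OSError:
        # The fallback location is an arbitrary choice for illustration;
        # any writable directory inside the container would do.
        fallback = os.path.join("/tmp", "memgpt_config")
        with open(fallback, "w", encoding="utf-8") as f:
            f.write(config_text)
        return fallback
```

This only papers over the symptom; the issue title's proposal (env vars everywhere) avoids the write entirely, which is cleaner for containers with read-only config mounts.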

Please describe your setup

Additional context
Comment from Swooders in Discord:

swooders. — Today at 11:00 AM We did something a bit hacky to get the environment variable configuration (for Postgres) to work for the Docker container with minimal changes, where we override the config to point to the provided envs for Postgres (basically so we could keep using the config file for DB configuration) -- but I think we need to change this and migrate to using envs everywhere else in the code instead of reading the config file

MemGPT Config
Please attach your ~/.memgpt/config file or copy-paste it below.

[defaults]
preset = memgpt_chat
persona = sam_pov
human = basic

[model]
model = neuralhermes-2.5-7b
model_endpoint = http://localai.home.svc.cluster.local:8080/v1
model_endpoint_type = openai
model_wrapper = null
context_window = 8192

[embedding]
embedding_endpoint_type = openai
embedding_endpoint = http://localai.home.svc.cluster.local:8080/v1
embedding_model = bert-embeddings
embedding_dim = 1536
embedding_chunk_size = 300

[archival_storage]
type = postgres
path = /root/.memgpt/chroma
uri = postgresql+pg8000://memgpt:adsfjkh*&^wer13@localhost:5432/memgpt

[recall_storage]
type = postgres
path = /root/.memgpt
uri = postgresql+pg8000://memgpt:adsfjkh*&^wer13@localhost:5432/memgpt

[metadata_storage]
type = postgres
path = /root/.memgpt
uri = postgresql+pg8000://memgpt:adsfjkh*&^wer13@localhost:5432/memgpt

[client]
anon_clientid = 00000000-0000-0000-0000-000000000000

If you're not using OpenAI, please provide additional information on your local LLM setup:

Local LLM details

If you are trying to run MemGPT with local LLMs, please provide the following information:

sarahwooders commented 3 months ago

I think we actually just need to remove this config overriding, since the DB connectors already check the env variables properly: https://github.com/cpacker/MemGPT/blob/main/memgpt/server/server.py#L216
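If the connectors resolve their own settings from the environment, the server-side override (and the `config.save()` call that crashes on a read-only mount) can be dropped. A minimal sketch of connector-side resolution for the three Postgres-backed stores in the config above, with hypothetical env var names:

```python
import os
from dataclasses import dataclass

@dataclass
class StorageSettings:
    """Resolved storage URIs: env vars win, config values are fallbacks,
    and nothing is ever written back to the config file."""
    archival_uri: str
    recall_uri: str
    metadata_uri: str

def resolve_storage(config: dict) -> StorageSettings:
    # Env var names below are illustrative assumptions, not MemGPT's actual ones.
    def pick(env_name: str, section: str) -> str:
        return os.getenv(env_name) or config.get(section, {}).get("uri", "")
    return StorageSettings(
        archival_uri=pick("MEMGPT_ARCHIVAL_URI", "archival_storage"),
        recall_uri=pick("MEMGPT_RECALL_URI", "recall_storage"),
        metadata_uri=pick("MEMGPT_METADATA_URI", "metadata_storage"),
    )
```

In a Kubernetes deployment this would let the config file be mounted read-only (or omitted), with the URIs supplied via the pod's `env` block instead.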