cpacker / MemGPT

Letta (fka MemGPT) is a framework for creating stateful LLM services.
https://letta.com
Apache License 2.0

Failed to create agent #1530

Open tiro2000 opened 3 months ago

tiro2000 commented 3 months ago

**Describe the bug**

```python
admin = Admin(base_url="http://localhost:8283", token='UmsbhmPlbh9U3MdOSFmLCg')
client = create_client(base_url="http://localhost:8283", token='UmsbhmPlbh9U3MdOSFmLCg')  # tried also using user token key
custom_agent = client.create_agent(
    name="custom_agent",
    human=human,
    persona=persona
)
```
Error at terminal:

```
\backend\memgpt_test.py", line 307, in main
    custom_agent = client.create_agent(
  File "C:\Users\king_\AIProjects\MemGPT\OsteoGPT\ui\backend\OsteoGPT\Lib\site-packages\memgpt\client\client.py", line 283, in create_agent
    raise ValueError(f"Status {response.status_code} - Failed to create agent: {response.text}")
ValueError: Status 500 - Failed to create agent: Internal Server Error
```
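Since the server-side traceback below fails on `request.config["function_names"]`, one possible workaround is to call the REST endpoint directly and supply that key yourself. This is only a sketch: the `POST /api/agents` route and the `config` field are visible in the traceback and server log, but the exact payload schema and the Bearer-token header format are assumptions, not confirmed API.

```python
# Hedged workaround sketch: POST to /api/agents directly, including the
# "function_names" key the handler reads unconditionally.
# Assumptions (not verified against the memgpt 0.3.18 API): the payload nests
# everything under "config", and the token goes in an Authorization header.
import requests

BASE_URL = "http://localhost:8283"
TOKEN = "UmsbhmPlbh9U3MdOSFmLCg"

response = requests.post(
    f"{BASE_URL}/api/agents",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "config": {
            "name": "custom_agent",
            "persona": "sam_pov",      # defaults from the attached config
            "human": "basic",          # defaults from the attached config
            "function_names": [],      # assumption: an explicit list avoids the KeyError
        }
    },
)
response.raise_for_status()
print(response.json())
```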

**Please describe your setup**


Server output:

```
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\uvicorn\protocols\http\httptools_impl.py", line 399, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\uvicorn\middleware\proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\middleware\errors.py", line 186, in __call__
    raise exc
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\middleware\errors.py", line 164, in __call__
    await self.app(scope, receive, send)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\middleware\cors.py", line 85, in __call__
    await self.app(scope, receive, send)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\middleware\exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\routing.py", line 72, in app
    response = await func(request)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\fastapi\routing.py", line 278, in app
    raw_response = await run_endpoint_function(
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\fastapi\routing.py", line 193, in run_endpoint_function
    return await run_in_threadpool(dependant.call, **values)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\starlette\concurrency.py", line 42, in run_in_threadpool
    return await anyio.to_thread.run_sync(func, *args)
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\anyio\_backends\_asyncio.py", line 2177, in run_sync_in_worker_thread
    return await future
  File "C:\Users\king_\memgpt\memgpt_env\Lib\site-packages\anyio\_backends\_asyncio.py", line 859, in run
    result = context.run(func, *args)
  File "C:\Users\king_\MemGPT\memgpt\server\rest_api\agents\index.py", line 74, in create_agent
    tool_names = request.config["function_names"]
KeyError: 'function_names'
INFO:     ::1:51106 - "POST /api/agents HTTP/1.1" 500 Internal Server Error
```
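The immediate failure is the unguarded dictionary access in the `create_agent` route: the handler assumes the client always sends `function_names` inside `config`. A defensive sketch of the pattern involved (illustrative only; the actual upstream patch may differ):

```python
# Sketch of the defensive pattern, not the actual upstream fix:
# dict.get with a default returns [] instead of raising KeyError
# when the client payload omits the key.
config = {"name": "custom_agent"}              # payload without "function_names"
tool_names = config.get("function_names", [])  # -> [] instead of KeyError
print(tool_names)
```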

**MemGPT Config**
Please attach your `~/.memgpt/config` file or copy-paste it below.
```ini
[defaults]
preset = memgpt_chat
persona = sam_pov
human = basic

[model]
model = gpt-4
model_endpoint = https://api.openai.com/v1
model_endpoint_type = openai
context_window = 8192

[embedding]
embedding_endpoint_type = openai
embedding_endpoint = https://api.openai.com/v1
embedding_model = text-embedding-ada-002
embedding_dim = 1536
embedding_chunk_size = 300

[archival_storage]
type = chroma
path = C:\Users\king_\.memgpt\chroma

[recall_storage]
type = sqlite
path = C:\Users\king_\.memgpt

[metadata_storage]
type = sqlite
path = C:\Users\king_\.memgpt

[version]
memgpt_version = 0.3.18

[client]
anon_clientid = 00000000-0000-0000-0000-000000000000
```

tiro2000 commented 3 months ago

@sarahwooders @cpacker Using the CLI (`memgpt run`) creates the agent automatically, but using the client code produces the error listed above (even with a basic agent).

sarahwooders commented 3 months ago

Can you see if you still have this issue on the latest version of `main`?
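For anyone retesting, one quick way to confirm which version is actually installed (assuming a pip install of the `pymemgpt` distribution, which is how MemGPT 0.3.x shipped; that name is an assumption here):

```python
# Print the installed MemGPT version before retesting against main.
# Assumption: the PyPI distribution name is "pymemgpt" (MemGPT 0.3.x era).
from importlib.metadata import version

print(version("pymemgpt"))
```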

osehmathias commented 3 months ago

Yes, I do.