cpacker / MemGPT

Letta (fka MemGPT) is a framework for creating stateful LLM services.
https://letta.com
Apache License 2.0

Getting a blocking error on OpenAI completion after some chats with the agent; the agent is unable to continue after that #1590

Open lucapaone-ptvgroup opened 3 months ago

lucapaone-ptvgroup commented 3 months ago

Describe the bug

Hi! I am getting an error on OpenAI completion after some chats with the agent. Important note: I had added a simple tool to the agent (the dice rolling example).

(memgpt_venv) root@PTVFCO-P119:~/memgpt_code# memgpt run --agent FlowsAgent2 --model gpt-4o --context-window 8192 --first
🔁 Using existing agent FlowsAgent2

Enter your message: hwo are you

An exception occurred when running agent.step():

Traceback (most recent call last):
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/main.py", line 424, in run_agent_loop
    new_messages, user_message, skip_next_user_input = process_agent_step(user_message, no_verify)
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/main.py", line 392, in process_agent_step
    new_messages, heartbeat_request, function_failed, token_warning, tokens_accumulated = memgpt_agent.step(
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/agent.py", line 739, in step
    raise e
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/agent.py", line 637, in step
    response = self._get_ai_reply(
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/agent.py", line 348, in _get_ai_reply
    raise e
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/agent.py", line 321, in _get_ai_reply
    response = create(
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 196, in wrapper
    return func(*args, **kwargs)
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 328, in create
    response = openai_chat_completions_request(
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/llm_api/openai.py", line 403, in openai_chat_completions_request
    raise http_err
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/llm_api/openai.py", line 393, in openai_chat_completions_request
    response.raise_for_status()  # Raises HTTPError for 4XX/5XX status
  File "/root/memgpt_venv/lib/python3.10/site-packages/requests/models.py", line 1024, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.openai.com/v1/chat/completions
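One note for anyone debugging this: the trace ends in `response.raise_for_status()`, which discards the JSON error body that the OpenAI API returns alongside a 400, so the actual rejection reason (often an invalid function/tool schema) never reaches the console. A minimal sketch of a helper that surfaces that body; the function name and the sample payload below are hypothetical, not part of MemGPT:

```python
import json


def explain_openai_error(body_text: str) -> str:
    """Extract the human-readable message from an OpenAI error body.

    On a 400, the Chat Completions API returns JSON shaped like
    {"error": {"message": "...", "type": "invalid_request_error", ...}}.
    Logging response.text (e.g. just before raise_for_status in
    memgpt/llm_api/openai.py) and passing it here makes the root
    cause visible instead of the bare "400 Client Error".
    """
    try:
        err = json.loads(body_text).get("error", {})
        return f"{err.get('type', 'unknown_error')}: {err.get('message', body_text)}"
    except (json.JSONDecodeError, AttributeError):
        # Not JSON (or not the expected shape): return the raw body.
        return body_text


# Hypothetical example of the kind of body a 400 carries:
sample = '{"error": {"message": "Invalid schema for function", "type": "invalid_request_error"}}'
print(explain_openai_error(sample))
```

If the printed message mentions an invalid function schema, the custom tool added at agent creation is the likely culprit.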

Please describe your setup



MemGPT Config Please attach your ~/.memgpt/config file or copy-paste it below.

[defaults]
preset = memgpt_chat
persona = sam_pov
human = basic

[model]
model = gpt-4
model_endpoint = https://api.openai.com/v1
model_endpoint_type = openai
context_window = 8192

[embedding]
embedding_endpoint_type = openai
embedding_endpoint = https://api.openai.com/v1
embedding_model = text-embedding-ada-002
embedding_dim = 1536
embedding_chunk_size = 300

[archival_storage]
type = chroma
path = /root/.memgpt/chroma

[recall_storage]
type = sqlite
path = /root/.memgpt

[metadata_storage]
type = sqlite
path = /root/.memgpt

[version]
memgpt_version = 0.3.21

[client]
anon_clientid = 0000000000000000000079b2be0caee9


If you're not using OpenAI, please provide additional information on your local LLM setup:

livedhe commented 3 months ago

I also encountered this problem; how did you solve it?

lucapaone-ptvgroup commented 3 months ago

I also encountered this problem; how did you solve it?

@livedhe it is not yet resolved

lucapaone-ptvgroup commented 3 months ago

@sarahwooders an important note: I had added a simple tool to the agent, with simple Python code (the dice rolling example), at creation time.
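For context, the tool in question is along these lines. This is a sketch of the kind of dice-rolling function the docs describe, not the exact code used; the function name is assumed. The relevant detail is that MemGPT derives an OpenAI function schema from the tool's signature and docstring, so a docstring or type hint that the schema generator cannot parse can yield a request the API rejects with a 400:

```python
import random


def roll_d20() -> str:
    """
    Simulate the roll of a 20-sided die (d20).

    Returns:
        str: The result of the die roll, as a string from "1" to "20".
    """
    # A well-formed docstring and return annotation matter here:
    # they are the source material for the generated function schema.
    return str(random.randint(1, 20))
```

If removing the tool makes the agent run normally again, that would point at the generated schema for the tool rather than the chat history itself.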

gummihaf commented 2 months ago

I'm also running into this, let me know if I can help in any way.