lucapaone-ptvgroup opened this issue 3 months ago
I also encountered this problem, how did you solve it
@livedhe it is not yet resolved
@sarahwooders an important note: I had added a simple text tool to the agent, implemented with simple Python code (the dice rolling example), at creation time
I'm also running into this, let me know if I can help in any way.
Describe the bug
Hi! I am getting an error on the OpenAI completion after some chat with the agent. Important note: I had added a simple text tool to the agent (the dice rolling example).
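For context, a tool like the dice rolling example mentioned above is, in essence, a plain Python function registered with the agent at creation time. The sketch below is only my guess at its shape (the name `roll_dice` and its body are illustrative, not the reporter's actual code):

```python
import random

# Illustrative sketch of a "dice rolling" tool. MemGPT derives an OpenAI
# function schema from a tool's signature and docstring, so a malformed
# signature/docstring is one possible source of a 400 Bad Request.
def roll_dice(sides: int = 20) -> str:
    """Roll a die with the given number of sides and describe the result."""
    return f"Rolled a {random.randint(1, sides)} on a d{sides}"
```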
```
(memgpt_venv) root@PTVFCO-P119:~/memgpt_code# memgpt run --agent FlowsAgent2 --model gpt-4o --context-window 8192 --first
🔁 Using existing agent FlowsAgent2

An exception occurred when running agent.step():
Traceback (most recent call last):
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/main.py", line 424, in run_agent_loop
    new_messages, user_message, skip_next_user_input = process_agent_step(user_message, no_verify)
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/main.py", line 392, in process_agent_step
    new_messages, heartbeat_request, function_failed, token_warning, tokens_accumulated = memgpt_agent.step(
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/agent.py", line 739, in step
    raise e
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/agent.py", line 637, in step
    response = self._get_ai_reply(
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/agent.py", line 348, in _get_ai_reply
    raise e
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/agent.py", line 321, in _get_ai_reply
    response = create(
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 196, in wrapper
    return func(*args, **kwargs)
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 328, in create
    response = openai_chat_completions_request(
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/llm_api/openai.py", line 403, in openai_chat_completions_request
    raise http_err
  File "/root/memgpt_venv/lib/python3.10/site-packages/memgpt/llm_api/openai.py", line 393, in openai_chat_completions_request
    response.raise_for_status()  # Raises HTTPError for 4XX/5XX status
  File "/root/memgpt_venv/lib/python3.10/site-packages/requests/models.py", line 1024, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.openai.com/v1/chat/completions
```
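One thing the traceback hides: `raise_for_status()` (the last frame above) raises on the status code and discards the response body, but OpenAI's 400 responses carry a JSON message naming the rejected field (e.g. an invalid tool schema or an oversized context). A debugging sketch for surfacing that message; `api_key` and `payload` are placeholders, not values from this issue:

```python
import requests


def extract_openai_error(body: dict) -> str:
    """Pull the human-readable message out of an OpenAI-style error payload."""
    return body.get("error", {}).get("message", "<no message in body>")


def debug_chat_completion(api_key: str, payload: dict) -> dict:
    """Replay a failing request and print the 400 error body before raising.

    Hypothetical helper for debugging only; it is not part of MemGPT.
    """
    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {api_key}"},
        json=payload,
    )
    if resp.status_code == 400:
        # This is the detail raise_for_status() throws away.
        print(extract_openai_error(resp.json()))
    resp.raise_for_status()
    return resp.json()
```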
Please describe your setup

- How did you install memgpt? (`pip install pymemgpt`? `pip install pymemgpt-nightly`? `git clone`?)
- How are you running memgpt? (cmd.exe/Powershell/Anaconda Shell/Terminal): cmd.exe with WSL, and I created a venv

Screenshots
If applicable, add screenshots to help explain your problem.

Additional context
Add any other context about the problem here.
MemGPT Config
Please attach your ~/.memgpt/config file or copy paste it below.

```
[defaults]
preset = memgpt_chat
persona = sam_pov
human = basic

[model]
model = gpt-4o
model_endpoint = https://api.openai.com/v1
model_endpoint_type = openai
context_window = 8192

[embedding]
embedding_endpoint_type = openai
embedding_endpoint = https://api.openai.com/v1
embedding_model = text-embedding-ada-002
embedding_dim = 1536
embedding_chunk_size = 300

[archival_storage]
type = chroma
path = /root/.memgpt/chroma

[recall_storage]
type = sqlite
path = /root/.memgpt

[metadata_storage]
type = sqlite
path = /root/.memgpt

[version]
memgpt_version = 0.3.21

[client]
anon_clientid = 0000000000000000000079b2be0caee9
```
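As a quick sanity check while debugging request errors, the config above is plain INI and parses with Python's stdlib `configparser`. A sketch; the snippet embeds a trimmed copy of the `[model]` section as a string rather than reading the actual `~/.memgpt/config` file:

```python
import configparser

# Trimmed copy of the [model] section from the config above, for illustration.
SAMPLE_CONFIG = """
[model]
model = gpt-4o
model_endpoint = https://api.openai.com/v1
model_endpoint_type = openai
context_window = 8192
"""

cfg = configparser.ConfigParser()
cfg.read_string(SAMPLE_CONFIG)

# getint() raises if context_window is not a clean integer, which quickly
# flags a malformed value that would otherwise surface as a request error.
context_window = cfg.getint("model", "context_window")
endpoint = cfg.get("model", "model_endpoint")
```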
If you're not using OpenAI, please provide additional information on your local LLM setup: