cpacker / MemGPT

Create LLM agents with long-term memory and custom tools 📚🦙
https://memgpt.readme.io
Apache License 2.0

Groq fails with: Failed to decode JSON output #1309

Open glindberg2000 opened 2 months ago

glindberg2000 commented 2 months ago

Describe the bug
When using the Groq endpoint with MemGPT, I encounter an error that prevents the conversation from completing successfully. The error arises while parsing the JSON output from the language model: the error message indicates a failure to decode valid MemGPT JSON from the LLM output.

Please describe your setup

- DB: Postgres (from local and the memgpt Docker image); also tried sqlite3
- embedding_endpoint = https://embeddings.memgpt.ai
- embedding_model = BAAI/bge-large-en-v1.5
- Model: llama3 7B and 70B
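For reference, the settings above map onto entries in MemGPT's config file. This is only a sketch: the two `embedding_*` keys are quoted from the setup above, but the section names and the `model_endpoint_type`/`model` keys are assumptions and may differ in your MemGPT version.

```ini
; Hypothetical sketch of the relevant config entries.
[model]
model_endpoint_type = groq    ; assumed key and value
model = llama3-70b            ; assumed

[embedding]
embedding_endpoint = https://embeddings.memgpt.ai
embedding_model = BAAI/bge-large-en-v1.5
```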

Screenshots: [error screenshot attached]

Additional context
The error occurs consistently after a few rounds of conversation (sometimes more) using the Groq endpoint. The conversation starts successfully but fails while parsing the JSON output from the LLM, especially once the model starts making function calls or approaches the context limit. Note that the token-limit errors (429) are separate from this JSON parsing error.
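To illustrate the failure mode: MemGPT expects the model's reply to be a JSON object, and a reply that is truncated or wrapped in extra prose raises a `json.JSONDecodeError`. The sketch below is not MemGPT's actual parser; it is a minimal, hypothetical example of decoding with a salvage pass that tries to extract the first balanced `{...}` object from a noisy reply.

```python
import json


def try_decode_json(raw: str):
    """Decode an LLM reply as JSON; on failure, try to salvage the
    first balanced top-level {...} object embedded in the text.
    Returns the parsed object, or None if nothing decodes."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        pass
    # Salvage pass: naive brace matching (does not account for braces
    # inside string literals; good enough for a sketch).
    start = raw.find("{")
    if start == -1:
        return None
    depth = 0
    for i, ch in enumerate(raw[start:], start):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                try:
                    return json.loads(raw[start:i + 1])
                except json.JSONDecodeError:
                    return None
    return None  # braces never balanced: the reply was likely truncated


# A truncated function-call reply, like the ones seen near the context
# limit, cannot be salvaged and yields None:
print(try_decode_json('{"function": "send_mess'))  # → None
# A reply with surrounding chatter can be salvaged:
print(try_decode_json('Sure! {"function": "send_message"} Done.'))
```

The second case shows why the error tends to surface once the model starts emitting function calls: any prose around the JSON, or a reply cut off mid-object, breaks a strict `json.loads`.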


Local LLM details

sarahwooders commented 2 months ago

I believe this is a model issue. We are trying to provide better error messages for model failures.

cpacker commented 2 months ago

Could potentially be fixed by #1257