Open peperunas opened 9 months ago
It looks like the agent is searching recall memory, but /attach
will place data in archival. Can you try giving it a hint, e.g. "Search your archival memory..."?
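For context on that hint: /attach ingests file chunks into archival memory (a searchable document store), while recall memory only covers past conversation turns, so a recall search cannot find attached files. A toy sketch of the distinction follows; the class and method bodies are illustrative, only the two search-function names mirror the transcript, none of this is MemGPT's actual implementation:

```python
class ToyAgentMemory:
    """Illustrative two-store memory: archival (documents) vs. recall (chat)."""

    def __init__(self):
        self.archival = []  # populated by /attach with document chunks
        self.recall = []    # populated with prior conversation messages

    def attach(self, document: str):
        self.archival.append(document)

    def log_message(self, message: str):
        self.recall.append(message)

    def archival_memory_search(self, query: str):
        # Naive substring match standing in for a vector search
        return [d for d in self.archival if query.lower() in d.lower()]

    def conversation_search(self, query: str):
        return [m for m in self.recall if query.lower() in m.lower()]


memory = ToyAgentMemory()
memory.attach("route.ts: export async function GET() { ... }")
memory.log_message("Please review my route.ts file")

# Recall search only sees the chat turn; archival search sees the attached file.
print(memory.conversation_search("route.ts"))
print(memory.archival_memory_search("route.ts"))
```

The point is simply that the two searches index disjoint data, which is why nudging the agent toward archival_memory_search matters after /attach.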
Done. Unfortunately, it still does not print back anything:
💭 User is unable to see the updated code. Need to send a follow-up message with the revised route.ts file, once more.
🤖 Giulio, I'm really sorry for the confusion. Here is your updated route.ts file:
> Enter your message: Please use your archival memory to show me how you would edit it
💭 Searching archival memory for information about the user's code review request.
⚡🧠 [function] searching memory with archival_memory_search
query: route.ts file improvements, page: 0
Number of requested results 100 is greater than number of elements in index 3, updating n_results = 3
💭 User requested to see the edited route.ts file using archival memory search results.
🤖 Giulio, I apologize for the confusion earlier. Here's an example of how I would edit the route.ts file based on your request:
> Enter your message: I don't see the code...
💭 Searching recall memory for information about the user's code review request.
⚡🧠 [function] searching memory with conversation_search
query: route.ts file, page: 0
💭 User is still unable to see the updated code. Need to send a follow-up message with the revised route.ts file.
🤖 Giulio, I apologize for the inconvenience. Here is your updated route.ts file based on my archival memory search results:
> Enter your message:
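The "Number of requested results 100 is greater than number of elements in index 3" line in the transcript is the vector index clamping an oversized page request: only three passages are stored, so a query asking for 100 hits is shrunk to 3. A minimal sketch of that behaviour, with an illustrative helper name (not MemGPT's or Chroma's actual code):

```python
import warnings


def clamped_search(passages: list, n_results: int) -> list:
    """Return up to n_results passages, warning when the request is clamped."""
    if n_results > len(passages):
        warnings.warn(
            f"Number of requested results {n_results} is greater than number "
            f"of elements in index {len(passages)}, "
            f"updating n_results = {len(passages)}"
        )
        n_results = len(passages)
    return passages[:n_results]


passages = ["passage about route.ts", "passage two", "passage three"]
results = clamped_search(passages, n_results=100)
print(len(results))  # only 3 passages are indexed, so only 3 come back
```

The warning itself is benign: the search still returns the three stored passages, so it does not by itself explain the missing output.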
Describe the bug
The assistant does not complete the requests. In the example transcript you can see how I request some changes to a file provided via the OpenAI embeddings: it never prints back the revised file, though.

How did you install MemGPT?
git clone && poetry install && poetry shell

Describe your setup
macOS, ollama self-hosted, OpenAI embeddings (default, not 3-small / 3-large)
If you're not using OpenAI, please provide additional information on your local LLM setup:
Local LLM details
If you are trying to run MemGPT with local LLMs, please provide the following information:
The exact model you're trying to use (e.g. dolphin-2.1-mistral-7b.Q6_K.gguf): mixtral:8x7b
The local LLM backend you are using: ollama (running in docker)