seanpixel / Teenage-AGI


Mempuppy-AGI fork of Teenage-AGI #15

Open benjcooley opened 1 year ago

benjcooley commented 1 year ago

I spent some time on a fork of Teenage-AGI that works a little differently and seems to do quite nicely at retaining context, recalling facts and previous conversations, and remembering who you specifically are and what you've talked about. I also tried to make it respond a bit more like GPT just chatting.

https://github.com/benjcooley/MemPuppy-AGI

It's pretty fun to chat with MemPuppy/Teenage as it remembers you and will bring up information about you, your interests and projects, and prior bits from other conversations you've had. Feels like talking to a buddy.

Just thought you might be interested.

seanpixel commented 1 year ago

I checked it out. Thank you for this, I will play around with it and possibly build on top of it.

benjcooley commented 1 year ago

Thanks for creating Teenage-AGI. This is something I really wanted to experiment with and Teenage-AGI is a great starting point.

I've been experimenting with it for a while. A few observations:

  1. The injected context still makes the chat feel a little stilted. It's less noticeable than before, but you can still tell that something else is inserting itself into the conversation.
  2. GPT picks up facts and context better now, but I'm not sure a vector database alone is the best solution. In some cases it's very clear that certain subjects have been discussed, yet the queries and facts GPT comes up with just don't match well enough against the vectors that were submitted. At the very least, a better internal prompt might produce more matchable facts and memories (I sketched the kind of prompt I mean below, after this list).
  3. After a while, the sheer number of vector matches becomes a problem. Older matches come back from the search in effectively random order, and the internal prompt starts to get confused (see the re-ranking sketch below).
  4. Articles read from the web seem hard to match. A series of summaries or descriptions submitted as individual Pinecone records would likely work better (see the ingestion sketch below). I'm also wondering whether there's an embedding model that would be more topical.
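
To make item 2 concrete, here's roughly the kind of internal prompt I have in mind for producing more matchable memories. The wording is just a sketch on my part, not the prompt either fork currently uses:

```python
# Hypothetical rewording of the internal "memory" prompt, aimed at producing
# short, self-contained statements that embed (and therefore match) more
# cleanly. This is a sketch, not what Teenage-AGI or MemPuppy-AGI uses today.
MEMORY_PROMPT = (
    "From the conversation below, extract up to 5 facts worth remembering. "
    "Write each fact as one short standalone sentence that names the person "
    "or topic explicitly (no pronouns), so it can be found later by a "
    "similarity search.\n\nConversation:\n{conversation}"
)
```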
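For item 3, a minimal sketch of re-ranking Pinecone matches by a blend of similarity score and recency before they go into the internal prompt. The index name, the `timestamp` metadata field, and the 0.8/0.2 weighting are assumptions for illustration, not what the fork stores right now:

```python
import time
import pinecone

# Index name, the "timestamp" metadata field, and the weights below are
# hypothetical; they just illustrate the idea of recency-aware re-ranking.
pinecone.init(api_key="YOUR_API_KEY", environment="us-east1-gcp")
index = pinecone.Index("agi-memory")

def recency_weighted_matches(query_embedding, top_k=20, half_life_days=30):
    """Query Pinecone, then re-rank matches by similarity blended with recency."""
    res = index.query(vector=query_embedding, top_k=top_k, include_metadata=True)
    now = time.time()

    def blended(match):
        age_days = (now - match["metadata"].get("timestamp", now)) / 86400.0
        recency = 0.5 ** (age_days / half_life_days)  # exponential decay toward 0
        return 0.8 * match["score"] + 0.2 * recency

    # Strongest blended score first, so a pile of old matches in arbitrary
    # order doesn't drown out recent, relevant memories in the prompt.
    return sorted(res["matches"], key=blended, reverse=True)
```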
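And for item 4, a rough sketch of ingesting a web article as several summarized Pinecone records instead of one big blob, written against the pre-1.0 openai client the project was using. The chunk size, summarization prompt, and metadata fields are again just assumptions:

```python
import uuid
import openai

def ingest_article(index, url, text, chunk_chars=3000):
    """Split an article into chunks, summarize each chunk, and upsert every
    summary as its own Pinecone record so individual topics stay matchable."""
    chunks = [text[i:i + chunk_chars] for i in range(0, len(text), chunk_chars)]
    vectors = []
    for chunk in chunks:
        summary = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[{
                "role": "user",
                "content": "Summarize the key facts in this passage as short, "
                           "standalone statements:\n\n" + chunk,
            }],
        )["choices"][0]["message"]["content"]
        embedding = openai.Embedding.create(
            input=summary, model="text-embedding-ada-002"
        )["data"][0]["embedding"]
        # One record per chunk summary, tagged with where it came from.
        vectors.append((str(uuid.uuid4()), embedding, {"source": url, "text": summary}))
    index.upsert(vectors=vectors)
```

Even something this simple should keep one long article from collapsing into a single vector that never matches anything specific.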