Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License

[FEAT]: Improve Memory with Graphiti's temporal Knowledge Graph #2227

Open danielchalef opened 2 weeks ago

danielchalef commented 2 weeks ago

What would you like to see?

This is a really cool project! Memory management seems pretty primitive at this stage. I'm one of the authors of Graphiti, an open-source library for building and querying temporal Knowledge Graphs. We see personal assistants and agents as a core use case for Graphiti. If you're interested, we'd be happy to assist with integration.

luisgithub269 commented 1 day ago

Hi @danielchalef, thank you for offering to help integrate this architecture and the libraries that extend the capabilities of AI agents. Yesterday I tested the library, and it exceeded my expectations when I reproduced the exercises proposed in the repository. I do have a question: when will support for local models be added? Since the library uses the open OpenAI format, could it be integrated with a local model? I would like to evaluate performance, work with private data, and run it on hardware with limited capacity. I'll be watching for when this step becomes possible and how it can add functionality beyond closed models and open the door to innovation with other models.

danielchalef commented 1 day ago

@luisgithub269 You should be able to use Graphiti with any OpenAI-compatible inference endpoint. Just provide the correct base_url, key, and model name in an LLMConfig that you pass to the Graphiti OpenAIClient.
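
For illustration, a minimal sketch of that setup. The import paths and keyword names (`api_key`, `model`, `base_url`) follow recent `graphiti-core` releases and may differ slightly between versions; the local endpoint URL, model name, and Neo4j credentials below are placeholders, not recommendations.

```python
# Minimal sketch: point Graphiti at a local OpenAI-compatible endpoint
# (Ollama shown here; any server exposing the OpenAI chat API should work).
from graphiti_core import Graphiti
from graphiti_core.llm_client import LLMConfig, OpenAIClient

llm_config = LLMConfig(
    api_key="not-needed-locally",          # most local servers ignore the key
    model="llama3.1:8b",                   # model name as the endpoint exposes it
    base_url="http://localhost:11434/v1",  # OpenAI-compatible base URL
)

graphiti = Graphiti(
    "bolt://localhost:7687",               # Neo4j instance backing the knowledge graph
    "neo4j",
    "password",
    llm_client=OpenAIClient(config=llm_config),
)
```

Any server that speaks the OpenAI chat-completions API (Ollama, vLLM, LM Studio, etc.) can be swapped in via `base_url`, which addresses the local-model and private-data use case described above.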