rainchen / dify-tool-LongTermMemory

A Dify tool for storing and retrieving long-term memory. It uses Dify's built-in Knowledge dataset to store memories, and each user has a standalone long-term-memory space.
MIT License
38 stars 4 forks

Do I need to install docker? #6

Open jorgearone opened 1 month ago

jorgearone commented 1 month ago

Hello again. So far I have not been able to use it, which raises this question: last time you told me that I did not need to install Docker and that Dify Cloud was enough, however I cannot understand this direction.

rainchen commented 1 month ago

The idea is to let the Dify Code Node request Dify's Knowledge API (see https://docs.dify.ai/guides/knowledge-base/maintain-dataset-via-api). So if you're using Dify Cloud, you should use the cloud version API https://api.dify.ai/ as the base_url parameter value. If you're using the "Deploy with Docker Compose" solution, it depends:
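As a rough illustration of what the Code Node does, here is a minimal sketch of building and sending a Knowledge API request. The endpoint path, payload shape, and the `build_retrieve_request` / `retrieve_memories` helper names are assumptions for illustration; check the linked docs for the exact API contract.

```python
# Hedged sketch: calling Dify's Knowledge (dataset) API from a Code Node.
# Endpoint path and body fields are assumptions based on the linked docs.
import json
import urllib.request


def build_retrieve_request(base_url: str, dataset_id: str, api_key: str, query: str):
    """Build the URL, headers, and JSON body for a dataset retrieval call."""
    url = f"{base_url.rstrip('/')}/v1/datasets/{dataset_id}/retrieve"
    headers = {
        "Authorization": f"Bearer {api_key}",  # dataset API key, not an app key
        "Content-Type": "application/json",
    }
    body = {"query": query}
    return url, headers, body


def retrieve_memories(base_url: str, dataset_id: str, api_key: str, query: str):
    """Send the request and return the parsed JSON response."""
    url, headers, body = build_retrieve_request(base_url, dataset_id, api_key, query)
    req = urllib.request.Request(
        url, data=json.dumps(body).encode(), headers=headers, method="POST"
    )
    # Network call: requires a reachable Dify instance and a valid API key.
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

On Dify Cloud you would pass `https://api.dify.ai` as `base_url`; with a self-hosted Docker Compose deployment you would pass your own instance's API address instead.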

jorgearone commented 2 weeks ago

I finally decided to host it on a VPS, but I found it curious that it only works with the DeepSeek provider. When I used the same model through OpenRouter it gave me an error with the credentials; in the end I had to enable the provider directly. What do you think happened?

rainchen commented 1 week ago

Did you try creating a new chatbot using the OpenRouter provider and confirming it works fine on its own? The requirement for the LLM is that its performance must be equivalent to that of GPT-4 or DeepSeek; it should not be limited to the DeepSeek provider.