jorgearone opened 1 month ago
The idea is to let the Dify Code Node request Dify's Knowledge API (see https://docs.dify.ai/guides/knowledge-base/maintain-dataset-via-api).
So if you're using Dify Cloud, you should use the cloud API https://api.dify.ai/
as the base_url parameter value.
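For illustration, here is a minimal sketch of what such a Code Node could do: it builds a GET request to the documented `/v1/datasets` endpoint using the `base_url` value above. The helper name `build_list_datasets_request` and the `DATASET_API_KEY` placeholder are my own for this sketch, not part of Dify.

```python
import urllib.request

def build_list_datasets_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a GET /v1/datasets request against the given base_url.

    The endpoint path and Bearer-token auth follow the Knowledge API
    docs linked above; nothing here is Dify-internal.
    """
    url = f"{base_url.rstrip('/')}/v1/datasets?page=1&limit=20"
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {api_key}"})

# DATASET_API_KEY is a placeholder; use your real dataset API key.
req = build_list_datasets_request("https://api.dify.ai", "DATASET_API_KEY")
# urllib.request.urlopen(req) would actually perform the call;
# omitted here since it needs a valid key and network access.
```

The same helper works for a self-hosted deployment: just pass the appropriate host (e.g. `http://host.docker.internal`) as `base_url`.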
If you're using the "Deploy with Docker Compose" solution, it depends on your OS:
- http://host.docker.internal on macOS/Linux
- http://docker.for.win.localhost on Windows 10/11

I finally decided to host it on a VPS, but I found it curious that it only works with the DeepSeek provider. When I used the same model through OpenRouter, it gave me an error with the credentials. In the end I had to enable the provider directly. What do you think happened?
Did you try creating a new chatbot using the OpenRouter provider, and does that work fine? The requirement for the LLM is that its performance be equivalent to GPT-4 or DeepSeek; it should not be limited to the DeepSeek provider.
Hello again. So far I still haven't been able to use it, which raises a doubt: last time you told me I didn't need to install Docker and that Dify Cloud was enough, but I still can't understand this address.