Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License

[FEAT]: Question about connecting an API to AnythingLLM #2349

Open sonnt-dna opened 6 days ago

sonnt-dna commented 6 days ago

What would you like to see?

I would like to know whether AnythingLLM supports integration with external APIs, specifically the Dify API. I already have the API URL and API key from Dify, and I want this API to handle all chat interactions instead of the default LLM model in AnythingLLM. Specifically, I'm looking for the following:

API Integration: The ability to input the Dify API URL and API key into the system and establish a connection between Anything LLM and Dify's services.

Redirect Requests: I want all incoming queries from the current chat interface to be automatically sent to the Dify API, and the responses from Dify should seamlessly display in the chat interface.

Security: The integration should ensure the API key is securely stored and protected.

Configurable: The admin interface should provide an option to enable/disable the use of the Dify API and easily update the API URL and key when needed.

Logging and Monitoring: There should be a logging mechanism to track the requests sent to the Dify API and the responses received, for monitoring and troubleshooting (see the sketch at the end of this comment).

Does AnythingLLM support such a feature, or is there a way to integrate an external API like Dify for chat interactions?
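To make the Redirect Requests and Logging points concrete, here is a rough sketch (TypeScript, Node 18+) of the kind of relay I have in mind. The /v1/chat-messages path, request fields, and base URL are assumptions based on Dify's chat API and would need to be checked against its documentation.

```typescript
// Hypothetical relay: forward a chat message to Dify and log the round trip.
// DIFY_BASE_URL, DIFY_API_KEY, and the /v1/chat-messages request shape are
// assumptions to be verified against Dify's API documentation.
const DIFY_BASE_URL = process.env.DIFY_BASE_URL ?? "https://api.dify.ai";
const DIFY_API_KEY = process.env.DIFY_API_KEY ?? ""; // key kept in env, not in code

interface DifyChatResponse {
  answer?: string;
  conversation_id?: string;
}

async function relayToDify(query: string, user: string): Promise<string> {
  const body = {
    inputs: {},
    query,                     // the user's chat message from the interface
    response_mode: "blocking", // non-streaming for simplicity
    user,                      // identifier for the chat user
  };

  console.log("[dify] request:", JSON.stringify(body)); // logging/monitoring

  const res = await fetch(`${DIFY_BASE_URL}/v1/chat-messages`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${DIFY_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`Dify returned ${res.status}`);

  const data = (await res.json()) as DifyChatResponse;
  console.log("[dify] response:", JSON.stringify(data)); // logging/monitoring

  return data.answer ?? ""; // shown back in the chat interface
}
```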

timothycarambat commented 5 days ago

I am not intimately familiar with Dify, but if its endpoint uses the same request/response format as OpenAI, you can use the OpenAI Generic Connector in our "LLM Preference" selection, set the correct base URL and params, and you should be able to use Dify as your LLM.
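For reference, a quick way to check that compatibility is to send an OpenAI-style chat completion request to the candidate base URL; if it succeeds, the same base URL, API key, and model name can be entered into the generic connector. The base URL and model name below are placeholders, and whether Dify exposes an OpenAI-compatible endpoint needs to be confirmed against its docs.

```typescript
// Compatibility check: an OpenAI-compatible endpoint should accept this request
// and return { choices: [{ message: { content } }] }. The base URL and model
// name are placeholders, not confirmed Dify values.
const BASE_URL = process.env.LLM_BASE_URL ?? "https://your-dify-host/v1";
const API_KEY = process.env.LLM_API_KEY ?? "";

async function checkOpenAICompatibility(): Promise<void> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "your-model-name", // placeholder
      messages: [{ role: "user", content: "Hello" }],
    }),
  });

  const data = await res.json();
  const content = data?.choices?.[0]?.message?.content;
  if (content) {
    console.log("Looks OpenAI-compatible, reply:", content);
  } else {
    console.log("Response does not match the OpenAI schema:", data);
  }
}

checkOpenAICompatibility().catch(console.error);
```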

Everything except Logging and Monitoring will be covered. That is a separate piece of work we are looking to add, and it is on the roadmap.