danny-avila / LibreChat

Enhanced ChatGPT Clone: Features Anthropic, AWS, OpenAI, Assistants API, Azure, Groq, o1, GPT-4o, Mistral, OpenRouter, Vertex AI, Gemini, Artifacts, AI model switching, message search, langchain, DALL-E-3, ChatGPT Plugins, OpenAI Functions, Secure Multi-User System, Presets, completely open-source for self-hosting. Actively in public development.
https://librechat.ai/
MIT License

Enhancement: LibreChat Agents #3607

Open danny-avila opened 1 month ago

danny-avila commented 1 month ago

What features would you like to see added?

An open-source alternative to the Assistants API, serving as a successor to the "Plugins" endpoint, with support for Mistral, AWS Bedrock, Anthropic, OpenAI, Azure OpenAI Services, and more.

More details

Tweet announcing this: https://twitter.com/LibreChatAI/status/1821195627830599895

Which components are impacted by your request?

Frontend/backend

Code of Conduct

SamirSaidani commented 1 month ago

Regarding the modular system, it would be nice to be able to put all of an assistant's tool logic into a single folder, accompanied by a JSON manifest. This would facilitate the creation of a LibreChat agents store, where each agent/assistant, such as WeatherAssistant/, can be neatly organized and managed.
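
Something along these lines, purely as a sketch (the field names here are hypothetical, not an existing LibreChat schema):

```typescript
// Hypothetical manifest shape for a self-contained agent folder,
// e.g. agents/WeatherAssistant/manifest.json. Field names are illustrative only.
interface AgentManifest {
  name: string;                 // display name, e.g. "WeatherAssistant"
  description: string;          // short summary shown in an agent store listing
  entrypoint: string;           // relative path to the tool implementation, e.g. "./index.js"
  tools: {
    name: string;               // tool name exposed to the model
    description: string;        // what the tool does
    parameters: Record<string, unknown>; // JSON Schema for the tool's arguments
  }[];
}

// Example manifest content for agents/WeatherAssistant/
const weatherAssistantManifest: AgentManifest = {
  name: "WeatherAssistant",
  description: "Answers questions about current weather conditions.",
  entrypoint: "./index.js",
  tools: [
    {
      name: "get_current_weather",
      description: "Look up the current weather for a city.",
      parameters: {
        type: "object",
        properties: { city: { type: "string" } },
        required: ["city"],
      },
    },
  ],
};
```

Keeping the manifest next to the implementation would let a store list agents just by reading manifests, without loading any code.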

kele527 commented 1 month ago

What I'm asking about is not agents but prompts. Could prompt management take a cue from https://github.com/lobehub/lobe-chat? Lobe does this better and ships with many useful built-in prompts. Also, LibreChat's prompt seems to be sent as a regular chat message rather than as a system prompt; I hope system prompts will be supported.

PederHP commented 1 month ago

I would greatly prefer a way to use tools that is not tied to the assistant concept. Most LLMs today support tools as part of the interface, so I would prefer the baseline tool support to be in the preset. I really don't need more than a way to provide a JSON manifest for each tool in the preset and then to be able to implement the callback myself, either by supplying a URL to call or the actual server-side implementation.

Assistants are fine, but I think they're an additional abstraction on top and one of the things I love about LibreChat is that it doesn't force me to use abstractions beyond those in the baseline LLM API contracts.
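
For concreteness, the baseline contract I mean is the tool/function schema most chat APIs already accept today; a preset would only need to carry that schema plus a callback target. This is just a sketch, and the preset-level field names (definition, callbackUrl) are hypothetical:

```typescript
// Sketch: an OpenAI-style tool definition of the kind most chat-completion
// APIs accept today, paired with a callback target. The preset-level field
// names (definition, callbackUrl) are hypothetical, not an existing LibreChat schema.
const searchDocsTool = {
  type: "function",
  function: {
    name: "search_docs",
    description: "Search internal documentation and return matching passages.",
    parameters: {
      type: "object",
      properties: {
        query: { type: "string", description: "Full-text search query" },
      },
      required: ["query"],
    },
  },
};

// A preset could then pair each tool schema with where to resolve its calls:
const presetTools = [
  {
    definition: searchDocsTool,
    callbackUrl: "https://example.com/tools/search_docs", // or a server-side handler instead
  },
];
```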

danny-avila commented 1 month ago

I would greatly prefer a way to use tools that is not tied to the assistant concept. Most LLMs today support tools as part of the interface, so I would prefer the baseline tool support to be in the preset. I really don't need more than a way to provide a JSON manifest for each tool in the preset and then to be able to implement the callback myself, either by supplying a URL to call or the actual server-side implementation.

Assistants are fine, but I think they're an additional abstraction on top and one of the things I love about LibreChat is that it doesn't force me to use abstractions beyond those in the baseline LLM API contracts.

Thanks for your comment! This will be a successor to the plugins endpoint, which does not have that additional abstraction you mention. I want to allow “tools” to be toggled on and off per “run”, on the fly, as before.

Though it’s worth mentioning, this is in essence an agent without an association to the agent data structure, and Plugins also functions like this under the hood. As soon as the LLM has “agency” over tools, it crosses into that territory.
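
As a rough illustration of “toggled per run” (a hypothetical request shape, not the actual LibreChat API):

```typescript
// Hypothetical request bodies showing per-run tool selection: the same
// conversation can enable different tools on each run, with no persistent
// agent object involved. Field names are illustrative only.
const firstRun = {
  conversationId: "abc123",
  message: "What's the weather in Oslo?",
  tools: ["get_current_weather"], // toggled on for this run only
};

const secondRun = {
  conversationId: "abc123",
  message: "Now summarize our conversation so far.",
  tools: [], // all tools toggled off for this run
};
```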

berry-13 commented 1 month ago

What I'm asking about is not agents but prompts. Could prompt management take a cue from https://github.com/lobehub/lobe-chat? Lobe does this better and ships with many useful built-in prompts. Also, LibreChat's prompt seems to be sent as a regular chat message rather than as a system prompt; I hope system prompts will be supported.

LibreChat has let users set system prompts through presets since day one, right after the OpenAI API launched in March or April 2023. With our upcoming update, we want to make this feature even more visible and easier to use. Also, think of "Agent" as a more straightforward way to describe this idea, since it basically uses "instructions" that act like a system prompt.
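
Roughly speaking, “instructions” end up as the system message of the underlying chat payload rather than as an ordinary user message, e.g.:

```typescript
// How preset/agent "instructions" typically map onto the underlying chat API:
// they are sent as the system message, not appended as a user chat message.
const instructions = "You are a concise assistant that answers in bullet points.";

const messages = [
  { role: "system", content: instructions },
  { role: "user", content: "Explain what an agent is." },
];
```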

bfogels1 commented 1 month ago

Is there planned integration of agents with locally hosted LLMs via platforms such as Ollama?

danny-avila commented 1 month ago

Is there planned integration of agents with locally hosted LLMs via platforms such as Ollama?

Yes! All Ollama models that support tool calling.
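
For anyone who wants to try this against a local model right away, here is a rough sketch using Ollama's OpenAI-compatible endpoint; it assumes Ollama is running on its default port and that the chosen model (e.g. llama3.1) supports tool calling:

```typescript
// Sketch: calling a local Ollama model with a tool definition through its
// OpenAI-compatible endpoint. Assumes Ollama runs on the default port 11434
// and that the model (e.g. llama3.1) supports tool calling.
async function askWithTool() {
  const response = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.1",
      messages: [{ role: "user", content: "What's the weather in Oslo?" }],
      tools: [
        {
          type: "function",
          function: {
            name: "get_current_weather",
            description: "Look up the current weather for a city.",
            parameters: {
              type: "object",
              properties: { city: { type: "string" } },
              required: ["city"],
            },
          },
        },
      ],
    }),
  });
  const data = await response.json();
  // If the model decides to use the tool, the call appears here:
  console.log(data.choices?.[0]?.message?.tool_calls);
}

askWithTool();
```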