jgravelle / AutoGroq

AutoGroq is a groundbreaking tool that revolutionizes the way users interact with Autogen™ and other AI assistants. By dynamically generating tailored teams of AI agents based on your project requirements, AutoGroq eliminates the need for manual configuration and allows you to tackle any question, problem, or project with ease and efficiency.
https://autogroq.streamlit.app/

Feature Request - Choice to power from Local LLM server #11

Closed: TSM-EVO closed this issue 4 months ago

TSM-EVO commented 4 months ago

It would be very handy to have the ability to connect to a local LLM server like Ollama, to have a truly local solution capable of generating these agents.
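For context, Ollama serves a simple HTTP API on `localhost:11434` by default. A minimal sketch of what such a local backend call could look like (the model name and the idea of wiring this into AutoGroq are assumptions, not anything the project currently provides):

```python
import json
import urllib.request

# Ollama's default local generate endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON payload Ollama's /api/generate endpoint expects.

    `stream=False` asks for a single JSON response instead of a stream
    of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to a locally running Ollama server and return the reply text."""
    payload = json.dumps(build_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama serve` running locally with the model already pulled.
    print(generate("Draft a system prompt for a Python coding agent."))
```

As the maintainer notes below, the trade-off is latency: a local model on consumer hardware will be much slower than Groq's hosted inference.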

jgravelle commented 4 months ago

I'll consider it, but it would be 7 to 10 times slower.

Thanks... -jjg
