VRSEN / agency-swarm

The only reliable agent framework built on top of the latest OpenAI Assistants API.
https://vrsen.github.io/agency-swarm/
MIT License

Needs Guide for Self Hosted LLMs #148

Open haltingstate opened 1 week ago

haltingstate commented 1 week ago

Hi! I learned about Agency-Swarm from your videos. They are very good, thank you.

You made a video saying that there is open-source LLM support. Is there a guide to using locally hosted LLMs?

We have fine-tuned LLMs for specific tasks we need to use, but agency-swarm uses the OpenAI Assistants API, which makes it difficult to use with third-party LLMs, or even Mistral.

Could you consider wrapping or eliminating the usage of the OpenAI Assistants API in agency-swarm, or making it optional?

samuelmukoti commented 1 week ago

Hi @haltingstate,

You should watch this YouTube video: https://www.youtube.com/watch?v=Vd-Gtfm_zjw&t=885s, where VRSEN discusses how agency-swarm supports local LLMs. It uses an open-source library called “open-assistant-api,” which lets your Mistral LLM mimic OpenAI’s Assistants API.
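
Roughly, the setup looks like this (a sketch; the endpoint URL is an example, check the open-assistant-api docs for the actual address):

import openai
from agency_swarm import set_openai_client

# point the client at the local Assistants-compatible server
# instead of platform.openai.com
client = openai.OpenAI(
    base_url="http://localhost:8086/api/v1",  # example open-assistant-api endpoint
    api_key="placeholder",  # local servers typically ignore the key
)
set_openai_client(client)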

Hope that helps!

windowsagent commented 1 week ago

Hey! I was attempting to use agency-swarm with local LLMs (specifically Codestral), and I followed your guide. Unfortunately, I ran into an error I can't solve whenever agency-swarm tries to ping platform.openai.com (see the attached screenshot of the error).

I can't get past this, and I've already done the step of calling the set_openai_client() function and made sure I was only using the codestral model in all my agents. I would appreciate any help you could provide on this.
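
For reference, this is roughly how my agents are set up (a sketch; the model name here is illustrative):

from agency_swarm import Agent

# every agent is pinned to the local model; no gpt-* model names anywhere
coder = Agent(name="Coder", model="codestral-latest")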

phact commented 1 week ago

@windowsagent, today you can use agency-swarm with astra-assistants, which makes it easy to use third-party LLMs. Here's an example: https://github.com/datastax/astra-assistants-api/blob/main/examples/python/agency-swarm/basic.py

Codestral should work if you use the LiteLLM model name and credentials env var from here: https://litellm.vercel.app/docs/providers/mistral#supported-models
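
Something like this (a sketch; double-check the exact model name and env var in the LiteLLM docs above):

# export MISTRAL_API_KEY=...  (credentials env var per the LiteLLM docs)
from openai import OpenAI
from astra_assistants import patch
from agency_swarm import set_openai_client

client = patch(OpenAI())  # routes Assistants API calls through astra-assistants
set_openai_client(client)

# then give your agents a LiteLLM-style model name,
# e.g. model="mistral/codestral-latest"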

haltingstate commented 1 week ago

What about running with vLLM?

phact commented 4 days ago

@haltingstate Yes, LiteLLM supports vLLM, so astra-assistants does too, since astra-assistants uses the LiteLLM library internally. You'd have to pass a special header, though: LLM-PARAM-base-url, with the URL of your vLLM server.

You can do this with default_headers when you create your OpenAI client, before you patch it:

from openai import OpenAI

URL = "http://localhost:8000/v1"  # your vLLM server's address (example value)

client = OpenAI(
    default_headers={
        "LLM-PARAM-base-url": URL,
    },
)
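
Then patch it as usual (per the linked example) and astra-assistants should pick the header up:

from astra_assistants import patch

client = patch(client)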

I made an issue to make this simpler (with an env var): https://github.com/datastax/astra-assistants-api/issues/49

VRSEN commented 3 days ago

Thanks for adding this example, @phact. I'll test it out and add it to our docs shortly: https://vrsen.github.io/agency-swarm/advanced-usage/open-source-models/