Portkey-AI / gateway

A Blazing Fast AI Gateway with integrated Guardrails. Route to 200+ LLMs, 50+ AI Guardrails with 1 fast & friendly API.
https://portkey.ai/features/ai-gateway
MIT License

How to integrate locally deployed models, such as the Qwen model deployed with vLLM, with an AI gateway? #621

Open wgimperial opened 1 month ago

wgimperial commented 1 month ago

What Happened?

I would like to use AI Gateway to manage a series of locally deployed models, but I'm not sure how to proceed.

What Should Have Happened?

No response

Relevant Code Snippet

No response

Your Twitter/LinkedIn

No response

narengogi commented 4 weeks ago

If your local models expose an API endpoint that is compatible with any of the already integrated providers, you can use them as described here: https://docs.portkey.ai/docs/integrations/llms/byollm. Otherwise, if you have a custom endpoint, you can write your own integration within the gateway (we would love to have your contribution).
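For context, a minimal sketch of what that looks like for a Qwen model served by vLLM behind a self-hosted gateway. The ports, the model name, and whether the custom host should include the `/v1` suffix are assumptions for illustration; the BYO LLM doc above is the authoritative reference:

```python
# Sketch (untested): call a locally deployed, OpenAI-compatible model (e.g. Qwen served
# by vLLM) through a self-hosted Portkey gateway instead of calling vLLM directly.
# Assumptions: the gateway listens on localhost:8787 and vLLM on localhost:8000.
from openai import OpenAI

client = OpenAI(
    api_key="not-needed-for-local-vllm",          # vLLM typically ignores the key
    base_url="http://localhost:8787/v1",          # point the SDK at the gateway, not at vLLM
    default_headers={
        "x-portkey-provider": "openai",           # treat the target as an OpenAI-compatible API
        # Where the gateway should forward requests; whether to include "/v1" here
        # depends on the provider config, so check the BYO LLM doc.
        "x-portkey-custom-host": "http://localhost:8000/v1",
    },
)

response = client.chat.completions.create(
    model="Qwen/Qwen2.5-7B-Instruct",             # hypothetical; use whatever model vLLM serves
    messages=[{"role": "user", "content": "Hello from behind the gateway!"}],
)
print(response.choices[0].message.content)
```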

AI3721 commented 1 week ago

So there is no way to route to a local model's API, then? I had to write my own code to integrate my model's API, following the provider architecture in the project.

vrushankportkey commented 5 days ago

Hi @wgimperial please let us know if the doc shared by @narengogi above is useful!

@AI3721 - you can absolutely route to your local model with our Bring Your Own LLM integration, as long as you're able to expose the model over an OpenAI-compatible API like /chat/completions.
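For anyone landing here with the same question, a quick way to confirm that precondition is to hit the model's /chat/completions route directly before putting the gateway in front of it. A minimal sketch, assuming a vLLM server on its default port 8000 and a hypothetical Qwen model name:

```python
# Sanity check (not Portkey-specific): confirm the locally deployed model really does
# expose an OpenAI-compatible /chat/completions route.
import requests

resp = requests.post(
    "http://localhost:8000/v1/chat/completions",   # assumed local vLLM address
    json={
        "model": "Qwen/Qwen2.5-7B-Instruct",        # hypothetical; use the model name vLLM is serving
        "messages": [{"role": "user", "content": "ping"}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```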

AI3721 commented 5 days ago

This means it cannot route to an API other than that one, though; for example, the ComfyUI API cannot be invoked through this integration.

vrushankportkey commented 5 days ago

Correct! If ComfyUI exposes endpoints like /images/generations (from here), though, you could absolutely route to it using Portkey.

I'll explore Portkey's integration with ComfyUI and see what's possible!

vrushankportkey commented 5 days ago

@AI3721 we also have the concept of a "Gateway for other APIs", where you can just use Portkey as a proxy to call any target provider you want. Would that be useful?
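To make the idea concrete, here is a rough, untested sketch of using the gateway as a plain pass-through proxy to a non-LLM API such as a local ComfyUI server. The forwarding behaviour, header names, and ports below are assumptions chosen to illustrate the shape of the call, not the exact mechanics; please check the docs for those:

```python
# Sketch: use the gateway purely as a proxy in front of an arbitrary HTTP API.
# Everything below (paths, headers, ports) is an assumption for illustration.
import requests

GATEWAY = "http://localhost:8787/v1"      # assumed self-hosted gateway address
TARGET = "http://localhost:8188"          # assumed local ComfyUI server

resp = requests.post(
    f"{GATEWAY}/prompt",                  # assumed: the path is forwarded as-is to the target
    headers={
        "x-portkey-provider": "openai",   # assumed: the proxy route still needs a provider hint
        "x-portkey-custom-host": TARGET,  # where the gateway should forward the request
    },
    json={"prompt": {}},                  # placeholder ComfyUI workflow payload
    timeout=60,
)
print(resp.status_code, resp.text[:200])
```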

narengogi commented 5 days ago

@AI3721 If your local model has an OpenAI-compliant endpoint, you can use it without writing an integration. Here's an example with Ollama: https://docs.portkey.ai/docs/integrations/llms/ollama
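A short sketch of that Ollama pattern using the portkey_ai Python SDK; the parameter names are taken on the assumption that they match the linked doc (double-check them there), and Ollama is assumed to be on its default port 11434:

```python
# Sketch: route chat completions to a local Ollama server via Portkey's Ollama integration.
from portkey_ai import Portkey

portkey = Portkey(
    api_key="PORTKEY_API_KEY",                # placeholder; may not be needed for some self-hosted setups
    # base_url="http://localhost:8787/v1",    # assumed override if you run the gateway yourself
    provider="ollama",                        # use the gateway's Ollama integration
    custom_host="http://localhost:11434",     # where the local Ollama server is listening
)

completion = portkey.chat.completions.create(
    model="llama3",                           # any model you have pulled locally with `ollama pull`
    messages=[{"role": "user", "content": "Say hello"}],
)
print(completion.choices[0].message.content)
```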

AI3721 commented 5 days ago

Okay, I see. I think I might have to write my own integration. Thank you very much for your answers!