BerriAI / litellm

Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: add litellm.proxy, like openai.proxy #540

Closed ishaan-jaff closed 8 months ago

ishaan-jaff commented 11 months ago

The Feature

Some repos set openai.proxy, so we can't entirely replace openai for their usage.

example

    def __init_openai(self, config):
        # route OpenAI traffic through the user-configured HTTP(S) proxy, if one is set
        if self.proxy != '':
            openai.proxy = self.proxy
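For reference, a rough sketch of how the requested attribute could look from the caller's side. `litellm.proxy` is the attribute being proposed in this issue, not an existing API, and the proxy host below is hypothetical:

    # hypothetical usage of the proposed module-level attribute, mirroring openai.proxy above
    import litellm

    litellm.proxy = "http://corp-proxy.example.com:8080"  # hypothetical corporate proxy

    response = litellm.completion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "hello"}],
    )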

Motivation, pitch

-

Twitter / LinkedIn details

No response

krrishdholakia commented 11 months ago

I'm confused @ishaan-jaff, how are we unable to help because of this? Are you suggesting a module-level variable called proxy?

Tbh I'm not clear on what openai.proxy does vs. api_base.

ishaan-jaff commented 11 months ago

We fail for that repo when it tries to set litellm.proxy, because we don't even have a variable like that.

ishaan-jaff commented 11 months ago

That's a good starting place ^

If we're a drop-in replacement for openai, and openai.proxy exists, we should probably offer the same functionality via a litellm.proxy variable.

krrishdholakia commented 11 months ago

That's fair. What does proxy do, though? Besides openai, do we send it to any other provider?

usathyan commented 11 months ago

Can you provide a network proxy for openai calls? Many companies have firewalls and use a proxy to reach hosted models!

ishaan-jaff commented 11 months ago

@usathyan we already do this, you can try it here: https://docs.litellm.ai/docs/proxy_server

usathyan commented 11 months ago

This is different. I am not asking for an OpenAI API translator for other hosted models. I am looking for a network proxy to tunnel through company firewalls and reach Azure OpenAI models.
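To make the distinction concrete: a minimal sketch, assuming the pre-1.0 openai-python SDK that litellm wrapped at the time. api_base controls which endpoint is called; proxy controls the network path used to reach it. The Azure resource, deployment name, and proxy host below are hypothetical:

    import openai

    openai.api_key = "..."  # Azure OpenAI key

    # api_base selects *which endpoint* the SDK talks to,
    # e.g. an Azure OpenAI resource instead of api.openai.com
    openai.api_type = "azure"
    openai.api_base = "https://my-resource.openai.azure.com"
    openai.api_version = "2023-07-01-preview"

    # openai.proxy selects *how the traffic gets there*: requests still target api_base,
    # but are tunnelled through the corporate HTTP(S) proxy
    openai.proxy = {
        "http": "http://corp-proxy.example.com:8080",
        "https": "http://corp-proxy.example.com:8080",
    }

    response = openai.ChatCompletion.create(
        engine="my-gpt4-deployment",
        messages=[{"role": "user", "content": "ping"}],
    )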


ishaan-jaff commented 11 months ago

@usathyan can you help me better understand what you're hoping for litellm to do? Would love to add what you're looking for to litellm.

Our proxy allows you to create a server for calling azure openai

    litellm --model azure/chat-gpt
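Once that server is running locally, a caller points the pre-1.0 openai SDK at it instead of api.openai.com. A minimal sketch, assuming the proxy's default local address at the time (http://0.0.0.0:8000):

    import openai

    openai.api_base = "http://0.0.0.0:8000"  # the local litellm proxy server
    openai.api_key = "anything"              # provider keys live with the proxy, not the client

    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "hello"}],
    )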
usathyan commented 11 months ago

Well, here is the situation: I am interested in using open-interpreter (https://github.com/KillianLucas/open-interpreter), and I found out that that project uses liteLLM. For me to use open-interpreter, I have to route all network traffic via an HTTP proxy server; for example, the OpenAI endpoint is only accessible via the network proxy. See here for reference: https://github.com/openai/openai-node/tree/v4#configuring-an-https-agent-eg-for-proxies

Thanks!
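(For anyone hitting the same wall before a litellm.proxy attribute exists: the requests library used by the pre-1.0 openai SDK honours the standard proxy environment variables, so a possible interim workaround is to set those before making calls. This is general HTTP-client behaviour, not a litellm feature, and the proxy host below is hypothetical.)

    import os

    # route outbound HTTP(S) traffic through the corporate proxy
    os.environ["HTTP_PROXY"] = "http://corp-proxy.example.com:8080"
    os.environ["HTTPS_PROXY"] = "http://corp-proxy.example.com:8080"

    import litellm

    response = litellm.completion(
        model="gpt-4",
        messages=[{"role": "user", "content": "hello"}],
    )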


ishaan-jaff commented 11 months ago

Got it @usathyan, I should have something ready for you in 24 hrs.

ishaan-jaff commented 11 months ago

Here's my understanding of what you're looking for:

Open interpreter -> Proxy -> OpenAI GPT-4

The proxy should make your OpenAI GPT-4 calls for you

Do you want us to help you create the proxy for you? Or do you simply want to set litellm.proxy and make the call? @usathyan

usathyan commented 11 months ago

Yes, please!


usathyan commented 11 months ago

Help me find all images to edit and tweak to make it work



usathyan commented 8 months ago

Thank you!