FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0

[FEATURE] Support proxy for Azure OpenAI Chat model #2348

Open Angappan95 opened 6 months ago

Angappan95 commented 6 months ago

Describe the feature you'd like

Currently, I don't see support for connecting to Azure OpenAI services that sit behind a proxy layer. This is a limitation because many companies prefer this setup for security reasons. To access an Azure OpenAI service via a proxy, users need to configure the proxy details (domain, signature, and CA certificate) on the httpx client object when creating an AzureChatOpenAI model instance. Adding support for accessing Azure OpenAI with a proxy configuration would be a valuable addition to Flowise.
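For illustration, here is a minimal sketch of that pattern using LangChain's Python AzureChatOpenAI directly (not Flowise itself). It assumes a recent httpx where Client accepts proxy= and verify=; the proxy URL, CA path, endpoint, and deployment name are placeholders:

```python
# Sketch only: wiring a corporate proxy and CA bundle into an httpx client
# and passing it to AzureChatOpenAI. All values below are placeholders.
import httpx
from langchain_openai import AzureChatOpenAI

# httpx client that routes traffic through the proxy and trusts the corporate CA.
http_client = httpx.Client(
    proxy="http://proxy.example.com:8080",
    verify="/path/to/corporate-ca.pem",
)

llm = AzureChatOpenAI(
    azure_endpoint="https://my-resource.openai.azure.com",
    azure_deployment="my-deployment",
    api_version="2024-02-01",
    api_key="...",
    http_client=http_client,  # proxy settings are applied via this client
)
```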

vincenzomanzoni commented 6 months ago

I'm also interested in this topic. It is very common to run these kinds of applications in an enterprise context with HTTP-compatible proxies.

sunguangran commented 2 months ago

+1

gotoys commented 1 month ago

Upvoting this from my side as well!

HenryHengZJ commented 1 week ago

It should be possible now via a global proxy: https://github.com/FlowiseAI/Flowise/pull/3423

https://docs.flowiseai.com/configuration/running-flowise-behind-company-proxy

Can anyone try and see if that works?
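For anyone trying it: the linked docs describe configuring a global proxy via environment variables before starting Flowise. A rough sketch of what that could look like, assuming global-agent style variables as suggested by the PR (treat the exact variable names as an assumption and defer to the docs page above):

```sh
# Illustrative only: proxy environment variables for Flowise's global proxy
# support. Check the linked docs for the exact variable names and syntax.
export GLOBAL_AGENT_HTTP_PROXY="http://proxy.example.com:8080"
export GLOBAL_AGENT_HTTPS_PROXY="http://proxy.example.com:8080"
export GLOBAL_AGENT_NO_PROXY="localhost,127.0.0.1"

npx flowise start
```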