FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0

Support global-agent for central proxy configuration #3423

Closed: akoo24 closed 1 month ago

akoo24 commented 1 month ago

With central configuration of global-agent via environment variables, the proxy agent is applied to all node-fetch requests without any code changes, so it is no longer necessary to implement the proxy URL in each ChatModel. Currently, the ProxyUrl is implemented in ChatOpenAI_ChatModels but, as one example, would still have to be added to AzureChatOpenAI_ChatModels. The existing proxy mechanism would continue to work where needed, because global-agent stays inactive when its environment variables are not set.
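For reference, a minimal sketch of what this boils down to: global-agent only needs to be bootstrapped once at process startup. The bootstrap() call and the GLOBAL_AGENT_* variable names are documented by the global-agent package; calling it from the server entry point is just an assumption for illustration, not necessarily where the PR places it.

```typescript
// Minimal sketch: enable global-agent once at process startup
// (assumed to be the server entry point for this example).
import { bootstrap } from 'global-agent'

// Registers global HTTP/HTTPS agents for the whole process. When
// GLOBAL_AGENT_HTTP_PROXY is unset, global-agent does not proxy anything,
// so the existing per-node proxy options keep working unchanged.
bootstrap()

// From here on, outgoing requests made through Node's http/https modules
// (including node-fetch) are routed via the proxy configured in
// GLOBAL_AGENT_HTTP_PROXY / GLOBAL_AGENT_HTTPS_PROXY, with exclusions
// listed in GLOBAL_AGENT_NO_PROXY.
```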

HenryHengZJ commented 1 month ago

thanks @akoo24! How do I test this though?

akoo24 commented 1 month ago

Hi Henry,

In our case, we have an on-premises Kubernetes installation behind a corporate proxy.

This means that for testing purposes, outgoing connections to OpenAI / Azure OpenAI should only be possible via a proxy. In order to redirect outgoing requests centrally via a proxy, the following environment variables must be set at startup (see https://www.npmjs.com/package/global-agent):
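For example (variable names per the global-agent documentation; the proxy address is only a placeholder for the actual corporate proxy):

GLOBAL_AGENT_HTTP_PROXY=http://corporate-proxy:3128
GLOBAL_AGENT_HTTPS_PROXY=http://corporate-proxy:3128 (optional; distinct proxy for HTTPS traffic, otherwise GLOBAL_AGENT_HTTP_PROXY is used)
GLOBAL_AGENT_NO_PROXY=localhost,127.0.0.1 (optional comma-separated exclusion list)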

If the environment variables are not set, no centralized routing via proxy takes place.

Best regards, Ali