akoo24 closed this 1 month ago
Thanks @akoo24! How do I test this though?
Hi Henry,
in our case we have an on-premises Kubernetes installation behind a corporate proxy.
This means that for testing purposes, outgoing connections to OpenAI / Azure OpenAI should only be possible via a proxy. To redirect outgoing requests centrally through a proxy, the following environment variables must be set at startup (see https://www.npmjs.com/package/global-agent):
GLOBAL_AGENT_HTTP_PROXY: CorporateProxyUrl
GLOBAL_AGENT_HTTPS_PROXY: CorporateProxyUrl
GLOBAL_AGENT_NO_PROXY: Exception hosts to bypass proxy if needed
If these environment variables are not set, no centralized routing via the proxy takes place.
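For reference, a minimal sketch of setting these variables at startup; the proxy URL and bypass hosts below are placeholders, not real values from our environment:

```shell
# Placeholders: replace with your corporate proxy URL and bypass hosts.
export GLOBAL_AGENT_HTTP_PROXY='http://proxy.corp.example:3128'
export GLOBAL_AGENT_HTTPS_PROXY='http://proxy.corp.example:3128'
# Hosts that should bypass the proxy (comma-separated).
export GLOBAL_AGENT_NO_PROXY='*.internal.example,localhost'
```

The process then has to load global-agent's bootstrap before any request is made, e.g. `node -r 'global-agent/bootstrap' server.js`, as described on the linked npm page.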
Best regards, Ali
With the global agent configured centrally via environment variables, the proxy agent is applied to all node-fetch requests without any code changes, so it is no longer necessary to implement the proxy URL in each ChatModel. Currently, the proxy URL is implemented in ChatOpenAI_ChatModels and is, as an example, still required in AzureChatOpenAI_ChatModels. The existing proxy mechanism would continue to work if required, because the global agent is deactivated when the environment variables are missing.
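The fallback rule described above can be sketched as a tiny check (a hypothetical helper for illustration, not code from this PR): centralized routing is only active when the global-agent proxy variables are present, otherwise the existing per-ChatModel proxy mechanism stays in charge.

```javascript
// Hypothetical helper illustrating the fallback rule: the global agent
// only routes requests through the proxy when its GLOBAL_AGENT_* proxy
// variables are set; without them, requests go out directly.
function globalAgentActive(env) {
  return Boolean(env.GLOBAL_AGENT_HTTP_PROXY || env.GLOBAL_AGENT_HTTPS_PROXY);
}

// With a proxy configured, requests are routed centrally:
console.log(globalAgentActive({ GLOBAL_AGENT_HTTPS_PROXY: 'http://proxy.corp.example:3128' })); // true
// Without the variables, no centralized routing takes place:
console.log(globalAgentActive({})); // false
```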