Bug Description
When using LangChain model nodes such as OpenAI and Gemini, the HTTP calls to the online models do not use the HTTP proxy configured for n8n; they go straight to the internet.
I discovered this because my n8n instance is not allowed to access the internet directly. Allowing direct access resolves the issue.
To Reproduce
1. Set up n8n to use an HTTP proxy via the environment variables HTTP_PROXY and HTTPS_PROXY.
2. Turn off internet access except through the proxy.
3. Set up and run a simple AI flow using a Chat Trigger, a Conversational Agent, and an OpenAI model.
Expected behavior
Preferably, the AI model nodes should use the proxy defined for n8n. If not, there should be a way to configure a proxy separately for these nodes.
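For reference, a minimal sketch of the proxy setup from step 1, assuming a standard Docker deployment (the proxy address and NO_PROXY list are placeholders for my internal network, not real values):

```shell
# Hypothetical proxy address; substitute your own proxy host and port.
docker run -d --name n8n \
  -e HTTP_PROXY="http://proxy.internal:3128" \
  -e HTTPS_PROXY="http://proxy.internal:3128" \
  -e NO_PROXY="localhost,127.0.0.1" \
  -p 5678:5678 \
  docker.n8n.io/n8nio/n8n
```

With this configuration, regular n8n nodes (e.g. the HTTP Request node) honor the proxy, but the LangChain model nodes do not.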
Operating System
Docker
n8n Version
1.59.3
Node.js Version
Provided by docker image
Database
SQLite (default)
Execution mode
main (default)