langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com

Azure ChatOpenAI won't use proxy provided either via env var or value #19994

Closed · FaresKi closed this issue 2 days ago

FaresKi commented 5 months ago

Checked other resources

Example Code

from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    azure_endpoint=openai_api_base,  # type: ignore
    openai_api_version=openai_api_version,  # type: ignore
    azure_deployment=deployment_name,
    openai_api_key=openai_api_key,  # type: ignore
    openai_api_type=openai_api_type,  # type: ignore
    temperature=openai_llm_temp,
    callbacks=[handler],
    model_name="gpt-4-32k",
    openai_proxy=openai_proxy,
)

Error Message and Stack Trace (if applicable)

No exception or full stack trace; the requests simply time out because they don't go through our corporate proxy.

Description

I'm trying to tell AzureChatOpenAI to use our corporate proxy, but under langchain-openai the setting doesn't seem to be taken into account. As a workaround I've had to go back to the AzureChatOpenAI implementation in langchain and downgrade the OpenAI package, to langchain 0.1.14 and openai 0.28.1 respectively.
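For reference, the env-var route mentioned in the title looked roughly like this. This is a minimal sketch: the proxy URL is a placeholder, OPENAI_PROXY is (as far as I can tell) the variable langchain-openai maps to openai_proxy, and HTTPS_PROXY is the generic variable httpx itself honors. Neither made any difference for the Azure client:

import os

# Placeholder corporate proxy URL; set before the model is constructed.
os.environ["OPENAI_PROXY"] = "http://corporate-proxy.internal:8080"
os.environ["HTTPS_PROXY"] = "http://corporate-proxy.internal:8080"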

System Info

System Information
------------------
> OS:  Darwin
> OS Version:  Darwin Kernel Version 23.3.0: Wed Dec 20 21:31:00 PST 2023; root:xnu-10002.81.5~7/RELEASE_ARM64_T6020
> Python Version:  3.11.7 (main, Dec  4 2023, 18:10:11) [Clang 15.0.0 (clang-1500.1.0.2.5)]

Package Information
-------------------
> langchain_core: 0.1.39
> langchain: 0.1.14
> langchain_community: 0.0.31
> langsmith: 0.1.39
> langchain_openai: 0.0.8
> langchain_text_splitters: 0.0.1

Packages not installed (Not Necessarily a Problem)
--------------------------------------------------
The following packages were not found:

> langgraph
> langserve

Berber31 commented 3 months ago

Same issue.

Workaround using langchain-community==0.2.0 and langchain-openai==0.1.7 (thanks to this issue): build the httpx clients yourself and pass them to AzureChatOpenAI explicitly:

import httpx
from langchain_openai import AzureChatOpenAI

# Build the httpx clients explicitly so the corporate proxy is applied to both
# the sync and async OpenAI clients; `proxy` holds the proxy URL (or None).
_http_client = httpx.Client(proxy=proxy if proxy else None)
_http_async_client = httpx.AsyncClient(proxy=proxy if proxy else None)

azure_model = AzureChatOpenAI(
    openai_api_version="<api_v>",
    azure_endpoint=azure_configs["base_url"],
    azure_deployment=azure_configs["model_deployment"],
    model=azure_configs["model_name"],
    validate_base_url=False,
    http_client=_http_client,
    http_async_client=_http_async_client,
)
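
A quick sanity check on top of the snippet above, showing that both the sync and async paths go through the explicitly configured clients (the prompt is just an example):

import asyncio

# The sync call is routed through _http_client, the async call through _http_async_client.
print(azure_model.invoke("ping").content)
print(asyncio.run(azure_model.ainvoke("ping")).content)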