geekan / MetaGPT

🌟 The Multi-Agent Framework: First AI Software Company, Towards Natural Language Programming
https://deepwisdom.ai/
MIT License

OPENAI_BASE_URL and OPENAI_PROXY cannot both take effect at the same time? #692

Open butterl opened 8 months ago

butterl commented 8 months ago

Use case:

  1. The machine running the MetaGPT service needs a proxy to reach the Internet.
  2. A self-hosted or third-party API service is used as a relay.

Expected behavior: the network proxy for the MetaGPT service can be configured while a custom address for the API server is also set.

Actual result: with both set, network access fails.

voidking commented 7 months ago
  1. If OPENAI_BASE_URL is a relay address, the machine running MetaGPT can reach OPENAI_BASE_URL directly, so OPENAI_PROXY is not needed.
  2. If OPENAI_BASE_URL is the original OpenAI address, which cannot be reached from mainland China, only then is OPENAI_PROXY needed, and OPENAI_PROXY must be able to reach the original OpenAI address.
  3. Setting both a relay OPENAI_BASE_URL and OPENAI_PROXY means the relay OPENAI_BASE_URL is accessed through OPENAI_PROXY. That may or may not work, depending on the network conditions of OPENAI_PROXY.
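The three cases above can be sketched as a small helper that decides which proxies mapping an HTTP client would use. This is a hypothetical illustration, not MetaGPT's actual implementation:

```python
# Hypothetical helper sketching the three cases above; this is not
# MetaGPT's actual implementation.
def effective_proxies(proxy=None):
    """Return the requests-style proxies mapping implied by OPENAI_PROXY."""
    if not proxy:
        # Case 1: OPENAI_BASE_URL (e.g. a relay) is directly reachable,
        # so no proxy mapping is needed.
        return {}
    # Cases 2 and 3: every request to OPENAI_BASE_URL is tunnelled through
    # OPENAI_PROXY; whether it succeeds depends on whether the proxy can
    # reach that base URL.
    return {"http": proxy, "https": proxy}

print(effective_proxies("http://127.0.0.1:8080"))
# -> {'http': 'http://127.0.0.1:8080', 'https': 'http://127.0.0.1:8080'}
```

The point of case 3 is that a proxy plus a relay base URL is not contradictory in itself; it only fails when the proxy cannot route to the relay.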
butterl commented 7 months ago

All other network services work fine through the proxy address set via OPENAI_PROXY.

Logically, OPENAI_PROXY should behave like reading the HTTPS_PROXY environment variable.
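For reference, that environment-variable lookup is already implemented in the Python standard library, and libraries such as requests build on it. A minimal demonstration (the proxy address mirrors the one used in this thread):

```python
import os
import urllib.request

# Setting HTTPS_PROXY is enough for the standard proxy lookup to pick it up;
# this is the behaviour OPENAI_PROXY is expected to mirror.
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:8080"

proxies = urllib.request.getproxies()
print(proxies.get("https"))  # http://127.0.0.1:8080
```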

voidking commented 7 months ago

Yes, the OPENAI_PROXY logic is similar to setting a proxy through environment variables. How did you configure OPENAI_PROXY in key.yaml? Please paste your configuration.

butterl commented 7 months ago

    OPENAI_BASE_URL: "https://cfwus02.opapi.win/v1"  # API address reachable directly from the public Internet
    OPENAI_PROXY: "http://127.0.0.1:8080"            # local proxy from the intranet to the Internet
    OPENAI_API_KEY: "sk-KFCLFUv818xxxx"              # API key
    OPENAI_API_MODEL: "gpt-4-1106-preview"           # model

clipsheep6 commented 7 months ago

@voidking
Any solution to this? I have the same issue: working behind a proxy while using a third-party OpenAI API URL.

I need to use a system proxy to reach the Internet before I can access the third-party OpenAI API URL.

geekan commented 5 months ago

@clipsheep6 @butterl Does this problem still exist?

butterl commented 5 months ago

@geekan It still fails with the new configuration format, while the test script below returns normally in the same environment.

  1. MetaGPT setting in ~/.metagpt/config2.yaml

    llm:
      api_type: "openai"  # or azure / ollama / open_llm etc. Check LLMType for more options
      model: "gpt-3.5-turbo"  # or gpt-3.5-turbo-1106 / gpt-4-1106-preview / gpt-4-turbo-preview
      base_url: "https://aigptx.top/v1"  # or forward url / other llm url
      api_key: "sk-xxxxxx"
      proxy: "http://127.0.0.1:8080"
  2. The test code below works fine with the proxy set in $ENV:

    import requests
    import urllib3

    urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

    api_key = "sk-xxx"
    headers = {
        "Authorization": "Bearer " + api_key,
    }
    question = input("Enter your question\n")
    params = {
        "messages": [
            {
                "role": "user",
                "content": question,
            }
        ],
        "model": "gpt-3.5-turbo",
    }
    response = requests.post(
        "https://aigptx.top/v1/chat/completions",
        headers=headers,
        json=params,
        stream=False,
    )
    res = response.json()
    res_content = res["choices"][0]["message"]["content"]
    print(res_content)
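One difference between the two setups is that the test script relies on the proxy in $ENV, while MetaGPT passes `proxy:` explicitly. A standard-library debugging sketch (not MetaGPT code; the address mirrors the `proxy:` value in the config above) can isolate the explicit-proxy path:

```python
import urllib.request

# Debugging sketch (not MetaGPT code): route requests through the same
# local proxy configured under `proxy:` in config2.yaml, independent of
# any HTTPS_PROXY environment variable.
proxy = "http://127.0.0.1:8080"  # mirrors `proxy:` in config2.yaml
handler = urllib.request.ProxyHandler({"http": proxy, "https": proxy})
opener = urllib.request.build_opener(handler)
# opener.open("https://aigptx.top/v1/chat/completions") would now be
# tunnelled through 127.0.0.1:8080, the same path MetaGPT should take.
print(handler.proxies["https"])
```

If a request through this opener also fails while the $ENV-based script succeeds, the problem is in how the explicit proxy value is applied rather than in the proxy itself.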