binary-husky / gpt_academic

A practical interaction interface for large language models such as GPT/GLM, specially optimized for the paper reading/polishing/writing experience. Modular design with support for custom shortcut buttons & function plugins; project analysis & self-translation for Python, C++ and other codebases; PDF/LaTeX paper translation & summarization; parallel queries to multiple LLMs; and local models such as chatglm3. Integrates Tongyi Qianwen, deepseekcoder, iFlytek Spark, ERNIE Bot, llama2, rwkv, claude2, moss, etc.
https://github.com/binary-husky/gpt_academic/wiki/online
GNU General Public License v3.0

[Feature]: Access OpenAI ChatGPT via a reverse proxy #900

Closed hongyi-zhao closed 1 year ago

hongyi-zhao commented 1 year ago

Class | Type

Large language model

Feature Request

The following two features would be very useful:

  1. Support for a Cloudflare-bypass proxy, for example: https://github.com/acheong08/ChatGPT-Proxy-V4
  2. A reverse proxy built on top of 1, accessed with an access token, thereby working around the OpenAI API expiration problem.
binary-husky commented 1 year ago

Thanks, but given that the API is already quite cheap now, I don't want to introduce more reverse-engineered components; their maintenance burden is too heavy... Discussion is welcome.


Addendum, July 5:

By adding the following configuration to config.py you can use the reverse-proxy projects provided by @acheong08:

https://github.com/acheong08/ChatGPTProxy https://github.com/acheong08/ChatGPT-to-API/

API_URL_REDIRECT = {"https://api.openai.com/v1/chat/completions":"https://reverse-proxy-url/v1/chat/completions"}
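
For illustration only, a minimal Python sketch (not gpt_academic's actual code) of how such a redirect map can be consulted before a request is sent; the reverse-proxy URL is a placeholder:

# Sketch: look up a configured redirect for the official endpoint.
API_URL_REDIRECT = {
    "https://api.openai.com/v1/chat/completions":
        "https://reverse-proxy-url/v1/chat/completions",  # placeholder reverse proxy
}

def resolve_endpoint(url: str) -> str:
    """Return the redirected endpoint if one is configured, otherwise the original URL."""
    return API_URL_REDIRECT.get(url, url)

print(resolve_endpoint("https://api.openai.com/v1/chat/completions"))
# -> https://reverse-proxy-url/v1/chat/completions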

hongyi-zhao commented 1 year ago

V1.py is a concrete implementation based on ChatGPT-Proxy-V4; perhaps it could be ported and integrated into gpt_academic as a plugin.

acheong08 commented 1 year ago

There is no need to use ChatGPT-Proxy-V4 or add a dependency on revChatGPT.

To use ChatGPT for free, just swap the endpoint here: https://github.com/binary-husky/gpt_academic/blob/59877dd728bc71b30099acd74c706a43e3cd770c/request_llm/bridge_all.py#L55

to one hosted via https://github.com/acheong08/ChatGPT-to-API/, which serves as a clone of the official API but is backed by chat.openai.com.

A free endpoint to try out: https://free.churchless.tech/v1/chat/completions

===================================================

Edit by binary-husky: for simplicity, adding this line to config.py or config_private.py will work:

API_URL_REDIRECT = {"https://api.openai.com/v1/chat/completions":"https://A-free-endpoint-to-try-out/v1/chat/completions"}
hongyi-zhao commented 1 year ago

Nice, it does the trick.

BTW, I still have a couple of questions:

  1. Does this method rely on the access token?
  2. Is the access token obtained from the API key?
acheong08 commented 1 year ago

Does this method rely on the access token?

yes

Is the access token obtained from the API key?

No. Cycled through

hongyi-zhao commented 1 year ago

No. Cycled through

Could you please describe the detailed steps?

acheong08 commented 1 year ago

I upload a list of access tokens and it loops through them for each request. No API is used/accepted

acheong08 commented 1 year ago

https://github.com/acheong08/ChatGPT-to-API/blob/master/internal/tokens/tokens.go
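
The token rotation itself lives in the Go file above; purely as an illustration, the per-request cycling idea can be sketched in Python like this (the token strings are placeholders):

import itertools
import threading

class TokenPool:
    """Hand out access tokens round-robin, one per incoming request (sketch only)."""
    def __init__(self, tokens):
        self._cycle = itertools.cycle(tokens)
        self._lock = threading.Lock()  # requests may arrive concurrently

    def next_token(self) -> str:
        with self._lock:
            return next(self._cycle)

pool = TokenPool(["access-token-1", "access-token-2", "access-token-3"])
print(pool.next_token())  # each call returns the next token in the cycle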

hongyi-zhao commented 1 year ago

I upload a list of access tokens and it loops through them for each request. No API is used/accepted

This means that you must periodically check the expiration status of these access tokens and dynamically update them.

acheong08 commented 1 year ago

I have a bash script running in a for loop to update every week. It's all automated.
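
The script itself is not shown in the thread; as a rough Python sketch of the same idea (get_access_token is a hypothetical stand-in for whatever login flow actually produces a token, and the account list and file name are made up):

import time

def get_access_token(email: str, password: str) -> str:
    # Hypothetical placeholder: replace with a real token-retrieval call.
    return f"dummy-token-for-{email}"

ACCOUNTS = [("user1@example.com", "password1"), ("user2@example.com", "password2")]
ONE_WEEK = 7 * 24 * 60 * 60

while True:
    tokens = [get_access_token(email, pw) for email, pw in ACCOUNTS]
    with open("access_tokens.txt", "w") as fh:  # illustrative file name
        fh.write("\n".join(tokens) + "\n")
    time.sleep(ONE_WEEK)  # refresh weekly, as described above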

hongyi-zhao commented 1 year ago

Some additional related questions:

  1. One access token needs one account, so you must have many accounts.
  2. If I use the free endpoint provided by you, i.e. https://free.churchless.tech/v1/chat/completions, then no api-key of mine is needed. But based on my tests, if I don't set the API_KEY variable when using the above free endpoint with https://github.com/binary-husky/gpt_academic, it doesn't work at all, as shown below:
werner@X10DAi:~$ gpt_academic 
 [PROXY] Network proxy status: not configured. Without a proxy it is very likely that OpenAI-family models cannot be reached. Suggestion: check whether the USE_PROXY option has been modified. 
 [API_KEY] This project now supports api-keys for OpenAI and API2D. Multiple api-keys can be provided at once, e.g. API_KEY="openai-key1,openai-key2,api2d-key3" 
 [API_KEY] You can either change the api-key(s) in config.py, or enter a temporary api-key(s) in the question input area; it takes effect after you press Enter. 
 [API_KEY] A valid API_KEY is a 51-character key starting with 'sk' (OpenAI) or a 41-character key starting with 'fk'. Please update the API key in the config file before running. 
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
[ENV_VAR] Trying to load API_URL_REDIRECT, default value: {} --> corrected value: {"https://api.openai.com/v1/chat/completions": "https://free.churchless.tech/v1/chat/completions"}
 [ENV_VAR] Successfully read environment variable API_URL_REDIRECT 
[ENV_VAR] Trying to load LLM_MODEL, default value: gpt-3.5-turbo --> corrected value: gpt-3.5-turbo-16k
 [ENV_VAR] Successfully read environment variable LLM_MODEL 
All query records will be saved automatically to the local file ./gpt_log/chat_secrets.log; please mind your own privacy!
Queried the proxy's geolocation; the returned result is {}
Proxy configuration: none, proxy location: China
If the browser does not open automatically, please copy and go to the following URL:
    (light theme): http://localhost:55087
    (dark theme): http://localhost:55087/?__theme=dark
Warming up some modules...
Loading tokenizer; if this is the first run, it may take a moment to download parameters
Auto-update: disabled
Tokenizer loaded
Loading tokenizer; if this is the first run, it may take a moment to download parameters
Tokenizer loaded
Running on local URL:  http://0.0.0.0:55087

To create a public link, set `share=True` in `launch()`.
[GFX1-]: glxtest: VA-API test failed: no supported VAAPI profile found.
ATTENTION: default value of option mesa_glthread overridden by environment.
ATTENTION: default value of option mesa_glthread overridden by environment.
ATTENTION: default value of option mesa_glthread overridden by environment.

[screenshot]

acheong08 commented 1 year ago

One access token needs one account, so you must have many accounts.

500 for public use and 500 for private use

acheong08 commented 1 year ago

If I use the free endpoint provided by you, i.e. https://free.churchless.tech/v1/chat/completions, then no api-key of mine is needed. But based on my tests, if I don't set the API_KEY variable when using the above free endpoint with https://github.com/binary-husky/gpt_academic, it doesn't work at all, as shown below:

This is a front-end thing. Set randomly.

hongyi-zhao commented 1 year ago

I still cannot understand what you mean by "set randomly"; more precisely, is it really necessary to set this API_KEY when working via your free endpoint?

On the other hand, I also tried to call your revChatGPT.V3 as follows:

This way works:

$ API_URL="https://free.churchless.tech/v1/chat/completions" python -m revChatGPT.V3 --api_key sk-xxx --truncate_limit $(( 16 * 1024 )) --model gpt-3.5-turbo-16k

This way fails:

$ API_URL="https://free.churchless.tech/v1/chat/completions" python -m revChatGPT.V3 --truncate_limit $(( 16 * 1024 )) --model gpt-3.5-turbo-16k

The error message is as follows:

V3.py: error: the following arguments are required: --api_key
hongyi-zhao commented 1 year ago

I have a bash script running in a for loop to update every week. It's all automated.

Does this script call https://github.com/acheong08/OpenAIAuth under the hood?

500 for public use and 500 for private use

  1. How do you create so many accounts automatically?
  2. What's the purpose of the 500 for private use?
acheong08 commented 1 year ago

On the other hand, I also tried to call your revChatGPT.V3 as follows:

V3 uses the official API. It is not relevant here

acheong08 commented 1 year ago

I still cannot understand what you mean by "set randomly"; more precisely, is it really necessary to set this API_KEY when working via your free endpoint?

It is necessary to set an API key to use this specific repository. You can set something like blahblahblah and it would still work with the free endpoint

acheong08 commented 1 year ago

How do you create so many accounts automatically?

Browser automation with SMS verification from smspool

What's the purpose of the 500 for private use?

Automation stuff. Closed source

hongyi-zhao commented 1 year ago

V3 uses the official API. It is not relevant here

If so, why do you still read the environment variable as follows?

os.environ.get("API_URL") or "https://api.openai.com/v1/chat/completions",

hongyi-zhao commented 1 year ago

I upload a list of access tokens and it loops through them for each request. No API is used/accepted

This also means that each of these access tokens is precisely bound to one instance that provides the service, am I right?

hongyi-zhao commented 1 year ago

Browser automation with SMS verification from smspool

Are there any free SMS-pool providers for this purpose? BTW, I noticed the following website, but I'm not sure whether it's truly free:

[screenshot]

acheong08 commented 1 year ago

If so, why do you still read the environment variable as follows:

V3 uses the official API. Some people require proxies

This also means that each of these access tokens is precisely bound to one instance that provides the service, am I right?

Yes

Are there any free SMS-pool providers for this purpose? BTW, I noticed the following website, but I'm not sure whether it's truly free:

None that works

hongyi-zhao commented 1 year ago

I still cannot understand what you mean by "set randomly"; more precisely, is it really necessary to set this API_KEY when working via your free endpoint?

It is necessary to set an API key to use this specific repository. You can set something like blahblahblah and it would still work with the free endpoint

If so, why do you still read the environment variable as follows:

V3 uses the official API.

Based on my further tests, both gpt_academic and your revChatGPT.V3 work with an arbitrary fake but correctly formatted api_key, generated as follows, when calling your free endpoint:

werner@X10DAi:~$ echo sk-$(tr -dc A-Za-z0-9 </dev/urandom | head -c 48)
sk-JF7HaOK6K01wTNxR6pjoH1VB2uT58xdrDMFn6xAdlioOGmET

Some people require proxies

So, I come to the following question: with a customized API_URL, can I tweak V3 to use Email/Password based authentication, just as the one used by V1?

hongyi-zhao commented 1 year ago

None that works

Then, what's your solution for such a tedious job?

acheong08 commented 1 year ago

None that works

Then, what's your solution for such a tedious job?

I pay $0.1 per account for sms verification

acheong08 commented 1 year ago

So, I come to the following question: with a customized API_URL, can I tweak V3 to use Email/Password based authentication, just as the one used by V1?

Yes. If you include an access token as the API key in a request to ChatGPT-to-API, it will use your access token instead of the built-in ones.

hongyi-zhao commented 1 year ago

I pay $0.1 per account for sms verification

What's your selected service provider?

hongyi-zhao commented 1 year ago

Another question: how can I tweak your revChatGPT.V1 to work without an api_key, or with an arbitrary fake but correctly formatted one, when calling your free endpoint?

binary-husky commented 1 year ago

2. https://free.churchless.tech

The program automatically detects the api key. Does it work if you simply use a random but same-format api-key with the free endpoint?

acheong08 commented 1 year ago

Another question: how can I tweak your revChatGPT.V1 to work without an api_key, or with an arbitrary fake but correctly formatted one, when calling your free endpoint?

V1 cannot call free endpoint

acheong08 commented 1 year ago

I pay $0.1 per account for sms verification

What's your selected service provider?

smspool

acheong08 commented 1 year ago

  1. https://free.churchless.tech

The program automatically detects the api key. Does it work if you simply use a random but same-format api-key with the free endpoint?

The currently hosted free endpoint gives you 1 request per second, using keys already hosted in the back end. You can set an API key like sk-somethinghere and it will simply be ignored; a built-in access token is used instead.
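
Concretely, this behaviour can be checked with a short script like the one below (a sketch using the requests library; the endpoint may be rate-limited or offline, and the key is deliberately fake. Per the comments above, a real access token supplied in the same header would be used instead of the built-in ones):

import requests

# Standard chat-completions request; the backend mirrors the official API format.
url = "https://free.churchless.tech/v1/chat/completions"
headers = {"Authorization": "Bearer sk-somethinghere"}  # fake key, ignored by the backend
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello!"}],
}
resp = requests.post(url, headers=headers, json=payload, timeout=60)
print(resp.status_code)
print(resp.json()["choices"][0]["message"]["content"])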

hongyi-zhao commented 1 year ago

I've already given the test result corresponding to the above explanation.

hongyi-zhao commented 1 year ago

My original intention with this issue is not to advocate or suggest using free endpoints provided by others, because such an influx of traffic can easily overwhelm those endpoints. Instead, the goal is to let people know they can build their own free endpoints with a tool like ChatGPT-to-API, which is worth considering for its superior safety, efficiency, and stability.

See https://github.com/acheong08/ChatGPT-to-API/issues/81 for the related discussion.
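
For reference, pointing gpt_academic at a self-hosted instance uses the same API_URL_REDIRECT mechanism shown earlier; a minimal config.py sketch, assuming your own ChatGPT-to-API instance listens on localhost:8080 (as in the docker-compose example further below):

API_URL_REDIRECT = {
    "https://api.openai.com/v1/chat/completions":
        "http://localhost:8080/v1/chat/completions",  # your self-hosted ChatGPT-to-API endpoint
}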

acheong08 commented 1 year ago

Replace the demo with your own endpoint. The endpoint provided is an instance of ChatGPT-to-API

binary-husky commented 1 year ago

Sorry, I misunderstood the intention of this issue.



binary-husky commented 1 year ago

Sorry, I did not read this issue carefully and misunderstood its intention.

I hope I'm not causing unnecessary trouble; I apologize for any offense.

As a remedy, I will replace the demo with a fake URL and try to write a document on how to use ChatGPT-to-API: https://github.com/acheong08/ChatGPT-to-API/



binary-husky commented 1 year ago

@acheong08 hello, I'm trying to deploy ChatGPT-to-API but I'm getting 500 or 404 errors. This is my docker-compose configuration; is it correct?

version: '3'

services:
  app:
    image: acheong08/chatgpt-to-api # always uses latest; re-pull this tag when updating
    container_name: chatgpttoapi
    restart: unless-stopped
    ports:
      - '8080:8080'
    environment:
      SERVER_HOST: 0.0.0.0
      SERVER_PORT: 8080
      ADMIN_PASSWORD: TotallySecurePassword
      PUID: user-DwkoYzRgkApoWn2Yxxxxxxxxx
      http_proxy: socks5h://localhost:11284
      Access_Token: eyJhbGciOiJSUzI1Nxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

[screenshot]

hongyi-zhao commented 1 year ago

  1. Do you really have a Plus account?
  2. Remove PUID and try again; see here for the related comment.
  3. If you really want to use PUID, it should be obtained from your browser's cookie cache, because PUID is the _puid cookie; the description here, as shown below, is wrong:

[screenshot]

binary-husky commented 1 year ago

  1. Do you really have a Plus account?
  2. Remove PUID and try again; see here for the related comment.
  3. If you really want to use PUID, it should be obtained from your browser's cookie cache, because PUID is the _puid cookie; the description here, as shown below, is wrong:

[screenshot]

  1. No, PUID = Plus_User_ID rather than Personal_User_ID?
  2. What is the alternative if I do not have a Plus account? (If I had Plus and a bound credit card, I could simply have used the APIs.)
acheong08 commented 1 year ago

No, PUID = Plus_User_ID rather than Personal_User_ID?

yes

What is the alternative if I do not have a Plus account? (If I had Plus and a bound credit card, I could simply have used the APIs.)

Plus not required. You just have to deal with rate limits. Alternative is proxies since it's IP based

hongyi-zhao commented 1 year ago

@binary-husky

  2. What is the alternative if I do not have a Plus account? (If I had Plus and a bound credit card, I could simply have used the APIs.)

I don't think so, because OpenAI doesn't provide API access even for Plus accounts. API access for a Plus account is invited and managed separately by OpenAI; it doesn't come bundled with the Plus subscription.

hongyi-zhao commented 1 year ago

@acheong08

Plus not required. You just have to deal with rate limits. Alternative is proxies since it's IP based

The key is to use a proxy pool managed by, say, HAProxy, which also answers the question I filed.

hongyi-zhao commented 1 year ago

@acheong08

So, it seems that the following comment in the template docker-compose.yml is not so accurate:

https://github.com/acheong08/ChatGPT-to-API/blob/091f2b4851aba597a5f47e1d0532ad3cf071b32d/docker-compose.yml#L16-L18

      # If the parameter API_REVERSE_PROXY is empty, the default request URL is https://chat.openai.com/backend-api/conversation, and the PUID is required.
      # You can get your PUID for Plus account from the following link: https://chat.openai.com/api/auth/session.
      PUID: xxx
binary-husky commented 1 year ago

Still have a problem: without PUID, should I use an access token?

Should I pass it as Access_Token, AccessToken, or accessToken?

The following configuration still returns a 500 error; the proxy connection to the US has been tested and works, and the Docker port mapping has been added as well:

version: '3'

services:
  app:
    image: acheong08/chatgpt-to-api # always uses latest; re-pull this tag when updating
    container_name: chatgpttoapi
    restart: unless-stopped
    ports:
      - '8080:8080'
    environment:
      SERVER_HOST: 0.0.0.0
      SERVER_PORT: 8080
      ADMIN_PASSWORD: TotallySecurePassword
      http_proxy:  socks5h://docker.for.win.localhost:11284
      https_proxy: socks5h://docker.for.win.localhost:11284
      Access_Token: eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCIsImtpxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
hongyi-zhao commented 1 year ago

@acheong08 Just for confirmation, is the following bypass endpoint built with ChatGPTProxy?

https://github.com/acheong08/ChatGPT-to-API/blob/091f2b4851aba597a5f47e1d0532ad3cf071b32d/docker-compose.yml#L15
API_REVERSE_PROXY: https://bypass.churchless.tech/api/conversation

binary-husky commented 1 year ago

@binary-husky As described here, the access token retrieval has been automated. So, I think you should instead use OPENAI_EMAIL and OPENAI_PASSWORD, as documented here.

Still no luck; it gives me some bad errors this time:

[screenshot]

acheong08 commented 1 year ago

Just for confirmation, is the following bypass endpoint built with ChatGPTProxy?

Oops, it should have been removed. It works standalone.

acheong08 commented 1 year ago

The Docker version is unmaintained. The binary is very lightweight.

acheong08 commented 1 year ago

Binaries built via GitHub Actions are available in the releases: https://github.com/acheong08/ChatGPT-to-API/releases/tag/1.5.2