Closed: hongyi-zhao closed this issue 1 year ago.
Thanks, but given that the API is already quite cheap now, I don't want to introduce more reverse-engineered components; the maintenance burden of reverse-engineered code is just too heavy... Discussion is welcome.
Addendum (July 5):
Adding the following setting to config.py lets you use the reverse-proxy projects provided by @acheong08:
https://github.com/acheong08/ChatGPTProxy https://github.com/acheong08/ChatGPT-to-API/
API_URL_REDIRECT = {"https://api.openai.com/v1/chat/completions":"https://reverse-proxy-url/v1/chat/completions"}
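Conceptually, the redirect map is just a dict lookup applied to the endpoint before each request. A minimal sketch (the helper name is hypothetical, not gpt_academic's actual code):

```python
# Hypothetical sketch of how an API_URL_REDIRECT mapping could be applied:
# before each request, the configured endpoint is looked up in the dict
# and replaced when a redirect entry exists.
API_URL_REDIRECT = {
    "https://api.openai.com/v1/chat/completions":
        "https://reverse-proxy-url/v1/chat/completions",
}

def resolve_endpoint(url: str) -> str:
    """Return the redirected endpoint if one is configured, otherwise the original URL."""
    return API_URL_REDIRECT.get(url, url)
```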
V1.py is a concrete implementation based on ChatGPT-Proxy-V4; perhaps it could be ported as a plugin and integrated into gpt_academic.
There is no need to use ChatGPT-Proxy-V4 or add a dependency on revChatGPT.
To use ChatGPT for free, just swap the endpoint here: https://github.com/binary-husky/gpt_academic/blob/59877dd728bc71b30099acd74c706a43e3cd770c/request_llm/bridge_all.py#L55 to one hosted via https://github.com/acheong08/ChatGPT-to-API/, which serves as a clone of the official API but goes through chat.openai.com.
A free endpoint to try out: https://free.churchless.tech/v1/chat/completions
===================================================
Edit by binary-husky: For simplicity, adding this line to config.py or config_private.py will work:
API_URL_REDIRECT = {"https://api.openai.com/v1/chat/completions":"https://A-free-endpoint-to-try-out/v1/chat/completions"}
Nice, it does the trick.
BTW, I still have some questions:
Does this method rely on access tokens?
yes
Is the access token obtained from the API key?
No. Cycled through
Could you please describe the detailed steps?
I upload a list of access tokens and it loops through them for each request. No API is used/accepted
This means that you must periodically check the expiration status of these access tokens and dynamically update them.
I have a bash script running in a for loop to update every week. It's all automated.
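The per-request cycling described above can be sketched as a simple round-robin over the token list (a minimal illustration only; ChatGPT-to-API itself is written in Go):

```python
import itertools

# Minimal illustration of round-robin access-token cycling: each incoming
# request takes the next token from the pool, wrapping around at the end.
class TokenPool:
    def __init__(self, tokens):
        self._cycle = itertools.cycle(tokens)

    def next_token(self) -> str:
        return next(self._cycle)
```

A separate, scheduled job (the weekly bash script mentioned above) would refresh the underlying token list as tokens expire.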
Some additional related questions:
If I don't set the API_KEY variable when using the above free endpoint with https://github.com/binary-husky/gpt_academic, it doesn't work at all, as shown below:
werner@X10DAi:~$ gpt_academic
[PROXY] 网络代理状态:未配置。无代理状态下很可能无法访问OpenAI家族的模型。建议:检查USE_PROXY选项是否修改。
[API_KEY] 本项目现已支持OpenAI和API2D的api-key。也支持同时填写多个api-key,如API_KEY="openai-key1,openai-key2,api2d-key3"
[API_KEY] 您既可以在config.py中修改api-key(s),也可以在问题输入区输入临时的api-key(s),然后回车键提交后即可生效。
[API_KEY] 正确的 API_KEY 是'sk'开头的51位密钥(OpenAI),或者 'fk'开头的41位密钥,请在config文件中修改API密钥之后再运行。
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
[ENV_VAR] 尝试加载API_URL_REDIRECT,默认值:{} --> 修正值:{"https://api.openai.com/v1/chat/completions": "https://free.churchless.tech/v1/chat/completions"}
[ENV_VAR] 成功读取环境变量API_URL_REDIRECT
[ENV_VAR] 尝试加载LLM_MODEL,默认值:gpt-3.5-turbo --> 修正值:gpt-3.5-turbo-16k
[ENV_VAR] 成功读取环境变量LLM_MODEL
所有问询记录将自动保存在本地目录./gpt_log/chat_secrets.log, 请注意自我隐私保护哦!
查询代理的地理位置,返回的结果是{}
代理配置 无, 代理所在地:China
如果浏览器没有自动打开,请复制并转到以下URL:
(亮色主题): http://localhost:55087
(暗色主题): http://localhost:55087/?__theme=dark
正在执行一些模块的预热...
正在加载tokenizer,如果是第一次运行,可能需要一点时间下载参数
自动更新程序:已禁用
加载tokenizer完毕
正在加载tokenizer,如果是第一次运行,可能需要一点时间下载参数
加载tokenizer完毕
Running on local URL: http://0.0.0.0:55087
To create a public link, set `share=True` in `launch()`.
[GFX1-]: glxtest: VA-API test failed: no supported VAAPI profile found.
ATTENTION: default value of option mesa_glthread overridden by environment.
ATTENTION: default value of option mesa_glthread overridden by environment.
ATTENTION: default value of option mesa_glthread overridden by environment.
One access token needs one account, so you must have many accounts.
500 for public use and 500 for private use
If I use the free endpoint provided by you, aka, https://free.churchless.tech/v1/chat/completions, then no api-key of mine is needed. But based on my tries, if I don't set the API_KEY variable when using the above free endpoint with https://github.com/binary-husky/gpt_academic, it doesn't work at all, as shown below:
This is a front-end thing. Set it randomly.
I still cannot understand what you mean by "set randomly"; more precisely, is it really necessary to set this API_KEY when working via your free endpoint?
On the other hand, I also tried calling your revChatGPT.V3 as follows:
This way works:
$ API_URL="https://free.churchless.tech/v1/chat/completions" python -m revChatGPT.V3 --api_key sk-xxx --truncate_limit $(( 16 * 1024 )) --model gpt-3.5-turbo-16k
This way fails:
$ API_URL="https://free.churchless.tech/v1/chat/completions" python -m revChatGPT.V3 --truncate_limit $(( 16 * 1024 )) --model gpt-3.5-turbo-16k
The error message is as follows:
V3.py: error: the following arguments are required: --api_key
I have a bash script running in a for loop to update every week. It's all automated.
Does this script call https://github.com/acheong08/OpenAIAuth under the hood?
500 for public use and 500 for private use
On the other hand, I also tried to call your revChatGPT.V3 as follows:
V3 uses the official API. It is not relevant here
I still cannot understand what you mean by "set randomly"; more precisely, is it really necessary to set this API_KEY when working via your free endpoint?
It is necessary to set an API key to use this specific repository. You can set something like "blahblahblah" and it would still work with the free endpoint.
How to create so many accounts automatically?
Browser automation with SMS verification from smspool
What's the purpose of the 500 for private use?
Automation stuff. Closed source
V3 uses the official API. It is not relevant here
If so, why do you still read the environment variable as follows?
os.environ.get("API_URL") or "https://api.openai.com/v1/chat/completions",
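The quoted line is the standard environment-override idiom; because it uses `or` rather than a plain default, an unset or empty API_URL both fall back to the official endpoint. A small sketch:

```python
import os

# Env-override idiom: use API_URL from the environment when set and
# non-empty; otherwise fall back to the official OpenAI endpoint.
def api_url() -> str:
    return os.environ.get("API_URL") or "https://api.openai.com/v1/chat/completions"
```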
I upload a list of access tokens and it loops through them for each request. No API is used/accepted
This also means that each of these access tokens is precisely bound to one instance that provides the service, am I right?
Browser automation with SMS verification from smspool
Are there any free SMS verification providers for this purpose? BTW, I noticed the following website, but I'm not sure if it's truly free:
If so, why do you still read the environment variable as follows:
V3 uses the official API. Some people require proxies
This also means that each of these access tokens is precisely bound to one instance that provides the service, am I right?
Yes
Are there any free SMS verification providers for this purpose? BTW, I noticed the following website, but I'm not sure if it's truly free:
None that works
I still cannot understand what do you mean by saying that set randomly, more precisely, is it really necessary to set up this API_KEY when working via your free endpoint?
It is necessary to set an API key to use this specific repository. You can set something like "blahblahblah" and it would still work with the free endpoint.
If so, why do you still read the environment variable as follows:
V3 uses the official API.
Based on my further tries, both gpt_academic and your revChatGPT.V3 work with an arbitrary fake but well-formed API key, such as the following, when calling your free endpoint:
werner@X10DAi:~$ echo sk-$(tr -dc A-Za-z0-9 </dev/urandom | head -c 48)
sk-JF7HaOK6K01wTNxR6pjoH1VB2uT58xdrDMFn6xAdlioOGmET
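The same well-formed but meaningless key can be generated in Python; "sk-" plus 48 alphanumeric characters matches the 51-character shape that gpt_academic's key check expects:

```python
import secrets
import string

def fake_openai_key() -> str:
    """Return a random, well-formed (but invalid) OpenAI-style key:
    'sk-' followed by 48 alphanumeric characters, 51 characters total."""
    alphabet = string.ascii_letters + string.digits
    return "sk-" + "".join(secrets.choice(alphabet) for _ in range(48))
```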
Some people require proxies
So I come to the following question: with a customized API_URL, can I tweak V3 to use email/password-based authentication, just like the one used by V1?
None that works
Then, what's your solution for such a tedious job?
I pay $0.1 per account for sms verification
So, I come to the following question: with a customized API_URL, can I tweak V3 to use Email/Password based authentication, just as the one used by V1?
Yes. If you include access token as API key in request to ChatGPT-to-API, it will use your access token instead of the built in ones
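Per the answer above, sending your own access token where the API key normally goes should make a ChatGPT-to-API instance use it instead of its built-in pool. A sketch using only the standard library (the endpoint URL and token are placeholders):

```python
import json
import urllib.request

def build_chat_request(endpoint: str, token: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request. Against a
    ChatGPT-to-API instance, `token` can be your own ChatGPT access
    token instead of an OpenAI API key."""
    payload = {
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

# Sending it would be: urllib.request.urlopen(build_chat_request(...))
```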
I pay $0.1 per account for sms verification
What's your selected service provider?
Another question: how can I tweak your revChatGPT.V1 to work without an API key, or with an arbitrary fake one, when calling your free endpoint?
The program automatically detects the API key. Does it work if you simply use a random but correctly formatted API key with the free endpoint?
Another question: how can I tweak your revChatGPT.V1 to work without an API key, or with an arbitrary fake one, when calling your free endpoint?
V1 cannot call free endpoint
I pay $0.1 per account for sms verification
What's your selected service provider?
smspool
The program automatically detects the API key. Does it work if you simply use a random but correctly formatted API key with the free endpoint?
The current hosted free endpoint gives you 1 request per second, using the keys already hosted in the back end. You can set an API key like sk-somethinghere and it would just be ignored; a built-in access token is used instead.
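Given the stated 1-request-per-second limit, a tiny client-side throttle helps avoid hammering the endpoint; a minimal sketch:

```python
import time

class RateLimiter:
    """Allow at most one call per `interval` seconds by sleeping off the remainder."""
    def __init__(self, interval: float = 1.0):
        self.interval = interval
        self._last = float("-inf")  # first call never waits

    def wait(self) -> None:
        remaining = self.interval - (time.monotonic() - self._last)
        if remaining > 0:
            time.sleep(remaining)
        self._last = time.monotonic()
```

Calling `wait()` before each request enforces the gap between consecutive calls.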
I've given the tested result corresponding to the above explanation.
My original intention of this issue is not to advocate or suggest the use of free endpoints provided by others. This is because such an influx of access can easily overwhelm the aforementioned endpoints. Instead, the goal is to inform people to build their own free endpoints using a tool like ChatGPT-to-API, which is worth considering due to its superior safety, efficiency, and stability.
See https://github.com/acheong08/ChatGPT-to-API/issues/81 for the related discussion.
Replace the demo with your own endpoint. The endpoint provided is an instance of ChatGPT-to-API
Sorry, I misunderstood the intention of this issue.
From: Antonio Cheong; Sent: July 6, 2023, 09:17; To: binary-husky/gpt_academic; Subject: Re: [binary-husky/gpt_academic] [Feature]: Access OpenAI ChatGPT via a free reverse proxy (Issue #900)
Sorry, I did not read this issue carefully and misunderstood its intention.
I hope I'm not causing unnecessary trouble; I apologize if there was any offense.
As a remedy, I will replace the demo with a fake URL and try to write a document on how to use ChatGPT-to-API: https://github.com/acheong08/ChatGPT-to-API/
@acheong08 Hello, I'm trying to deploy ChatGPT-to-API but I'm encountering 500 or 404 errors. This is my docker-compose setup; is it correct?
version: '3'
services:
  app:
    image: acheong08/chatgpt-to-api  # Always use latest; re-pull this tag's image when updating
    container_name: chatgpttoapi
    restart: unless-stopped
    ports:
      - '8080:8080'
    environment:
      SERVER_HOST: 0.0.0.0
      SERVER_PORT: 8080
      ADMIN_PASSWORD: TotallySecurePassword
      PUID: user-DwkoYzRgkApoWn2Yxxxxxxxxx
      http_proxy: socks5h://localhost:11284
      Access_Token: eyJhbGciOiJSUzI1Nxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
- Do you really have a Plus account?
- Remove PUID and try again; see here for the related comment.
- If you really want to use PUID, it should be obtained from your browser's cookie cache, because PUID is the cookie _puid, and the description here, as shown below, is wrong:
No, PUID=Plus_User_ID instead of Personal_User_ID?
yes
What is the alternative if I do not have Plus? (If I had Plus and could bind a credit card, I could have simply used the API.)
Plus not required. You just have to deal with rate limits. Alternative is proxies since it's IP based
@binary-husky
2. What is the alternative if I do not have Plus? (If I had Plus and could bind a credit card, I could have simply used the API.)
I don't think so, because OpenAI doesn't provide API access even for Plus accounts. Instead, API access is invited and managed by OpenAI separately and doesn't ship with the Plus subscription.
@acheong08
Plus not required. You just have to deal with rate limits. Alternative is proxies since it's IP based
The key is to use a proxy pool managed by, say, HAProxy, which also answers the question I filed.
@acheong08
So, it seems that the following comment in the template docker-compose.yml is not so accurate:
# If the parameter API_REVERSE_PROXY is empty, the default request URL is https://chat.openai.com/backend-api/conversation, and the PUID is required.
# You can get your PUID for Plus account from the following link: https://chat.openai.com/api/auth/session.
PUID: xxx
I still have a problem: without PUID, should I use an access token? Should I pass Access_Token, AccessToken, or accessToken?
The following configuration still returns a 500 error; the proxy connection to the US tests fine, and the Docker port mapping has been added as well:
version: '3'
services:
  app:
    image: acheong08/chatgpt-to-api  # Always use latest; re-pull this tag's image when updating
    container_name: chatgpttoapi
    restart: unless-stopped
    ports:
      - '8080:8080'
    environment:
      SERVER_HOST: 0.0.0.0
      SERVER_PORT: 8080
      ADMIN_PASSWORD: TotallySecurePassword
      http_proxy: socks5h://docker.for.win.localhost:11284
      https_proxy: socks5h://docker.for.win.localhost:11284
      Access_Token: eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCIsImtpxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
@acheong08 Just for confirmation, is the following bypass endpoint built with ChatGPTProxy?
https://github.com/acheong08/ChatGPT-to-API/blob/091f2b4851aba597a5f47e1d0532ad3cf071b32d/docker-compose.yml#L15
API_REVERSE_PROXY: https://bypass.churchless.tech/api/conversation
Just for confirmation, is the following bypass endpoint built with ChatGPTProxy?
oops. it should've been removed. works standalone
The Docker version is unmaintained; the binary is very lightweight. Binaries built via GitHub Actions are available in the releases: https://github.com/acheong08/ChatGPT-to-API/releases/tag/1.5.2
Class | Type: Large language model
Feature Request: The following two features are very nice: