lss233 / chatgpt-mirai-qq-bot

🚀 One-click deployment! A real AI chatbot! Supports ChatGPT, ERNIE Bot (文心一言), iFlytek Spark, Bing, Bard, ChatGLM, and POE, with multi-account support, persona tuning, virtual maid, image rendering, and voice messages | Works with QQ, Telegram, Discord, WeChat, and other platforms
GNU Affero General Public License v3.0

[BUG] OpenAI: <!DOCTYPE html> #929

Closed. pasdy1 closed this issue 1 year ago.

pasdy1 commented 1 year ago

Before submitting this issue, please confirm:

Symptoms
Describe how the bug manifests

Runtime environment:

Screenshot
2023-06-07 12:45:17.829 | DEBUG | platforms.onebot_bot:_:148 - 私聊消息:你好
2023-06-07 12:45:17.831 | DEBUG | middlewares.concurrentlock:handle_request:25 - [Concurrent] 使用 Adapter 内部的 Queue
2023-06-07 12:45:17.832 | DEBUG | middlewares.concurrentlock:handle_request:27 - [Concurrent] 排队中,前面还有 0 个人!
2023-06-07 12:45:17.832 | DEBUG | middlewares.concurrentlock:handle_request:40 - [Concurrent] 排队中,前面还有 0 个人!
2023-06-07 12:45:17.832 | DEBUG | middlewares.concurrentlock:handle_request:42 - [Concurrent] 排到了!
2023-06-07 12:45:17.859 | DEBUG | middlewares.timeout:create_timeout_task:55 - [Timeout] 开始计时……
2023-06-07 12:45:20.260 | ERROR | universal:handle_message:295 - OpenAI: <!DOCTYPE html>
<html lang="en-US">
<head>
<title>Just a moment...</title>
...
<span id="challenge-error-text">
Enable JavaScript and cookies to continue
</span>
...
[remainder of the Cloudflare managed-challenge page for ai.fakeopen.com (ray 7d38fdd7bacb2061), including the hidden challenge-form token, omitted]
</html>
(code: 403)

Traceback (most recent call last):

  File "/usr/local/lib/python3.11/site-packages/revChatGPT/V1.py", line 1175, in __check_response
    response.raise_for_status()
  File "/usr/local/lib/python3.11/site-packages/httpx/_models.py", line 749, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)

httpx.HTTPStatusError: Client error '403 Forbidden' for url 'https://chatgpt-proxy.lss233.com/api/conversation'
For more information check: https://httpstatuses.com/403

The above exception was the direct cause of the following exception:

Traceback (most recent call last):

  File "/app/bot.py", line 54, in <module>
    loop.run_until_complete(asyncio.gather(*bots))
  File "/usr/local/lib/python3.11/asyncio/base_events.py", line 640, in run_until_complete
    self.run_forever()
  File "/usr/local/lib/python3.11/asyncio/base_events.py", line 607, in run_forever
    self._run_once()
  File "/usr/local/lib/python3.11/asyncio/base_events.py", line 1922, in _run_once
    handle._run()
  File "/usr/local/lib/python3.11/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "/app/platforms/onebot_bot.py", line 151, in _
    await handle_message(
> File "/app/universal.py", line 269, in handle_message
    await action(session_id, message.strip(), conversation_context, respond)
  File "/app/universal.py", line 53, in call
    await m.handle_request(session_id, message, respond, conversation_context, n)
  File "/app/middlewares/concurrentlock.py", line 43, in handle_request
    await action(session_id, prompt, conversation_context, respond)
  File "/app/universal.py", line 53, in call
    await m.handle_request(session_id, message, respond, conversation_context, n)
  File "/app/middlewares/middleware.py", line 9, in handle_request
    await action(session_id, prompt, conversation_context, respond)
  File "/app/universal.py", line 53, in call
    await m.handle_request(session_id, message, respond, conversation_context, n)
  File "/app/middlewares/ratelimit.py", line 23, in handle_request
    await action(session_id, prompt, conversation_context, respond)
  File "/app/universal.py", line 53, in call
    await m.handle_request(session_id, message, respond, conversation_context, n)
  File "/app/middlewares/timeout.py", line 27, in handle_request
    await asyncio.wait_for(coro_task, config.response.max_timeout)
  File "/usr/local/lib/python3.11/asyncio/tasks.py", line 479, in wait_for
    return fut.result()
  File "/app/universal.py", line 222, in request
    async for rendered in task:
  File "/app/utils/retry.py", line 21, in wrapper
    async for result in func(*args, **kwargs):
  File "/app/conversation.py", line 192, in ask
    async for item in self.adapter.ask(prompt):
  File "/app/adapter/chatgpt/web.py", line 114, in ask
    raise e
  File "/app/adapter/chatgpt/web.py", line 78, in ask
    async for resp in self.bot.ask(prompt, self.conversation_id, self.parent_id, model=self.current_model):
  File "/app/chatbot/chatgpt.py", line 57, in ask
    async for r in self.bot.ask(prompt=prompt, conversation_id=conversation_id, parent_id=parent_id):
  File "/usr/local/lib/python3.11/site-packages/revChatGPT/V1.py", line 1012, in ask
    async for msg in self.post_messages(
  File "/usr/local/lib/python3.11/site-packages/revChatGPT/V1.py", line 965, in post_messages
    async for msg in self.__send_request(
  File "/usr/local/lib/python3.11/site-packages/revChatGPT/V1.py", line 853, in __send_request
    await self.__check_response(response)
  File "/usr/local/lib/python3.11/site-packages/revChatGPT/V1.py", line 1183, in __check_response
    raise error from e

revChatGPT.typings.Error: OpenAI: <!DOCTYPE html> ... <title>Just a moment...</title> ... [same Cloudflare challenge page as above] ... (code: 403)
Please check that the input is correct, or you can resolve this issue by filing an issue
Project URL: https://github.com/acheong08/ChatGPT

Other details

whhh233 commented 1 year ago

Same question here.

Nothingness-Void commented 1 year ago

The access token has expired; just fill it in again and it works. Lately tokens expire really fast, not sure why. The one I entered just yesterday expired today. Maybe it's because multiple accounts are logged in at the same time? Logging in with a single account works fine, but with two logged in at once one of them couldn't get in, and after I re-entered the first one neither of them could. After deleting one account and re-entering just a single account it logs in normally again. If you have multiple accounts logged in at the same time, try cutting it down to just one.
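
For reference, "re-entering the token" means updating the account entry in the bot's config file and restarting. A rough sketch of a single-account setup, assuming the [[openai.accounts]] layout from this project's config.example.cfg (the exact field names may differ between versions, so check your own config):

[openai]
# keep only one account logged in; paste the freshly copied access token here
[[openai.accounts]]
mode = "browserless"
access_token = "paste-your-new-access-token-here"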

whhh233 commented 1 year ago

I only have one account logged in, the token was obtained just today, and the QQ account is brand new too.


Nothingness-Void commented 1 year ago


I tried it. My account works fine right after logging in, but the access token becomes invalid after about 5 minutes. Maybe OpenAI is cracking down?

pasdy1 commented 1 year ago


Looks like it's an account problem; logging in through the web page just gives a blank screen.

pasdy1 commented 1 year ago

(screenshot)

Nothingness-Void commented 1 year ago


The thing is, my account works fine once I'm logged in.

Nothingness-Void commented 1 year ago


(screenshot)

Nothingness-Void commented 1 year ago

2023-06-07 22:43:17.638 | ERROR | universal:handle_message:295 - OpenAI: <!DOCTYPE html>

Attention Required! | Cloudflare

Sorry, you have been blocked

You are unable to access fakeopen.com

Why have I been blocked?

This website is using a security service to protect itself from online attacks. The action you just performed triggered the security solution. There are several actions that could trigger this block including submitting a certain word or phrase, a SQL command or malformed data.

What can I do to resolve this?

You can email the site owner to let them know you were blocked. Please include what you were doing when this page came up and the Cloudflare Ray ID found at the bottom of this page.

(code: 403)

Traceback (most recent call last):

  File "C:\Users\NothingnessVoid\Desktop\Windows-quickstart-go-cqhttp-refs.tags.v2.5.1\python3.11\Lib\site-packages\revChatGPT\V1.py", line 1175, in __check_response
    response.raise_for_status()
  File "C:\Users\NothingnessVoid\Desktop\Windows-quickstart-go-cqhttp-refs.tags.v2.5.1\python3.11\Lib\site-packages\httpx\_models.py", line 749, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)

httpx.HTTPStatusError: Client error '403 Forbidden' for url 'https://chatgpt-proxy.lss233.com/api/conversation/34e1a060-0572-4f4e-b7cb-4057cc7bdb3e'
For more information check: https://httpstatuses.com/403

Looks like a verification problem?

pasdy1 commented 1 year ago


Could it be caused by sending too many messages? I just tried: the first and second messages worked, but it started erroring from the third one.

Nothingness-Void commented 1 year ago

I've solved it. Go to Cloudflare, create a Worker, and use this code:

export default {
  async fetch(request, env) {
    // Rewrite the incoming request's host to the upstream endpoint,
    // keeping the original path, method, headers and body.
    const url = new URL(request.url);
    url.host = 'ai.fakeopen.com';
    return fetch(new Request(url, request));
  }
}

Then add a domain under the Worker's triggers, e.g. fakeopen.xxx.com, and set it as your custom endpoint:

[openai]
# Web ChatGPT endpoint; feel free to share your endpoint in the chat group
browserless_endpoint = "https://fakeopen.xxx.com/api/"

That solves it. I guess too many people have been using the endpoint lately, so it started running bot checks.

pasdy1 commented 1 year ago


Great, thanks, I'll give it a try tomorrow~

pasdy1 commented 1 year ago


Freenom domains seem pretty hard to get; how did you get your domain? Registration doesn't seem to be possible at the moment either.

whhh233 commented 1 year ago


Any domain should work, I'd think.

Nothingness-Void commented 1 year ago


Any domain will do. If not, you can use mine: https://fakeopen.nothingnessvoid.tech/

whhh233 commented 1 year ago


It seems web login isn't really working at all right now; I can't log in through the endpoint I set up myself either.

(screenshots)

Ewall555 commented 1 year ago


Doesn't Cloudflare Workers assign you a workers.dev subdomain? Just use that and append /api/ to it.
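
If you go that route, the only change on the bot side is pointing browserless_endpoint at the assigned workers.dev hostname, roughly like this (the subdomain below is just a placeholder):

[openai]
# hypothetical workers.dev hostname assigned to your Worker
browserless_endpoint = "https://your-worker.your-subdomain.workers.dev/api/"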

whhh233 commented 1 year ago


That still goes through the ai.fakeopen.com endpoint. The problem is that the endpoint itself isn't working right now, and my self-hosted one doesn't work either.

Nothingness-Void commented 1 year ago


Going through my Worker it does work for me. Or maybe try self-hosting it?

whhh233 commented 1 year ago

I've already tried self-hosting; it still doesn't work.


Nothingness-Void commented 1 year ago

What do you get when you visit your self-hosted domain directly?


lss233 commented 1 year ago

It's been restored.

shadowbox-cash commented 1 year ago


I created a Worker myself with the URL https://chatgpt-fakeopen.jackb0506.workers.dev/ and set browserless_endpoint: Optional[str] = "https://chatgpt-fakeopen.jackb0506.workers.dev/api/"

It still errors out, showing:

Sorry, you have been blocked

You are unable to access fakeopen.com

With the original default https://chatgpt-proxy.lss233.com/api/ it shows OpenAI: Internal Server Error (code: 500) instead.

What should I do?

Nothingness-Void commented 1 year ago


Here, you can use one of my endpoints: http://sgp.nothingnessvoid.tech:8080/chatgpt/