binary-husky / gpt_academic

A practical interactive interface for GPT/GLM and other large language models, with special optimizations for reading, polishing, and writing papers. Modular design with custom shortcut buttons and function plugins; analysis and self-translation of Python, C++, and other projects; PDF/LaTeX paper translation and summarization; parallel queries to multiple LLMs; local models such as chatglm3. Integrates Tongyi Qianwen (Qwen), deepseekcoder, iFlytek Spark, ERNIE Bot, llama2, rwkv, claude2, moss, and more.
https://github.com/binary-husky/gpt_academic/wiki/online
GNU General Public License v3.0

Error when running with Azure OpenAI and a custom domain #713

Closed · namezzy closed this issue 7 months ago

namezzy commented 1 year ago

C:\Users\Dongxiao.Wang\gpt_academic>python main.py
[PROXY] 网络代理状态:已配置。配置信息如下: {'http': 'http://127.0.0.1:7890', 'https': 'http://127.0.0.1:7890'}
[API_KEY] 本项目现已支持OpenAI和API2D的api-key。也支持同时填写多个api-key,如API_KEY="openai-key1,openai-key2,api2d-key3"
[API_KEY] 您既可以在config.py中修改api-key(s),也可以在问题输入区输入临时的api-key(s),然后回车键提交后即可生效。
[API_KEY] 您的 API_KEY 是: f8a4d472cad9497*** API_KEY 导入成功
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.
所有问询记录将自动保存在本地目录./gpt_log/chat_secrets.log, 请注意自我隐私保护哦!
查询代理的地理位置,返回的结果是{'ip': '13.215.221.137', 'network': '13.215.128.0/17', 'version': 'IPv4', 'city': 'Singapore', 'region': None, 'region_code': None, 'country': 'SG', 'country_name': 'Singapore', 'country_code': 'SG', 'country_code_iso3': 'SGP', 'country_capital': 'Singapore', 'country_tld': '.sg', 'continent_code': 'AS', 'in_eu': False, 'postal': '189559', 'latitude': 1.295861, 'longitude': 103.852085, 'timezone': 'Asia/Singapore', 'utc_offset': '+0800', 'country_calling_code': '+65', 'currency': 'SGD', 'currency_name': 'Dollar', 'languages': 'cmn,en-SG,ms-SG,ta-SG,zh-SG', 'country_area': 692.7, 'country_population': 5638676, 'asn': 'AS16509', 'org': 'AMAZON-02'}
代理配置 http://127.0.0.1:7890, 代理所在地:Singapore
如果浏览器没有自动打开,请复制并转到以下URL:
(亮色主题): http://localhost:59237
(暗色主题): http://localhost:59237/?__theme=dark
正在执行一些模块的预热...
正在加载tokenizer,如果是第一次运行,可能需要一点时间下载参数
加载tokenizer完毕
正在加载tokenizer,如果是第一次运行,可能需要一点时间下载参数
加载tokenizer完毕
Running on local URL: http://0.0.0.0:59237

To create a public link, set share=True in launch().
gpt-3.5-turbo : 0 : 1 ..........
Traceback (most recent call last):
  File "C:\Users\Dongxiao.Wang\gpt_academic\request_llm\bridge_chatgpt.py", line 171, in predict
    chunk = next(stream_response)
            ^^^^^^^^^^^^^^^^^^^^^
StopIteration

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\Dongxiao.Wang\AppData\Local\Programs\Python\Python311\Lib\site-packages\gradio\routes.py", line 412, in run_predict
    output = await app.get_blocks().process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Dongxiao.Wang\AppData\Local\Programs\Python\Python311\Lib\site-packages\gradio\blocks.py", line 1299, in process_api
    result = await self.call_function(
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Dongxiao.Wang\AppData\Local\Programs\Python\Python311\Lib\site-packages\gradio\blocks.py", line 1035, in call_function
    prediction = await anyio.to_thread.run_sync(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Dongxiao.Wang\AppData\Local\Programs\Python\Python311\Lib\site-packages\anyio\to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Dongxiao.Wang\AppData\Local\Programs\Python\Python311\Lib\site-packages\anyio\_backends\_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "C:\Users\Dongxiao.Wang\AppData\Local\Programs\Python\Python311\Lib\site-packages\anyio\_backends\_asyncio.py", line 867, in run
    result = context.run(func, *args)
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Dongxiao.Wang\AppData\Local\Programs\Python\Python311\Lib\site-packages\gradio\utils.py", line 491, in async_iteration
    return next(iterator)
           ^^^^^^^^^^^^^^
  File "C:\Users\Dongxiao.Wang\gpt_academic\toolbox.py", line 62, in decorated
    yield from f(txt_passon, llm_kwargs, plugin_kwargs, chatbot_with_cookie, history, system_prompt, *args)
  File "C:\Users\Dongxiao.Wang\gpt_academic\request_llm\bridge_all.py", line 296, in predict
    yield from method(inputs, llm_kwargs, *args, **kwargs)
RuntimeError: generator raised StopIteration
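(Editor's note: the "RuntimeError: generator raised StopIteration" at the bottom is PEP 479 behavior. On Python 3.7+ a StopIteration that escapes a generator body, here from chunk = next(stream_response), is re-raised as RuntimeError. A minimal, self-contained repro of that mechanism; the names below are illustrative and not taken from gpt_academic:)

```python
def fake_stream():
    # Simulates an upstream response stream that ends immediately,
    # e.g. because the server returned an error body instead of SSE chunks.
    return iter(())

def predict_like():
    stream_response = fake_stream()
    while True:
        chunk = next(stream_response)  # raises StopIteration when the stream is exhausted...
        yield chunk                    # ...which PEP 479 converts into RuntimeError inside a generator

try:
    list(predict_like())
except RuntimeError as e:
    print(e)  # prints: generator raised StopIteration
```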

binary-husky commented 1 year ago

In this case an error is being returned, but its format differs from OpenAI's error interface, so it never gets displayed.

https://github.com/binary-husky/gpt_academic/blob/1134ec2df53a7a573ad4ffc45f5975dab0b7bad2/request_llm/bridge_chatgpt.py#LL171C1-L172C1
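(Editor's note: one way to make such failures visible is to trap the early end of the stream around next(stream_response) and report whatever body was received, rather than letting the StopIteration escape. The sketch below is illustrative only, not the repository's actual patch; it stubs out the stream and omits the surrounding state kept by bridge_chatgpt.py:)

```python
def consume_stream(stream_response, raw_body=b""):
    """Yield chunks from a streaming response; surface a readable message
    if the stream ends before any valid chunk arrives (hypothetical helper)."""
    while True:
        try:
            chunk = next(stream_response)
        except StopIteration:
            # Azure or a redirected custom domain may answer with a JSON error
            # body that is not OpenAI-formatted SSE, so the stream ends early.
            # Report it instead of crashing with an opaque RuntimeError.
            yield f"Remote endpoint returned a non-OpenAI-format response: {raw_body!r}"
            return
        yield chunk

# Example: an error response that contains no SSE chunks at all.
for msg in consume_stream(iter(()), raw_body=b'{"error": "resource not found"}'):
    print(msg)
```

With Azure or a custom-domain redirect, the endpoint typically responds with a plain JSON error body rather than an OpenAI-style event stream, which is exactly the situation where the stream is exhausted before the first chunk and the error stays invisible unless it is surfaced explicitly.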