binary-husky / gpt_academic

A practical interactive interface for LLMs such as GPT and GLM, with particular optimization for the paper reading/polishing/writing experience. Modular design with custom shortcut buttons & function plugins; project analysis & self-translation for Python and C++ codebases; PDF/LaTeX paper translation & summarization; parallel queries to multiple LLMs; local models such as chatglm3. Integrates Tongyi Qianwen, deepseekcoder, iFlytek Spark, ERNIE Bot, llama2, rwkv, claude2, moss, and more.
https://github.com/binary-husky/gpt_academic/wiki/online
GNU General Public License v3.0

[Bug]: Version 3.73, overseas users get an error when using Claude 3 #1627

Open GitHub-Canary opened 6 months ago

GitHub-Canary commented 6 months ago

Installation Method | 安装方法与平台

Pip Install (I used latest requirements.txt)

Version | 版本

Latest | 最新版

OS | 操作系统

Windows

Describe the bug | 简述

I am an overseas user and am not using a proxy. I have configured both the Claude 3 and OpenAI APIs, both paid and in good standing. The OpenAI API works normally, but Claude 3 does not and reports: [Local Message] Request timeout. Network error. Please check proxy settings in config.py. Check that the proxy server is reachable and that the proxy setting uses the format [protocol]://[address]:[port]; no part can be omitted.
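
For reference, the proxy setting that this error message points to lives in config.py. A typical entry looks roughly like the sketch below; the protocol, address, and port are placeholders that depend on your local proxy software, and users who reach the APIs directly can leave USE_PROXY disabled.

```python
# Hypothetical config.py excerpt; values are placeholders, not a recommendation.
# The proxy URL must follow [protocol]://[address]:[port], with no part omitted.
USE_PROXY = True
proxies = {
    "http":  "socks5h://localhost:7890",   # e.g. a local clash/v2ray listener
    "https": "socks5h://localhost:7890",
}
```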

Screen Shot | 有帮助的截图

(screenshots: 微信截图_20240314105503, 微信截图_20240314105807)

Terminal Traceback & Material to Help Reproduce Bugs | 终端traceback(如有) + 帮助我们复现的测试材料样本(如有)

No response

xyjandyyk commented 6 months ago

My proxy is configured correctly, and calling Claude 3 gives me the same error as yours.

Yujie-G commented 6 months ago

Same problem here. Is the error in your terminal output this one? The API key may have been banned. (screenshot attached)

GitHub-Canary commented 6 months ago

(screenshot: 微信截图_20240314222329) The error looks like this; it's different from yours.

Yujie-G commented 6 months ago

> (screenshot: 微信截图_20240314222329) The error looks like this; it's different from yours.

Claude 3 support seems to be only on the frontier branch: https://github.com/binary-husky/gpt_academic/issues/1613#issuecomment-1987152989

xyjandyyk commented 6 months ago

> (screenshot: 微信截图_20240314222329) The error looks like this; it's different from yours.

My error is the same as yours.

binaryYuki commented 6 months ago

(screenshot: Screenshot 2024-03-15 at 21 24 36)

@Yujie-G I just tested it and everything works fine; could you try again? So far the cases I have been able to reproduce are: 401, wrong API key; 403, the account has no phone number bound.

Menghuan1918 commented 6 months ago

https://github.com/binary-husky/gpt_academic/issues/1627#issuecomment-1997572761 This error happens because the anthropic package is too old; try pip install anthropic --upgrade.
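
For anyone hitting that error, a quick sanity check of the installed SDK version and of the Messages API might look like the sketch below; the API key and model name are placeholders.

```python
# Minimal sanity check, assuming the anthropic Python SDK is installed.
# client.messages.create() only exists in newer releases (roughly 0.18+);
# on older versions the attribute is missing, which raises
# AttributeError: 'Anthropic' object has no attribute 'messages'.
import anthropic

print(anthropic.__version__)  # expect 0.18.x or newer

client = anthropic.Anthropic(api_key="sk-ant-...")  # placeholder key
msg = client.messages.create(
    model="claude-3-opus-20240229",  # any Claude 3 model name should work
    max_tokens=64,
    messages=[{"role": "user", "content": "ping"}],
)
print(msg.content[0].text)
```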

hjnnjh commented 6 months ago

Same here; I've been at it for ages, and from inside the container I can even curl the Claude API endpoint. (screenshot attached)

xyjandyyk commented 6 months ago

> #1627 (comment) This error happens because the anthropic package is too old; try pip install anthropic --upgrade.

What could be causing the kind of error the OP and I are seeing?

hjnnjh commented 6 months ago

Even stranger, the container deployed on my MacBook has no problems at all and uses Claude 3 normally, but as soon as it runs on the cloud server it stops working...

Menghuan1918 commented 6 months ago

> #1627 (comment) This error happens because the anthropic package is too old; try pip install anthropic --upgrade.
>
> What could be causing the kind of error the OP and I are seeing?

Sorry, I replied to the wrong person just now. The AttributeError: 'Anthropic' object has no attribute 'messages' error should indeed mean the package is too old (maybe the automatic update failed to upgrade it? I'm not sure either).

At least on my side, version 0.18.1 works fine.

hjnnjh commented 6 months ago

> Even stranger, the container deployed on my MacBook has no problems at all and uses Claude 3 normally, but as soon as it runs on the cloud server it stops working...

Same API key in both places.

hjnnjh commented 6 months ago

One more detail: when I curl from inside the container, the clash log records the request and it follows the forwarding rules I set, but when I submit a request through the gpt_academic page reached via the public IP:Port, nothing shows up in the log and the error is 403. I am sure the API key is entered correctly; for the author's reference. (screenshot attached)

xyjandyyk commented 6 months ago

> #1627 (comment) This error happens because the anthropic package is too old; try pip install anthropic --upgrade.
>
> What could be causing the kind of error the OP and I are seeing?
>
> Sorry, I replied to the wrong person just now. The AttributeError: 'Anthropic' object has no attribute 'messages' error should indeed mean the package is too old (maybe the automatic update failed to upgrade it? I'm not sure either).
>
> At least on my side, version 0.18.1 works fine.

```
Traceback (most recent call last):
  File "C:\OneKeyInstallerForWindowsAndMacOS\gpt_academic\request_llms\bridge_claude.py", line 164, in predict
    stream = anthropic.messages.create(
  File "C:\OneKeyInstallerForWindowsAndMacOS\installer_files\env\lib\site-packages\anthropic\_utils\_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
  File "C:\OneKeyInstallerForWindowsAndMacOS\installer_files\env\lib\site-packages\anthropic\resources\messages.py", line 678, in create
    return self._post(
  File "C:\OneKeyInstallerForWindowsAndMacOS\installer_files\env\lib\site-packages\anthropic\_base_client.py", line 1208, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "C:\OneKeyInstallerForWindowsAndMacOS\installer_files\env\lib\site-packages\anthropic\_base_client.py", line 897, in request
    return self._request(
  File "C:\OneKeyInstallerForWindowsAndMacOS\installer_files\env\lib\site-packages\anthropic\_base_client.py", line 988, in _request
    raise self._make_status_error_from_response(err.response) from None
anthropic.PermissionDeniedError: Error code: 403 - {'error': {'type': 'forbidden', 'message': 'Request not allowed'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\OneKeyInstallerForWindowsAndMacOS\installer_files\env\lib\site-packages\gradio\routes.py", line 422, in run_predict
    output = await app.get_blocks().process_api(
  File "C:\OneKeyInstallerForWindowsAndMacOS\installer_files\env\lib\site-packages\gradio\blocks.py", line 1323, in process_api
    result = await self.call_function(
  File "C:\OneKeyInstallerForWindowsAndMacOS\installer_files\env\lib\site-packages\gradio\blocks.py", line 1067, in call_function
    prediction = await utils.async_iteration(iterator)
  File "C:\OneKeyInstallerForWindowsAndMacOS\installer_files\env\lib\site-packages\gradio\utils.py", line 336, in async_iteration
    return await iterator.__anext__()
  File "C:\OneKeyInstallerForWindowsAndMacOS\installer_files\env\lib\site-packages\gradio\utils.py", line 329, in __anext__
    return await anyio.to_thread.run_sync(
  File "C:\OneKeyInstallerForWindowsAndMacOS\installer_files\env\lib\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "C:\OneKeyInstallerForWindowsAndMacOS\installer_files\env\lib\site-packages\anyio\_backends\_asyncio.py", line 2144, in run_sync_in_worker_thread
    return await future
  File "C:\OneKeyInstallerForWindowsAndMacOS\installer_files\env\lib\site-packages\anyio\_backends\_asyncio.py", line 851, in run
    result = context.run(func, *args)
  File "C:\OneKeyInstallerForWindowsAndMacOS\installer_files\env\lib\site-packages\gradio\utils.py", line 312, in run_sync_iterator_async
    return next(iterator)
  File "C:\OneKeyInstallerForWindowsAndMacOS\gpt_academic\toolbox.py", line 122, in decorated
    yield from f(txt_passon, llm_kwargs, plugin_kwargs, chatbot_with_cookie, history, system_prompt, *args)
  File "C:\OneKeyInstallerForWindowsAndMacOS\gpt_academic\request_llms\bridge_all.py", line 870, in predict
    yield from method(inputs, llm_kwargs, *args, **kwargs)
  File "C:\OneKeyInstallerForWindowsAndMacOS\gpt_academic\request_llms\bridge_claude.py", line 178, in predict
    if retry > MAX_RETRY: raise TimeoutError
```

I took another careful look and the response is indeed 403. The proxy settings should be fine; the pop-up webui detects the proxy location as US.

hjnnjh commented 6 months ago

> (screenshot: Screenshot 2024-03-15 at 21 24 36) @Yujie-G I just tested it and everything works fine; could you try again? So far the cases I have been able to reproduce are: 401, wrong API key; 403, the account has no phone number bound.

May I ask whether you deployed locally or on a cloud server? I wonder whether it is a Docker version issue: docker-desktop on the MacBook versus docker-engine on the Ubuntu server.

hjnnjh commented 6 months ago

> #1627 (comment) This error happens because the anthropic package is too old; try pip install anthropic --upgrade.
>
> What could be causing the kind of error the OP and I are seeing?
>
> Sorry, I replied to the wrong person just now. The AttributeError: 'Anthropic' object has no attribute 'messages' error should indeed mean the package is too old (maybe the automatic update failed to upgrade it? I'm not sure either). At least on my side, version 0.18.1 works fine.

> (traceback omitted; identical to the one posted above)

> I took another careful look and the response is indeed 403. The proxy settings should be fine; the pop-up webui detects the proxy location as US.

I now suspect it is a Docker problem.

xyjandyyk commented 6 months ago

> #1627 (comment) This error happens because the anthropic package is too old; try pip install anthropic --upgrade.
>
> What could be causing the kind of error the OP and I are seeing?
>
> Sorry, I replied to the wrong person just now. The AttributeError: 'Anthropic' object has no attribute 'messages' error should indeed mean the package is too old (maybe the automatic update failed to upgrade it? I'm not sure either). At least on my side, version 0.18.1 works fine.

> (traceback omitted; identical to the one posted above)

> I took another careful look and the response is indeed 403. The proxy settings should be fine; the pop-up webui detects the proxy location as US.
>
> I now suspect it is a Docker problem.

I am running it with the one-click installer package on my Windows laptop.

hjnnjh commented 6 months ago

> One more detail: when I curl from inside the container, the clash log records the request and it follows the forwarding rules I set, but when I submit a request through the gpt_academic page reached via the public IP:Port, nothing shows up in the log and the error is 403. I am sure the API key is entered correctly; for the author's reference. (screenshot attached)

Has anyone tried clash premium TUN mode? It feels like this could also be a clash problem; this might help?

Menghuan1918 commented 6 months ago

https://github.com/binary-husky/gpt_academic/issues/1627#issuecomment-2001911149 I just tried running it with the one-click installer package and it seems fine (I chose 2: connect to GitHub to download / 1: use the official PyPI). If all else fails, maybe redeploy? (screenshot attached)

GitHub-Canary commented 6 months ago

My problem was solved by pip install anthropic --upgrade; thanks to the folks above for sharing.

binaryYuki commented 6 months ago

> One more detail: when I curl from inside the container, the clash log records the request and it follows the forwarding rules I set, but when I submit a request through the gpt_academic page reached via the public IP:Port, nothing shows up in the log and the error is 403. I am sure the API key is entered correctly; for the author's reference. (screenshot attached)
>
> Has anyone tried clash premium TUN mode? It feels like this could also be a clash problem; this might help?

Is it possible that the proxy's IP is not clean, similar to how OpenAI used to ban IPs? Could you try switching to a different node?

hjnnjh commented 6 months ago

> One more detail: when I curl from inside the container, the clash log records the request and it follows the forwarding rules I set, but when I submit a request through the gpt_academic page reached via the public IP:Port, nothing shows up in the log and the error is 403. I am sure the API key is entered correctly; for the author's reference. (screenshot attached)
>
> Has anyone tried clash premium TUN mode? It feels like this could also be a clash problem; this might help?
>
> Is it possible that the proxy's IP is not clean, similar to how OpenAI used to ban IPs? Could you try switching to a different node?

No, the same node works perfectly on my MacBook. See my new issue #1636: the request traffic never goes through clash at all. It feels like a problem with clash on Linux or with the program itself, but I'm not sure. If I can't get it working I won't keep fiddling with it; on the server I now use a domestic third-party GPT-4, and Claude is quite expensive, so I don't plan to use it long term anyway.

xyjandyyk commented 6 months ago

> One more detail: when I curl from inside the container, the clash log records the request and it follows the forwarding rules I set, but when I submit a request through the gpt_academic page reached via the public IP:Port, nothing shows up in the log and the error is 403. I am sure the API key is entered correctly; for the author's reference. (screenshot attached)
>
> Has anyone tried clash premium TUN mode? It feels like this could also be a clash problem; this might help?
>
> Is it possible that the proxy's IP is not clean, similar to how OpenAI used to ban IPs? Could you try switching to a different node?

I checked the v2 panel; it looks like the Claude requests are not going through the proxy at all.

hjnnjh commented 6 months ago

> @xyjandyyk: I checked the v2 panel; it looks like the Claude requests are not going through the proxy at all.

But on my MacBook the Docker deployment does go through the proxy; the proxy's log shows the records.

binaryYuki commented 6 months ago

> @xyjandyyk: I checked the v2 panel; it looks like the Claude requests are not going through the proxy at all.
>
> But on my MacBook the Docker deployment does go through the proxy; the proxy's log shows the records.

I'll try to reproduce this over the next couple of days. My initial suspicion is that the proxy routing rules are misconfigured, or that Docker's internal bridge network is not proxying part of the traffic. For now I'd suggest using a oneapi proxy. By the way, was your clash installed via stack or via apt?

hjnnjh commented 6 months ago

> @xyjandyyk: I checked the v2 panel; it looks like the Claude requests are not going through the proxy at all.
>
> But on my MacBook the Docker deployment does go through the proxy; the proxy's log shows the records.
>
> I'll try to reproduce this over the next couple of days. My initial suspicion is that the proxy routing rules are misconfigured, or that Docker's internal bridge network is not proxying part of the traffic. For now I'd suggest using a oneapi proxy. By the way, was your clash installed via stack or via apt?

The clash on Ubuntu was installed like this: (screenshot attached)

better1593 commented 6 months ago

I ran into the same error on my Windows laptop, using a clash US node. ChatGPT works fine, but Claude 3 keeps failing with this error no matter how many times I try.

```
Traceback (most recent call last):
  File "E:\Program Files (x86)\gpt_academic-frontier\request_llms\bridge_claude.py", line 164, in predict
    stream = anthropic.messages.create(
  File "E:\Anaconda\envs\gpt_academic\lib\site-packages\anthropic\_utils\_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
  File "E:\Anaconda\envs\gpt_academic\lib\site-packages\anthropic\resources\messages.py", line 678, in create
    return self._post(
  File "E:\Anaconda\envs\gpt_academic\lib\site-packages\anthropic\_base_client.py", line 1208, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "E:\Anaconda\envs\gpt_academic\lib\site-packages\anthropic\_base_client.py", line 897, in request
    return self._request(
  File "E:\Anaconda\envs\gpt_academic\lib\site-packages\anthropic\_base_client.py", line 988, in _request
    raise self._make_status_error_from_response(err.response) from None
anthropic.PermissionDeniedError: Error code: 403 - {'error': {'type': 'forbidden', 'message': 'Request not allowed'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "E:\Anaconda\envs\gpt_academic\lib\site-packages\gradio\routes.py", line 422, in run_predict
    output = await app.get_blocks().process_api(
  File "E:\Anaconda\envs\gpt_academic\lib\site-packages\gradio\blocks.py", line 1323, in process_api
    result = await self.call_function(
  File "E:\Anaconda\envs\gpt_academic\lib\site-packages\gradio\blocks.py", line 1067, in call_function
    prediction = await utils.async_iteration(iterator)
  File "E:\Anaconda\envs\gpt_academic\lib\site-packages\gradio\utils.py", line 336, in async_iteration
    return await iterator.__anext__()
  File "E:\Anaconda\envs\gpt_academic\lib\site-packages\gradio\utils.py", line 329, in __anext__
    return await anyio.to_thread.run_sync(
  File "E:\Anaconda\envs\gpt_academic\lib\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "E:\Anaconda\envs\gpt_academic\lib\site-packages\anyio\_backends\_asyncio.py", line 2134, in run_sync_in_worker_thread
    return await future
  File "E:\Anaconda\envs\gpt_academic\lib\site-packages\anyio\_backends\_asyncio.py", line 851, in run
    result = context.run(func, *args)
  File "E:\Anaconda\envs\gpt_academic\lib\site-packages\gradio\utils.py", line 312, in run_sync_iterator_async
    return next(iterator)
  File "E:\Program Files (x86)\gpt_academic-frontier\toolbox.py", line 126, in decorated
    yield from f(txt_passon, llm_kwargs, plugin_kwargs, chatbot_with_cookie, history, system_prompt, *args)
  File "E:\Program Files (x86)\gpt_academic-frontier\request_llms\bridge_all.py", line 894, in predict
    yield from method(inputs, llm_kwargs, *args, **kwargs)
  File "E:\Program Files (x86)\gpt_academic-frontier\request_llms\bridge_claude.py", line 178, in predict
    if retry > MAX_RETRY: raise TimeoutError
```
```
[Local Message] Request timeout. Network error. Please check proxy settings in config.py. Check that the proxy server is reachable and that the proxy setting uses the format [protocol]://[address]:[port]; no part can be omitted.
```

xyjandyyk commented 6 months ago

> I ran into the same error on my Windows laptop, using a clash US node. ChatGPT works fine, but Claude 3 keeps failing with this error no matter how many times I try.

> (traceback and error message omitted; identical to better1593's comment above)

Looking at the proxy's dashboard, this traffic does not seem to be going through the proxy at all; it feels like something still needs to be configured.

Menghuan1918 commented 6 months ago

https://github.com/binary-husky/gpt_academic/issues/1627#issuecomment-2005712635 It's actually not a configuration problem... the previous implementation called the anthropic package directly, and requests made through that package do not go through the proxy.

hjnnjh commented 6 months ago

> #1627 (comment) It's actually not a configuration problem... the previous implementation called the anthropic package directly, and requests made through that package do not go through the proxy.

Then why does it go through the proxy on my Mac? Could the anthropic package behave differently on different OSes? (screenshot attached)

xyjandyyk commented 6 months ago

> #1627 (comment) It's actually not a configuration problem... the previous implementation called the anthropic package directly, and requests made through that package do not go through the proxy.

Will this be patched later, or is there a good workaround for now?

hjnnjh commented 6 months ago

> #1627 (comment) It's actually not a configuration problem... the previous implementation called the anthropic package directly, and requests made through that package do not go through the proxy.
>
> Will this be patched later, or is there a good workaround for now?

I looked at the anthropic repo; it should be possible to set a proxy (see "Configuring the HTTP client" in its README). I'm not sure whether the version from that earlier PR took this into account.
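
A rough sketch of what that could look like, assuming the SDK's documented http_client parameter is used to hand it an httpx client configured with a proxy; the proxy URL and API key below are placeholders.

```python
# Sketch only: route the anthropic SDK through a local proxy by passing it a
# pre-configured httpx client, per the SDK's "Configuring the HTTP client" docs.
import httpx
import anthropic

PROXY_URL = "http://127.0.0.1:7890"  # placeholder; use your own proxy address

client = anthropic.Anthropic(
    api_key="sk-ant-...",                         # placeholder key
    http_client=httpx.Client(proxies=PROXY_URL),  # newer httpx versions: proxy=PROXY_URL
)
```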

hjnnjh commented 6 months ago

> #1627 (comment) It's actually not a configuration problem... the previous implementation called the anthropic package directly, and requests made through that package do not go through the proxy.
>
> Will this be patched later, or is there a good workaround for now?

I see @Menghuan1918's PR now (#1641); once it's merged this should be fixed.

xyjandyyk commented 6 months ago

> #1627 (comment) It's actually not a configuration problem... the previous implementation called the anthropic package directly, and requests made through that package do not go through the proxy.
>
> Will this be patched later, or is there a good workaround for now?
>
> I see @Menghuan1918's PR now (#1641); once it's merged this should be fixed.

It works now.