zhayujie / chatgpt-on-wechat

A chatbot built on large language models, with support for WeChat Official Accounts, WeCom (enterprise WeChat) apps, Feishu, DingTalk, and more. Selectable models include GPT-3.5/GPT-4o/GPT-o1/Claude/ERNIE (文心一言)/iFlytek Spark (讯飞星火)/Tongyi Qianwen (通义千问)/Gemini/GLM-4/Kimi/LinkAI. It handles text, voice, and images, can access the operating system and the internet, and supports customized enterprise AI customer service backed by your own knowledge base.
https://docs.link-ai.tech/cow
MIT License
30.26k stars · 7.96k forks

Baidu model prints a warning, but there is no error #2006

Closed taozhiyuai closed 4 months ago

taozhiyuai commented 4 months ago

Pre-submission checklist

⚠️ Searched existing issues for similar problems

Operating system?

macOS

Which Python version?

Python 3.10

Which chatgpt-on-wechat version?

Latest Release

Which channel type?

wx (personal WeChat, itchat)

Steps to reproduce 🕹

Start auto-replying with `"baidu_wenxin_model": "ERNIE-Speed-128K"` and send "你好"; the num_tokens_from_messages() warning appears (full terminal log below).

Problem description 😯

Everything runs normally, but the warning below keeps appearing.

Terminal log 📒

Start auto replying.
[INFO][2024-05-22 11:52:37][bridge.py:61] - create bot chatGPT for chat
[INFO][2024-05-22 11:52:37][chat_gpt_bot.py:49] - [CHATGPT] query=你好
[WARNING][2024-05-22 11:52:37][chat_gpt_session.py:86] - num_tokens_from_messages() is not implemented for model ERNIE-Speed-128K. Returning num tokens assuming gpt-3.5-turbo.
[WARNING][2024-05-22 11:52:38][chat_gpt_session.py:86] - num_tokens_from_messages() is not implemented for model ERNIE-Speed-128K. Returning num tokens assuming gpt-3.5-turbo.
[INFO][2024-05-22 11:52:39][wechat_channel.py:214] - [WX] sendMsg=Reply(type=TEXT, content=[bot] 你好!有什么我可以帮助你的吗?), receiver=@dfdaa62c82bafd6296de3969f830de003f27b6da2e58b6cb870a7b98ddfb35ee
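The warning itself is harmless: when the token counter does not recognize a model name, it falls back to counting tokens as if the model were gpt-3.5-turbo. A minimal sketch of that fallback pattern (the helper below is hypothetical and uses a crude character-based estimate; it is not the project's actual chat_gpt_session.py code):

```python
import logging

# Models whose token-counting rules this helper actually knows (hypothetical set).
KNOWN_MODELS = {"gpt-3.5-turbo", "gpt-4"}

def num_tokens_from_messages(messages, model):
    """Estimate token usage for a list of chat messages.

    Unknown models fall back to gpt-3.5-turbo accounting and log a
    warning instead of raising -- the same behavior seen in the
    terminal log above.
    """
    if model not in KNOWN_MODELS:
        logging.warning(
            "num_tokens_from_messages() is not implemented for model %s. "
            "Returning num tokens assuming gpt-3.5-turbo.", model
        )
        return num_tokens_from_messages(messages, "gpt-3.5-turbo")
    # Crude estimate: ~4 tokens of per-message overhead plus roughly
    # one token per 4 characters of content.
    return sum(4 + len(m.get("content", "")) // 4 for m in messages)

tokens = num_tokens_from_messages(
    [{"role": "user", "content": "hello"}], "ERNIE-Speed-128K"
)
```

So the bot still replies correctly; the fallback only means the token budget is approximate for ERNIE models.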
6vision commented 4 months ago

Try setting model to wenxin, or wenxin-4, and check whether it was mistyped.

taozhiyuai commented 4 months ago

Try setting model to wenxin, or wenxin-4, and check whether it was mistyped.

"baidu_wenxin_model": "ERNIE-Speed-128K", "baidu_wenxin_api_key": "VJlSL7u9aGY9", "baidu_wenxin_secret_key": "I0ePWhcLafLGXF3eye4", "channel_type": "wx", "model": "wenxin",

That will call the free ERNIE-Speed-128K model, right?

Like this? @6vision
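For context on how a Baidu Qianfan request is addressed: each model corresponds to a lowercase endpoint segment in the chat URL, so a mixed-case model name like "ERNIE-Speed-128K" cannot be used verbatim as the path. A sketch of that mapping (the helper and the mapping table are assumptions based on Baidu's public API layout, not this project's code):

```python
# Qianfan chat requests go to a per-model endpoint under this base URL.
BASE = "https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat/"

# Hypothetical model -> endpoint mapping; segments follow Baidu's published docs.
ENDPOINTS = {
    "ERNIE-Bot": "completions",
    "ERNIE-Bot-4": "completions_pro",
    "ERNIE-Speed-128K": "ernie-speed-128k",
}

def chat_url(model, access_token):
    """Build the chat URL for a model, failing loudly for unmapped names."""
    try:
        endpoint = ENDPOINTS[model]
    except KeyError:
        raise ValueError(f"no known endpoint for model {model!r}")
    return f"{BASE}{endpoint}?access_token={access_token}"
```

If the configured model name is passed straight into the URL without such a mapping, Baidu's API rejects the request, which matches the error seen later in this thread.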

taozhiyuai commented 4 months ago

Still failing. @6vision

Start auto replying.
[INFO][2024-05-22 18:09:49][bridge.py:61] - create bot baidu for chat
[INFO][2024-05-22 18:09:49][baidu_wenxin.py:28] - [BAIDU] query=你好
[INFO][2024-05-22 18:09:49][baidu_wenxin.py:66] - [BAIDU] model=ERNIE-Speed-128K
[INFO][2024-05-22 18:09:49][baidu_wenxin.py:82] - [BAIDU] response text={'error_code': 3, 'error_msg': 'Unsupported openapi method'}
[WARNING][2024-05-22 18:09:49][baidu_wenxin.py:94] - [BAIDU] Exception: 'result'
[ERROR][2024-05-22 18:09:49][chat_channel.py:303] - Worker return exception: 'total_tokens'
Traceback (most recent call last):
  File "/Users/taozhiyu/miniconda3/envs/cow/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/Users/taozhiyu/Downloads/chatgpt-on-wechat/channel/chat_channel.py", line 170, in _handle
    reply = self._generate_reply(context)
  File "/Users/taozhiyu/Downloads/chatgpt-on-wechat/channel/chat_channel.py", line 193, in _generate_reply
    reply = super().build_reply_content(context.content, context)
  File "/Users/taozhiyu/Downloads/chatgpt-on-wechat/channel/channel.py", line 38, in build_reply_content
    return Bridge().fetch_reply_content(query, context)
  File "/Users/taozhiyu/Downloads/chatgpt-on-wechat/bridge/bridge.py", line 76, in fetch_reply_content
    return self.get_bot("chat").reply(query, context)
  File "/Users/taozhiyu/Downloads/chatgpt-on-wechat/bot/baidu/baidu_wenxin.py", line 41, in reply
    result["total_tokens"],
KeyError: 'total_tokens'
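The KeyError here masks the real failure: Baidu returned an error payload ({'error_code': 3, 'error_msg': 'Unsupported openapi method'}), which contains neither 'result' nor 'total_tokens', yet the reply code reached for those keys anyway. A hedged sketch of defensive parsing (the function name and the 'usage' layout are assumptions based on the log above, not the project's actual baidu_wenxin.py):

```python
def parse_wenxin_response(payload):
    """Return (reply_text, total_tokens) from a Qianfan chat response.

    Error payloads carry 'error_code'/'error_msg' instead of
    'result'/'usage', so check for them first and surface Baidu's own
    message rather than letting a KeyError hide it.
    """
    if "error_code" in payload:
        raise RuntimeError(
            f"[BAIDU] error {payload['error_code']}: {payload.get('error_msg')}"
        )
    return payload["result"], payload["usage"]["total_tokens"]
```

With a check like this, the terminal would have shown "Unsupported openapi method" directly, pointing at a wrong model/endpoint rather than a missing token count.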
taozhiyuai commented 4 months ago

wenxin-4 does work, though.