zhayujie / chatgpt-on-wechat

A chatbot built on large language models, supporting access via WeChat Official Accounts, WeCom (企业微信) applications, Feishu, DingTalk, and more. It can use GPT-3.5 / GPT-4o / GPT-o1 / Claude / 文心一言 / 讯飞星火 / 通义千问 / Gemini / GLM-4 / Kimi / LinkAI, handles text, voice, and images, can access the operating system and the internet, and supports customized enterprise customer-service bots built on your own knowledge base.
https://docs.link-ai.tech/cow
MIT License

使用azure api后使用tool报错 #1043

Closed mingkwind closed 2 months ago

mingkwind commented 1 year ago

Preliminary checks

⚠️ Searched the existing issues for similar problems

Operating system?

Railway

Python version?

python 3.7

chatgpt-on-wechat version?

Latest Release

Channel type?

wx (personal WeChat, itchat)

Steps to reproduce 🕹

I have configured all of the Azure API parameters and the WeChat bot chats normally, but when I run the $tool command I get an error. The Railway backend log shows the following:

File "/usr/local/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 83, in __prepare_create_request
raise error.InvalidRequestError(
openai.error.InvalidRequestError: Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.chat_completion.ChatCompletion'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/app/plugins/tool/tool.py", line 104, in on_handle_context
_reply = self.app.ask(query, user_session)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/apps/victorinox.py", line 116, in ask
return self.ask(query, chat_history, retry_num + 1)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/apps/victorinox.py", line 118, in ask
raise TimeoutError("超过重试次数")
TimeoutError: 超过重试次数

Problem description 😯

(screenshot)

Terminal log 📒

return fut.result()
File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 451, in result
return self.__get_result()
File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
raise self._exception
File "/usr/local/lib/python3.10/site-packages/tenacity/__init__.py", line 382, in __call__
result = fn(*args, **kwargs)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/models/chatgpt/chatgpt.py", line 233, in _completion_with_retry
return self.client.create(**kwargs)
File "/usr/local/lib/python3.10/site-packages/openai/api_resources/chat_completion.py", line 25, in create
return super().create(*args, **kwargs)
File "/usr/local/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 149, in create
) = cls.__prepare_create_request(
File "/usr/local/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 83, in __prepare_create_request
raise error.InvalidRequestError(
openai.error.InvalidRequestError: Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.chat_completion.ChatCompletion'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/apps/victorinox.py", line 112, in ask
return self.engine.run(query)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/chains/base.py", line 206, in run
return self(args[0])[self.output_keys[0]]
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/chains/base.py", line 112, in __call__
raise e
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/chains/base.py", line 109, in __call__
outputs = self._call(inputs)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/engine/tool_engine.py", line 182, in _call
next_step_output = self._take_next_step(
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/engine/tool_engine.py", line 120, in _take_next_step
output = self.bot.plan(intermediate_steps, **inputs)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/bots/chat_bot/base.py", line 96, in plan
action = self._get_next_action(full_inputs)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/bots/chat_bot/base.py", line 111, in _get_next_action
llm_answer_str = self.llm_chain.predict(**full_inputs)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/chains/llm.py", line 147, in predict
return self(kwargs)[self.output_key]
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/chains/base.py", line 112, in __call__
raise e
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/chains/base.py", line 109, in __call__
outputs = self._call(inputs)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/chains/llm.py", line 57, in _call
return self.apply([inputs])[0]
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/chains/llm.py", line 114, in apply
response = self.generate(input_list)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/chains/llm.py", line 62, in generate
return self.llm.generate_prompt(prompts, stop)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/models/chatgpt/base.py", line 78, in generate_prompt
raise e
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/models/chatgpt/base.py", line 75, in generate_prompt
output = self.generate(prompt_messages, stop=stop)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/models/chatgpt/base.py", line 52, in generate
results = [self._generate(m, stop=stop) for m in messages]
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/models/chatgpt/base.py", line 52, in <listcomp>
results = [self._generate(m, stop=stop) for m in messages]
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/models/chatgpt/chatgpt.py", line 275, in _generate
response = self.completion_with_retry(messages=message_dicts, **params)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/models/chatgpt/chatgpt.py", line 235, in completion_with_retry
return _completion_with_retry(**kwargs)
File "/usr/local/lib/python3.10/site-packages/tenacity/__init__.py", line 289, in wrapped_f
return self(f, *args, **kw)
File "/usr/local/lib/python3.10/site-packages/tenacity/__init__.py", line 379, in __call__
do = self.iter(retry_state=retry_state)
File "/usr/local/lib/python3.10/site-packages/tenacity/__init__.py", line 314, in iter
return fut.result()
File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 451, in result
return self.__get_result()
File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
raise self._exception
File "/usr/local/lib/python3.10/site-packages/tenacity/__init__.py", line 382, in __call__
result = fn(*args, **kwargs)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/models/chatgpt/chatgpt.py", line 233, in _completion_with_retry
return self.client.create(**kwargs)
File "/usr/local/lib/python3.10/site-packages/openai/api_resources/chat_completion.py", line 25, in create
return super().create(*args, **kwargs)
File "/usr/local/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 149, in create
) = cls.__prepare_create_request(
File "/usr/local/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 83, in __prepare_create_request
raise error.InvalidRequestError(
openai.error.InvalidRequestError: Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.chat_completion.ChatCompletion'>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/app/plugins/tool/tool.py", line 104, in on_handle_context
_reply = self.app.ask(query, user_session)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/apps/victorinox.py", line 116, in ask
return self.ask(query, chat_history, retry_num + 1)
File "/usr/local/lib/python3.10/site-packages/chatgpt_tool_hub/apps/victorinox.py", line 118, in ask
raise TimeoutError("超过重试次数")
TimeoutError: 超过重试次数
[ERROR][2023-05-07 16:53:30][tool.py:109] - 超过重试次数
[INFO][2023-05-07 16:53:30][chat_gpt_bot.py:49] - [CHATGPT] query=请你随机用一种聊天风格,提醒用户:这个问题tool插件暂时无法处理
[INFO][2023-05-07 16:53:34][wechat_channel.py:184] - [WX] sendMsg=Reply(type=TEXT, content=[bot]哎呀,您好呀!很抱歉告诉您,今天我有一个小小的遗憾要与您分享。您刚刚问到的问题我暂时无法为您回答,因为我的tool插件似乎出了一些小问题。不过,您放心哦,我已经通知我的开发团队,他们会尽快解决问题,让我重新为您提供最优质的服务。如果您还有其他问题或者想要聊聊天,我随时都在哦!), receiver=@74927db59e0e576f34ac22346380edb6064bfeab45ec90ec205a6ce04751db65
lanvent commented 1 year ago

tool does not support Azure OpenAI yet

BakaFT commented 1 year ago

As of now the problem still exists. Here is my analysis and a workaround.

First, the cause

The latest chatgpt-tool-hub release on PyPI (0.4.4) does not include deployment_id when it builds the model parameter dict, so the OpenAI Python library later considers a required parameter missing when it prepares the request and raises an error.
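
For context, with the legacy openai Python SDK used here (the 0.27.x line visible in the traceback), an Azure request must name the deployment explicitly. A minimal sketch with placeholder values (the resource name, API version, key, and deployment name are assumptions you must replace with your own):

    import openai

    # Azure OpenAI settings (placeholders, adjust to your own resource)
    openai.api_type = "azure"
    openai.api_base = "https://<your-resource>.openai.azure.com/"
    openai.api_version = "2023-05-15"
    openai.api_key = "<your-azure-api-key>"

    # Omitting deployment_id (and engine) here reproduces the error from the log:
    #   InvalidRequestError: Must provide an 'engine' or 'deployment_id' parameter ...
    resp = openai.ChatCompletion.create(
        deployment_id="gpt35",  # the Azure deployment name of your chat model
        messages=[{"role": "user", "content": "hello"}],
    )
    print(resp["choices"][0]["message"]["content"])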

The author has actually already fixed this

In https://github.com/goldfishh/chatgpt-tool-hub/commit/96a98f6f0f0561e76d0f316ec2423b644498e718 the author fixed this problem, but the fix has not been published to PyPI, which is still on the 2023/5/15 release. So a dependency installed with pip still has the bug.

Workaround

Ugly, but it works. It does not cover every case (tools under the tool plugin such as url-get and summary build their own model instances when they are created, so this patch does not reach them; without modifying the dependency's code, this is as far as we can get). Change the return statement of _reset_app() in /plugins/tool/tool.py from:

        return app.create_app(tools_list=tool_list, **app_kwargs)

to:

        patched_app = app.create_app(tools_list=tool_list, **app_kwargs)
        patched_app.llm.model_kwargs['deployment_id'] = tool_config['kwargs']['deployment_id']
        return patched_app
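
This works, presumably, because the wrapper forwards model_kwargs into the self.client.create(**kwargs) call visible in the traceback, so the extra deployment_id ends up on the ChatCompletion.create request for the main chat model. Tools that construct their own model instances (the url-get / summary case mentioned above) never go through patched_app.llm, which is why they stay uncovered.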

Then add a line to /plugins/tool/config.json (the // comments below are annotations only and must not appear in the real JSON file):

{
  "tools": [
    "python",
    "url-get",
    "terminal",
    "meteo-weather"
  ],
  "kwargs": {
    // .....
    // put your Azure model deployment ID here
    "deployment_id": "gpt35"
  }
}
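
After patching tool.py and updating config.json, restart the bot and trigger the plugin again with a $tool message; the "Must provide an 'engine' or 'deployment_id' parameter" error should no longer appear for the main chat model.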
BakaFT commented 1 year ago

Per https://github.com/goldfishh/chatgpt-tool-hub/issues/61, this should no longer raise the error now.