baidubce / bce-qianfan-sdk

Provide best practices for LMOps, as well as elegant and convenient access to the features of the Qianfan MaaS Platform.
https://cloud.baidu.com/doc/WENXINWORKSHOP/index.html
Apache License 2.0

qianfan.errors.APIError: api return error, req_id: as-3euhp023ev code: 336003, msg: function_call's name can't be blank #576

Closed Puiching-Memory closed 5 months ago

Puiching-Memory commented 5 months ago

System Info

qianfan=0.3.14 python=3.11.9 os=windows11

Reproduction

https://github.com/langchain-ai/langgraph/discussions/575 When I run the code below I get an error, and I'm not sure what I did wrong. I'm building a LangGraph with ERNIE 3.5 from the Qianfan platform. In this graph I check the model's output directly; if the check fails, I want the output regenerated. The checking function lives on another server, which runs the model-generated code in an executor and returns either an error report or a success result.

I think the problem occurs at the step where the error is fed back to the model for revision.

Here is my code:

import os
from typing import List

import requests
from langchain_community.chat_models import QianfanChatEndpoint
from langchain_core.messages import BaseMessage, HumanMessage
from langgraph.graph import END, MessageGraph

def is_manim_work(state: List[BaseMessage]):
    SERVER_MANIM = 'http://127.0.0.1:5103'
    # Send the model's reply to the executor server for validation
    se1 = requests.post(SERVER_MANIM + '/task', json={'task': state[1].content})
    print(state[1].content)
    print(se1.json())
    if se1.json()['rencode'] == 0:
        return 'end'
    else:
        return 'continue'

def chat_mainm(ask: str):
    # Chat model, integrated with LangChain
    chat_QianFan = QianfanChatEndpoint(
        streaming=False,
        model="ERNIE-3.5-preview",
    )
    builder = MessageGraph()
    builder.add_node("oracle", chat_QianFan)
    builder.add_conditional_edges(
        "oracle",
        is_manim_work,
        {
            "continue": "oracle",
            "end": END,
        },
    )
    builder.set_entry_point("oracle")
    runnable = builder.compile()

    reply = runnable.invoke(HumanMessage(f"{ask}"))
    print(reply)

if __name__ == '__main__':
    chat_mainm('你好')

Error report:

[ERROR] [06-03 20:22:59] openapi_requestor.py:256 [t:27736]: api request req_id: as-3euhp023ev failed with error code: 336003, err msg: function_call's name can't be blank, please check https://cloud.baidu.com/doc/WENXINWORKSHOP/s/tlmyncueh
Traceback (most recent call last):
  File "d:\GitHub\Sora_datasets_SYM\remote_gpt\engine_qianfan.py", line 75, in <module>
    chat_mainm('你好')
  File "d:\GitHub\Sora_datasets_SYM\remote_gpt\engine_qianfan.py", line 49, in chat_mainm
    reply = runnable.invoke(HumanMessage(f"{ask}"))
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\langgraph\pregel\__init__.py", line 1384, in invoke
    for chunk in self.stream(
  File "D:\Python\3119\Lib\site-packages\langgraph\pregel\__init__.py", line 949, in stream
    _panic_or_proceed(done, inflight, step)
  File "D:\Python\3119\Lib\site-packages\langgraph\pregel\__init__.py", line 1473, in _panic_or_proceed
    raise exc
  File "D:\Python\3119\Lib\concurrent\futures\thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\langgraph\pregel\retry.py", line 66, in run_with_retry
    task.proc.invoke(task.input, task.config)
  File "D:\Python\3119\Lib\site-packages\langchain_core\runnables\base.py", line 2399, in invoke
    input = step.invoke(
            ^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\langchain_core\language_models\chat_models.py", line 170, in invoke
    self.generate_prompt(
  File "D:\Python\3119\Lib\site-packages\langchain_core\language_models\chat_models.py", line 599, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\langchain_core\language_models\chat_models.py", line 456, in generate
    raise e
  File "D:\Python\3119\Lib\site-packages\langchain_core\language_models\chat_models.py", line 446, in generate
    self._generate_with_cache(
  File "D:\Python\3119\Lib\site-packages\langchain_core\language_models\chat_models.py", line 671, in _generate_with_cache
    result = self._generate(
             ^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\langchain_community\chat_models\baidu_qianfan_endpoint.py", line 309, in _generate
    response_payload = self.client.do(**params)
                       ^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\llm\chat_completion.py", line 1046, in do
    resp = self._do(
           ^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\llm\base.py", line 289, in _do
    raise e
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\llm\base.py", line 270, in _do
    resp = self._client.llm(
           ^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\openapi_requestor.py", line 416, in llm
    return self._with_retry(retry_config, _helper)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\base.py", line 419, in _with_retry
    return _retry_wrapper(*args)
           ^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\tenacity\__init__.py", line 330, in wrapped_f
    return self(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\tenacity\__init__.py", line 467, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\tenacity\__init__.py", line 368, in iter
    result = action(retry_state)
             ^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\tenacity\__init__.py", line 390, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
                                     ^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\concurrent\futures\_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\concurrent\futures\_base.py", line 401, in __get_result
    raise self._exception
  File "D:\Python\3119\Lib\site-packages\tenacity\__init__.py", line 470, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\base.py", line 417, in _retry_wrapper
    return func(*args)
           ^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\openapi_requestor.py", line 80, in retry_wrapper
    return func(*args)
           ^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\openapi_requestor.py", line 413, in _helper
    self._request(req, data_postprocess=data_postprocess), token_count
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\base.py", line 178, in wrapper
    resp = func(requestor, request, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\base.py", line 311, in _request
    resp = self._parse_response(body, response)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\openapi_requestor.py", line 276, in _parse_response
    raise e
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\openapi_requestor.py", line 272, in _parse_response
    return super()._parse_response(body, resp)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\base.py", line 362, in _parse_response
    self._check_error(body)
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\openapi_requestor.py", line 266, in _check_error
    raise errors.APIError(error_code, err_msg, req_id)
qianfan.errors.APIError: api return error, req_id: as-3euhp023ev code: 336003, msg: function_call's name can't be blank
Dobiichi-Origami commented 5 months ago

Please try updating the langchain_core package to the latest version (0.2.3 or above). We fixed some bugs in that release which should resolve your issue.

Puiching-Memory commented 5 months ago

Sorry, here is my environment for completeness:

langchain                 0.2.1
langchain-community       0.2.1
langchain-core            0.2.3
langchain-text-splitters  0.2.0
langgraph                 0.0.60
langsmith                 0.1.67
Dobiichi-Origami commented 5 months ago


Sorry, on further checking we found the above was wrong: that patch has not yet shipped in a Langchain release. For now, the only way to use it is to pull the Langchain main branch manually and build/install it from source locally.
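One way to install the unreleased fix from source (a sketch, not an official instruction; the `subdirectory` path assumes the langchain monorepo layout, so verify it against the repo before running):

```shell
# Install langchain-core directly from the langchain main branch.
# The libs/core subdirectory path is an assumption based on the
# monorepo layout; check the repository if the install fails.
pip install "git+https://github.com/langchain-ai/langchain.git#subdirectory=libs/core"
```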

Puiching-Memory commented 5 months ago

After updating to

langchain                 0.2.2
langchain-community       0.2.2
langchain-core            0.2.4
langchain-experimental    0.0.60
langchain-mistralai       0.1.8
langchain-mongodb         0.1.6
langchain-robocorp        0.0.9.post1
langchain-text-splitters  0.2.1
langgraph                 0.0.62
langsmith                 0.1.71

I reran the code. The original error is fixed, but a new one appeared.

Traceback (most recent call last):
  File "d:\GitHub\Sora_datasets_SYM\remote_gpt\engine_qianfan.py", line 76, in <module>
    chat_mainm('你好')
  File "d:\GitHub\Sora_datasets_SYM\remote_gpt\engine_qianfan.py", line 50, in chat_mainm
    reply = runnable.invoke(HumanMessage(f"{ask}"))
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\langgraph\pregel\__init__.py", line 1384, in invoke
    for chunk in self.stream(
  File "D:\Python\3119\Lib\site-packages\langgraph\pregel\__init__.py", line 949, in stream
    _panic_or_proceed(done, inflight, step)
  File "D:\Python\3119\Lib\site-packages\langgraph\pregel\__init__.py", line 1473, in _panic_or_proceed
    raise exc
  File "D:\Python\3119\Lib\concurrent\futures\thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\langgraph\pregel\retry.py", line 66, in run_with_retry
    task.proc.invoke(task.input, task.config)
  File "D:\Python\3119\Lib\site-packages\langchain_core\runnables\base.py", line 2406, in invoke
    input = step.invoke(input, config, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\langchain_core\language_models\chat_models.py", line 170, in invoke
    self.generate_prompt(
  File "D:\Python\3119\Lib\site-packages\langchain_core\language_models\chat_models.py", line 599, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\langchain_core\language_models\chat_models.py", line 456, in generate
    raise e
  File "D:\Python\3119\Lib\site-packages\langchain_core\language_models\chat_models.py", line 446, in generate
    self._generate_with_cache(
  File "D:\Python\3119\Lib\site-packages\langchain_core\language_models\chat_models.py", line 671, in _generate_with_cache
    result = self._generate(
             ^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\langchain_community\chat_models\baidu_qianfan_endpoint.py", line 320, in _generate
    response_payload = self.client.do(**params)
                       ^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\llm\chat_completion.py", line 1046, in do
    resp = self._do(
           ^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\llm\base.py", line 289, in _do
    raise e
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\llm\base.py", line 270, in _do
    resp = self._client.llm(
           ^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\openapi_requestor.py", line 416, in llm
    return self._with_retry(retry_config, _helper)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\base.py", line 419, in _with_retry
    return _retry_wrapper(*args)
           ^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\tenacity\__init__.py", line 330, in wrapped_f
    return self(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\tenacity\__init__.py", line 467, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\tenacity\__init__.py", line 368, in iter
    result = action(retry_state)
             ^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\tenacity\__init__.py", line 390, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
                                     ^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\concurrent\futures\_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\concurrent\futures\_base.py", line 401, in __get_result
    raise self._exception
  File "D:\Python\3119\Lib\site-packages\tenacity\__init__.py", line 470, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\base.py", line 417, in _retry_wrapper
    return func(*args)
           ^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\openapi_requestor.py", line 80, in retry_wrapper
    return func(*args)
           ^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\openapi_requestor.py", line 413, in _helper
    self._request(req, data_postprocess=data_postprocess), token_count
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\base.py", line 178, in wrapper
    resp = func(requestor, request, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\base.py", line 311, in _request
    resp = self._parse_response(body, response)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\openapi_requestor.py", line 276, in _parse_response
    raise e
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\openapi_requestor.py", line 272, in _parse_response
    return super()._parse_response(body, resp)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\base.py", line 362, in _parse_response
    self._check_error(body)
  File "D:\Python\3119\Lib\site-packages\qianfan\resources\requestor\openapi_requestor.py", line 266, in _check_error
    raise errors.APIError(error_code, err_msg, req_id)
qianfan.errors.APIError: api return error, req_id: as-f1268tsn0v code: 336006, msg: the length of messages must be an odd number

It seems the feedback is passed as a question/answer pair of messages, which makes the context length even. Do I need to find a model that specifically supports even-length conversations, or is there a way to make the feedback exchange odd?
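The constraint behind error 336006 can be stated concretely: the API expects the messages list to alternate user/assistant turns and to contain an odd number of entries, i.e. to end on a user turn. A hypothetical helper (plain role dicts for illustration, not part of the SDK; the follow-up text is a placeholder) that restores parity before a request:

```python
def ensure_odd_history(messages):
    """If the history length is even (it ends on an assistant turn),
    append a follow-up user message so the next request is valid.
    The follow-up content here is a placeholder, not part of any API."""
    if len(messages) % 2 == 0:
        messages = messages + [
            {"role": "user", "content": "Please revise based on the error above."}
        ]
    return messages

history = [
    {"role": "user", "content": "Generate the Manim code."},
    {"role": "assistant", "content": "...model reply..."},
]
fixed = ensure_odd_history(history)
assert len(fixed) % 2 == 1          # odd length, as the API requires
assert fixed[-1]["role"] == "user"  # ends on a user turn
```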

Dobiichi-Origami commented 5 months ago

@Puiching-Memory Thanks for the report. We are looking into whether there is a compatibility problem under Langgraph.

Dobiichi-Origami commented 5 months ago

@Puiching-Memory After running compatibility tests against Langgraph, we did not find any problem under normal usage. For this scenario, you should always keep the model's context plus the new turn at an odd number of messages. The targeted fix is to append a HumanMessage to the end of the context after each request completes.

Also, there seems to be a bug in your code:

se1 = requests.post(SERVER_MANIM+'/task',json={'task':state[1].content})

This always takes the first reply for the check; the correct approach is to take the last message, i.e. the one at index=-1.

qslia commented 5 months ago

https://python.langchain.com/v0.2/docs/tutorials/chatbot/#message-history I followed the official message-history tutorial and hit the same error when accessing the message history. I'm using langchain-core 0.2.5.

config = {"configurable": {"session_id": "abc2"}}

response = with_message_history.invoke(
    [HumanMessage(content="What's my name?")],
    config=config,
)

response.content

@Dobiichi-Origami

Dobiichi-Origami commented 5 months ago

@qslia Sorry you ran into this. Could you paste the full error message so we can pin down your problem?

Puiching-Memory commented 5 months ago

> @Puiching-Memory After running compatibility tests against Langgraph, we did not find any problem under normal usage. For this scenario, you should always keep the model's context plus the new turn at an odd number of messages. The targeted fix is to append a HumanMessage to the end of the context after each request completes.
>
> Also, there seems to be a bug in your code:
>
> se1 = requests.post(SERVER_MANIM+'/task',json={'task':state[1].content})
>
> This always takes the first reply for the check; the correct approach is to take the last message, i.e. the one at index=-1.

Following your suggestions, I made a few changes, and it now runs correctly!

Fixed the index bug:

se1 = requests.post(SERVER_MANIM + "/task", json={"task": state[-1].content})

Manually added a HumanMessage:

def add_enpty(state: List[BaseMessage]):
    return HumanMessage(
        f"Fix your code based on the error report, keep your original reply "
        f"format, and provide only your corrected code: {error_report}"
    )

This is implemented by adding a new node:

builder.add_node("enpty", add_enpty)
builder.add_edge("enpty", "oracle")
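The effect of this routing change can be sketched without LangGraph or Qianfan: each "oracle" turn appends one AI reply, so routing failures straight back to "oracle" would send an even-length history (error 336006), while going through the "enpty" node first restores odd parity. A minimal plain-Python simulation (hypothetical strings in place of real messages):

```python
def simulate(rounds, with_enpty_node):
    """Simulate the retry loop: the 'oracle' node appends one AI reply
    per round. With with_enpty_node=True, a human follow-up is appended
    after each reply (the fix above); without it, the second request
    would carry an even-length history, which the assert models."""
    history = ["human: initial request"]  # length 1 (odd): valid
    for _ in range(rounds):
        # Every request to the API must carry an odd-length history
        assert len(history) % 2 == 1, "336006: length must be odd"
        history.append("ai: generated code")        # the "oracle" node
        if with_enpty_node:
            history.append("human: fix per error")  # the "enpty" node
    return history

ok = simulate(3, with_enpty_node=True)  # all three calls saw an odd history
```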