langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

How do I configure a non-streaming call!!!! #2978

Closed Owen-MS closed 8 months ago

Owen-MS commented 8 months ago

Self Checks

Provide a description of the help you need

How do I configure a non-streaming call!!!!

class HMLargeLanguageModel(LargeLanguageModel):
    def _invoke(self, model: str, credentials: dict,
                prompt_messages: list[PromptMessage], model_parameters: dict,
                tools: list[PromptMessageTool] | None = None,
                stop: list[str] | None = None,
                stream: bool = True, user: str | None = None) \
            -> LLMResult | Generator:
        return self._generate(model=model, credentials=credentials,
                              prompt_messages=prompt_messages,
                              model_parameters=model_parameters,
                              tools=tools, stop=stop, stream=stream, user=user)

Where can the default stream be configured to False?

dosubot[bot] commented 8 months ago

Hey, @Owen-MS! Great to see you again. Hope all is well!

To make the stream parameter default to False in the HMLargeLanguageModel class, you need to modify the signature of the invoke method in the LargeLanguageModel class. Specifically, change the default value of the stream parameter from True to False.

Here is the relevant part of the modified invoke method signature:

def invoke(self, model: str, credentials: dict,
           prompt_messages: list[PromptMessage], model_parameters: Optional[dict] = None,
           tools: Optional[list[PromptMessageTool]] = None, stop: Optional[list[str]] = None,
           stream: bool = False, user: Optional[str] = None,
           callbacks: Optional[list[Callback]] = None) \
        -> Union[LLMResult, Generator]:

This change sets the default behavior of the stream parameter to False, meaning the method will not stream responses unless stream is specified otherwise when invoke is called.

In the _invoke method of the HMLargeLanguageModel class, the stream parameter determines whether the large language model's response is streamed or returned in full. When stream is set to True, the method returns a generator that yields response chunks as they are received; this is useful for large responses or real-time interaction, where immediate partial results are beneficial. Conversely, when stream is set to False, the method waits for the model's complete response before returning, which suits scenarios that need the full result before any processing begins. This behavior is implemented in the _generate method, which constructs the request to the model and sets the stream parameter on the requests.post call accordingly.
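The streaming/non-streaming branching described above can be sketched as a self-contained toy. The hard-coded chunk list stands in for an HTTP response body; a real provider would read chunks from the network instead:

```python
from typing import Generator, Union

def _generate(prompt: str, stream: bool = True) -> Union[str, Generator[str, None, None]]:
    """Toy sketch of the stream/non-stream branching pattern.

    The chunks are simulated here; a real provider implementation would
    read them incrementally from an HTTP response.
    """
    chunks = ["Hello", ", ", "world"]
    if stream:
        # Streaming: return a generator that yields chunks as they "arrive".
        return (chunk for chunk in chunks)
    # Non-streaming: accumulate everything and return the full result at once.
    return "".join(chunks)

# Streaming call yields pieces incrementally.
pieces = list(_generate("hi", stream=True))
# Non-streaming call blocks until the complete text is available.
full = _generate("hi", stream=False)
```

The caller-facing difference is only the return type: a generator to iterate over versus a single complete value.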

Note that the HMLargeLanguageModel class does not exist in the provided context. The class defined in the context is OpenLLMLargeLanguageModel, a subclass of LargeLanguageModel. The LargeLanguageModel class is imported from core.model_runtime.model_providers.__base.large_language_model.


Owen-MS commented 8 months ago

What I need to change is not this line of code, but how to configure stream.

crazywoola commented 8 months ago

https://github.com/langgenius/dify/blob/4cf475680db459f985201678b156bb2439f248d2/api/core/model_runtime/model_providers/baichuan/llm/llm.py#L38

Take a look at the other LLM implementations for reference.
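Following that pattern, a provider subclass can force non-streaming by hard-coding stream=False when it delegates to _generate, without touching the base class at all. A minimal sketch with stand-in classes (BaseModel here is a placeholder, not Dify's actual LargeLanguageModel):

```python
class BaseModel:
    """Stand-in for LargeLanguageModel; the real base class lives in
    core.model_runtime.model_providers.__base.large_language_model."""

    def _generate(self, model: str, stream: bool = True, **kwargs):
        # Placeholder behavior so the sketch is runnable.
        return "generator" if stream else "full response"

class NonStreamingModel(BaseModel):
    """Provider that always requests the complete response."""

    def _invoke(self, model: str, stream: bool = True, **kwargs):
        # Force non-streaming regardless of what the caller passed.
        return self._generate(model=model, stream=False, **kwargs)

result = NonStreamingModel()._invoke("my-model", stream=True)
```

Even though the caller asked for stream=True, the subclass overrides it on the way down, which is the effect Owen-MS is after.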