langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

In Agent mode, when the custom tool response is too large to be processed, the agent replies with 'It seems like you want me to continue an interaction but haven't provided the initial' #10909

Open qq583336273 opened 6 days ago

qq583336273 commented 6 days ago

Self Checks

Dify version

0.9.2

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

In Agent mode, when the custom tool returns a response that is too large to be processed, the agent replies with 'It seems like you want me to continue an interaction but haven't provided the initial'.
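For reference, a minimal way to reproduce this might look like the following. This is only a hypothetical helper: the endpoint path, field names, and payload size are illustrative, not taken from the original report. Register the endpoint as a custom tool (via its OpenAPI schema) and call it from an Agent app:

```python
# Hypothetical reproduction helper: a tiny HTTP service that returns a
# deliberately large JSON payload, which can be registered as a Dify
# custom tool and invoked from an Agent app.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/big-data", methods=["GET"])
def big_data():
    # Roughly 2 MB of repetitive rows: large enough to overflow a typical
    # model context window when echoed back verbatim to the agent.
    rows = [{"id": i, "text": "x" * 200} for i in range(10_000)]
    return jsonify({"rows": rows})

if __name__ == "__main__":
    app.run(port=8000)
```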

✔️ Expected Behavior

The agent should show the data returned by the custom tool.

❌ Actual Behavior

Instead of the tool's data, the agent replies with 'It seems like you want me to continue an interaction but haven't provided the initial' or similar text.

dosubot[bot] commented 6 days ago

@qq583336273 Hello there! I'm here to help you with any bugs, questions, or contributions you might have. I'll be assisting you while we wait for a human maintainer.

I found a similar unsolved discussion that might be related to your issue: Help on Tools response when failed, which was last updated on November 19, 2024 [1].

Regarding your issue, it seems like Dify's Agent mode might be encountering a problem when handling large data responses from custom tools. The error message you're seeing could be related to data size limits or to the response-handling logic in Agent mode. The logic for handling responses from custom tools is implemented in the FunctionCallAgentRunner class within the api/core/agent/fc_agent_runner.py file. This includes processing tool responses and updating the agent's thoughts accordingly [2].
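As a rough illustration of the kind of handling involved (a hedged sketch only, not the actual FunctionCallAgentRunner code; the helper name and the 10,000-character limit are assumptions), an oversized tool observation could be truncated before it is appended to the agent's message history, so the model still sees a coherent, if partial, result:

```python
# Sketch only: truncate an oversized tool observation before feeding it
# back to the LLM, instead of passing a context-overflowing blob.
# The limit and helper name are hypothetical, not Dify's configuration.
MAX_OBSERVATION_CHARS = 10_000  # assumed limit for illustration

def truncate_tool_observation(observation: str, limit: int = MAX_OBSERVATION_CHARS) -> str:
    if len(observation) <= limit:
        return observation
    omitted = len(observation) - limit
    return observation[:limit] + f"\n...[truncated, {omitted} characters omitted]"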

Additionally, the Dify codebase includes error messages related to data size issues, such as OutputValidationError, which is raised when data exceeds predefined limits [3]. You might want to check if the data size exceeds any configured limits or if there are specific error messages in your logs that could provide more insight.
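If you would rather fail fast with a clear message than receive the confusing model reply, a size check of this sort could be applied to the tool response before it reaches the model. Again, this is only a sketch: the exception class, byte limit, and hook point are illustrative stand-ins, not Dify's real settings or the OutputValidationError referenced above.

```python
# Sketch only: reject an oversized tool response with an explicit error
# rather than passing it to the model. The exception class and byte
# limit are illustrative, not Dify's real configuration.
class ToolResponseTooLargeError(Exception):
    pass

MAX_TOOL_RESPONSE_BYTES = 1_000_000  # assumed 1 MB cap for illustration

def validate_tool_response(payload: bytes) -> bytes:
    if len(payload) > MAX_TOOL_RESPONSE_BYTES:
        raise ToolResponseTooLargeError(
            f"tool response is {len(payload)} bytes, "
            f"which exceeds the {MAX_TOOL_RESPONSE_BYTES}-byte limit"
        )
    return payload
```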

To continue talking to Dosu, mention @dosu.