Open hh1562535601 opened 4 weeks ago
Hello @hh1562535601! I'm here to assist you with any bugs, questions, or contributions you have for the project. Let's work together to resolve the issue you're facing.
The error 'Unterminated string starting at: line 1 column 2 (char 1)' in the streaming function call support of the openai_api_compatible LLM, when the model is used in an agent, is likely caused by an issue in the _handle_generate_stream_response
method. The error occurs when the method attempts to parse a JSON string from the streamed response but the decoded_chunk
is not valid JSON, which can happen if the response contains malformed or incomplete data. The code attempts to handle this by yielding a result chunk with a finish reason of "Non-JSON encountered" and then breaking out of the loop [1].
To continue talking to Dosu, mention @dosu.
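For illustration, the parse failure described above is easy to reproduce: feeding a JSON fragment truncated mid-string to `json.loads` raises exactly the reported error. This is a minimal sketch, not the actual Dify code; `parse_stream_chunk` is a hypothetical helper mirroring the defensive handling Dosu describes.

```python
import json

def parse_stream_chunk(decoded_chunk: str):
    """Hypothetical helper: try to parse a streamed chunk as JSON,
    return None when the chunk is malformed or incomplete."""
    try:
        return json.loads(decoded_chunk)
    except json.JSONDecodeError:
        # Malformed or partial JSON mid-stream: the real handler yields a
        # chunk with finish reason "Non-JSON encountered" and breaks.
        return None

# A chunk cut off inside a string literal reproduces the reported message:
try:
    json.loads('{"name')
except json.JSONDecodeError as e:
    print(e)  # Unterminated string starting at: line 1 column 2 (char 1)

assert parse_stream_chunk('{"ok": true}') == {"ok": True}
assert parse_stream_chunk('{"name') is None
```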
I changed the code:
```diff
diff --git a/api/core/model_runtime/model_providers/openai_api_compatible/llm/llm.py b/api/core/model_runtime/model_providers/openai_api_compatible/llm/llm.py
index 356ac56b1..632f18248 100644
--- a/api/core/model_runtime/model_providers/openai_api_compatible/llm/llm.py
+++ b/api/core/model_runtime/model_providers/openai_api_compatible/llm/llm.py
@@ -438,7 +438,7 @@ class OAIAPICompatLargeLanguageModel(_CommonOaiApiCompat, LargeLanguageModel):
                 for new_tool_call in new_tool_calls:
                     # get tool call
-                    tool_call = get_tool_call(new_tool_call.function.name)
+                    tool_call = get_tool_call(new_tool_call.id)
                     # update tool call
                     if new_tool_call.id:
                         tool_call.id = new_tool_call.id
```
Then it works. @dosu
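Why the one-line change can matter (a sketch under assumptions, not the actual Dify code): in OpenAI-style streaming, the function name typically appears only in the first tool-call delta, while later deltas carry only argument fragments. Keying the lookup by name then creates a second, empty record for the later chunks and splits the JSON arguments across entries, which later fails to parse; keying by `id` keeps the fragments in one record, assuming the backend echoes the id on every delta, as the reporter's backend evidently does. `ToolCall` and `get_tool_call` below are simplified stand-ins for the structures in `llm.py`.

```python
from dataclasses import dataclass

@dataclass
class ToolCall:
    id: str = ""
    name: str = ""
    arguments: str = ""

def get_tool_call(key: str, calls: dict) -> ToolCall:
    # Simplified lookup-or-create helper, analogous to the closure in llm.py.
    if key not in calls:
        calls[key] = ToolCall()
    return calls[key]

# Two streamed deltas for ONE tool call: the name arrives only in the first.
deltas = [
    {"id": "call_1", "name": "get_weather", "arguments": '{"city'},
    {"id": "call_1", "name": "", "arguments": '": "Paris"}'},
]

by_name, by_id = {}, {}
for d in deltas:
    # Old behavior: keying by the (possibly empty) name splits the arguments.
    get_tool_call(d["name"], by_name).arguments += d["arguments"]
    # Fixed behavior: keying by id accumulates them in one record.
    get_tool_call(d["id"], by_id).arguments += d["arguments"]

assert len(by_name) == 2                                # split across two records
assert by_name["get_weather"].arguments == '{"city'    # invalid JSON fragment
assert len(by_id) == 1                                  # one complete record
assert by_id["call_1"].arguments == '{"city": "Paris"}'
```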
Self Checks
Dify version
main branch
Cloud or Self Hosted
Self Hosted (Source)
Steps to reproduce
1. Configure an LLM in openai_api_compatible that supports streaming function calls.
2. Use the LLM in an agent.
3. Query the agent.
✔️ Expected Behavior
The agent answers correctly.
❌ Actual Behavior
The web UI reports 'Unterminated string starting at: line 1 column 2 (char 1)'.