Open bigahega opened 1 month ago
If anyone else comes across this, here is an ugly workaround until it's fixed for good:
```python
if "tool_calls" in message or message["role"] == "tool":
    context.add_message(message)
elif "content" in message:
    context.add_message({
        "content": message["content"],
        "role": message["role"],
        "name": message.get("name", message["role"]),
    })
```
I monkey-patched the OpenAILLMContext.from_messages function, replacing the body of its for loop with the hack above.
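For anyone who wants to try the same workaround, here is a minimal, self-contained sketch of the patched loop logic. Note that `FakeContext` and `patched_from_messages` are stand-ins of my own for illustration, not pipecat's actual classes; only the body of the for loop reflects the patch.

```python
class FakeContext:
    """Stand-in for OpenAILLMContext; just collects messages."""

    def __init__(self):
        self.messages = []

    def add_message(self, message):
        self.messages.append(message)


def patched_from_messages(messages):
    context = FakeContext()
    for message in messages:
        # Pass tool-call requests and tool results through untouched.
        if "tool_calls" in message or message["role"] == "tool":
            context.add_message(message)
        # For everything else, keep only the expected keys and
        # default "name" to the role when it is missing.
        elif "content" in message:
            context.add_message({
                "content": message["content"],
                "role": message["role"],
                "name": message.get("name", message["role"]),
            })
    return context


ctx = patched_from_messages([
    {"role": "user", "content": "hi"},
    {"role": "tool", "content": "result", "tool_call_id": "abc"},
])
print(len(ctx.messages))  # → 2
```

The point of the hack is simply that tool-related messages are forwarded as-is, while plain content messages are rebuilt with a guaranteed `name` field, which is what was missing before.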
I have the following pipeline definition:
Before GPT invokes the registered function for the first time, the user idle processor works without any problem. However, once the function has been called, the user idle processor raises an exception whenever it tries to add an LLM message.