microsoft / promptflow

Build high-quality LLM apps - from prototyping, testing to production deployment and monitoring.
https://microsoft.github.io/promptflow/
MIT License

[BUG] The content of assistant is <promptflow.tracing._trace.TracedIterator object at 0x………>, which causes the chat_history passed to LLM to be wrong #3812

Open 21-10-4 opened 1 week ago

21-10-4 commented 1 week ago

Describe the bug
I run an interactive chat session in the terminal with `pf flow test --flow new-chat-flow-created-at-2024-10-11 --interactive`. During the conversation the AI seems to have no memory of earlier turns. Tracing shows that the content of every assistant message in chat_history is `<promptflow.tracing._trace.TracedIterator object at 0x………>`; only the assistant's most recent answer has the correct content. (Screenshots of the trace attached.)
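For illustration, here is a plain-Python sketch of the same failure mode (a hypothetical `stream_answer` generator, not promptflow internals): when an unconsumed iterator is stored as the assistant's answer and later interpolated into the prompt with `str()`, the prompt receives the object's repr instead of the generated text, which matches the TracedIterator entries above.

```python
# Hypothetical illustration, not promptflow code: storing an unconsumed
# iterator as the assistant's answer and formatting it with str() puts the
# object's repr into the prompt instead of the text.

def stream_answer():
    yield "Hello, "
    yield "world!"

chat_history = [
    {"inputs": {"question": "hi"},
     "outputs": {"answer": stream_answer()}},  # iterator object, not its text
]

print(f"assistant: {chat_history[0]['outputs']['answer']}")
# assistant: <generator object stream_answer at 0x...>

print(f"assistant: {''.join(stream_answer())}")  # consume first -> correct text
# assistant: Hello, world!
```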

How To Reproduce the bug
Steps to reproduce the behavior, how frequent can you experience the bug:

  1. Create a new flow -> chat flow from the template.
  2. Run `pf flow test --flow new-chat-flow-created-at-2024-10-11 --interactive` and chat for more than one turn; every assistant message in chat_history except the latest shows the TracedIterator repr.

Expected behavior A clear and concise description of what you expected to happen.

Screenshots If applicable, add screenshots to help explain your problem.

Running Information(please complete the following information):

Executable '/home/xxx/anaconda3/envs/reflextion/bin/python' Python (Linux) 3.8.19 (default, Mar 20 2024, 19:58:24) [GCC 11.2.0]



Additional context
Add any other context about the problem here.
dburik commented 5 days ago

I'm experiencing the same issue with Prompt flow 1.16.1. Using the following code as a workaround:

```python
from promptflow.tracing.contracts.iterator_proxy import IteratorProxy

# The streamed LLM output arrives wrapped in an IteratorProxy; join the
# chunks it has collected so downstream code gets plain text.
if isinstance(output, IteratorProxy):
    output = "".join(output.items)
```
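A possible way to apply that workaround to the whole chat_history before it is rendered into the prompt (a sketch only; the `inputs`/`outputs`/`answer` keys follow the default chat-flow template and may differ in your flow):

```python
from promptflow.tracing.contracts.iterator_proxy import IteratorProxy

def normalize_history(chat_history):
    """Replace any IteratorProxy answers with the text chunks collected on the proxy."""
    for turn in chat_history:
        answer = turn.get("outputs", {}).get("answer")
        if isinstance(answer, IteratorProxy):
            turn["outputs"]["answer"] = "".join(answer.items)
    return chat_history
```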

Can someone from the Prompt flow team comment on this? Is this a bug, or an intended but undocumented change in the API?