run-llama / llama_deploy

Deploy your agentic workflows to production
https://docs.llamaindex.ai/en/stable/module_guides/llama_deploy/
MIT License
1.86k stars · 193 forks

fix: actually stream events from the async client #319

Closed · masci closed this 1 month ago

masci commented 1 month ago

Fixes https://github.com/run-llama/llama_deploy/issues/308
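The bug in #308 was that events were not actually streamed back from the async client. As a rough illustration of the intended behavior, here is a minimal, self-contained sketch of the async event-streaming pattern: the producer yields each event as soon as it is available instead of buffering the whole result. All names here (`stream_events`, `task_id`, the event dicts) are hypothetical stand-ins, not llama_deploy's actual client API.

```python
import asyncio
from typing import AsyncGenerator


async def stream_events(task_id: str) -> AsyncGenerator[dict, None]:
    """Hypothetical stand-in for an async client's event stream."""
    for i in range(3):
        await asyncio.sleep(0)  # simulate I/O between events
        # Yield each event as soon as it is produced, rather than
        # collecting them all and returning once the task completes.
        yield {"task_id": task_id, "index": i}


async def main() -> list[dict]:
    events = []
    async for event in stream_events("demo-task"):
        events.append(event)  # the consumer sees events incrementally
    return events


if __name__ == "__main__":
    print(asyncio.run(main()))
```

The key design point is that the client exposes an async generator, so callers can react to each event with `async for` while the task is still running.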

Changes:

coveralls commented 1 month ago

Coverage Status

coverage: 65.828% (+0.2%) from 65.592% when pulling 503387d2b8580bb17a6250b67f81fa191961cccf on massi/debug-streaming into 36adfa3e437339bb6a1a7ea16af736ab83613a65 on main.