Closed sykp241095 closed 1 month ago
@sykp241095 thanks for reporting.
This is a limitation of Colab, long story. Try anywhere except Colab and you should get the chunks back as they are generated. Let me know if that works for you.
Marking this issue as stale since it has been open for 14 days with no activity. This issue will be closed if no further activity occurs.
This issue was closed because it has been inactive for 28 days. Please post a new issue if you need further assistance. Thanks!
Description of the bug:
I have read the docs here https://ai.google.dev/api/python/google/generativeai/GenerativeModel and learned that passing
stream=True
should return the response in chunks, one by one. But when I tried the following simple code, it didn't work as expected: it does return chunks, but it seems to return all of them at the same time:

Actual vs expected behavior:
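For reference, a minimal sketch of the kind of streaming loop being described (the model name, prompt, and helper names here are illustrative assumptions, not taken from this thread). The helper times the gap between chunks, which is one way to tell whether chunks truly arrive as they are generated or are buffered and delivered all at once:

```python
import time
from typing import Iterable, List


def consume_stream(chunks: Iterable) -> List[str]:
    """Collect chunk texts, printing the arrival gap before each chunk.

    If chunks arrive incrementally, the printed gaps are non-zero;
    if the whole response was buffered, every gap is roughly 0.
    """
    texts = []
    last = time.monotonic()
    for chunk in chunks:
        now = time.monotonic()
        # Real SDK chunks expose .text; plain strings pass through.
        texts.append(getattr(chunk, "text", chunk))
        print(f"+{now - last:.2f}s: {texts[-1]!r}")
        last = now
    return texts


# With the real client (requires an API key; names per the linked docs):
#   import google.generativeai as genai
#   model = genai.GenerativeModel("gemini-pro")
#   consume_stream(model.generate_content("Tell me a story.", stream=True))


def fake_stream(delay: float = 0.05):
    """Simulated incremental stream, for trying the helper offline."""
    for part in ["Once ", "upon ", "a time."]:
        time.sleep(delay)
        yield part
```

With `fake_stream()` the helper prints a non-zero gap before each chunk; in the buffered case described in this issue, all gaps would collapse to ~0.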
No response
Any other information you'd like to share?
No response