google-gemini / generative-ai-python

The official Python library for the Google Gemini API
https://pypi.org/project/google-generativeai/
Apache License 2.0

stream=True returns all chunks at the same time [in Colab] #329

Closed sykp241095 closed 1 month ago

sykp241095 commented 2 months ago

Description of the bug:

I read the docs at https://ai.google.dev/api/python/google/generativeai/GenerativeModel, which say that passing stream=True returns the response in chunks as they are generated. However, when I try the simple code below it doesn't behave that way: it does return chunks, but they all seem to arrive at the same time:

import google.generativeai as genai

genai.configure(api_key='....')
model = genai.GenerativeModel('models/gemini-pro')

response = model.generate_content('generate a long story, more than 200 words, multiple graphs with breakline', stream=True)

for chunk in response:
    print(chunk.text)

Actual vs expected behavior:

No response

Any other information you'd like to share?

No response

MarkDaoust commented 2 months ago

@sykp241095 thanks for reporting.

This is a limitation of Colab (long story). Try running the same code anywhere other than Colab and you should get the chunks back as they are generated. Let me know if that works for you.
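For anyone else hitting this, here is a minimal sketch of the same streaming loop written to run as a plain Python script outside Colab (the script name, prompt wording, and the end=''/flush=True print arguments are illustrative additions, not part of the original report). Printing each chunk with flush=True makes the incremental arrival visible in a terminal:

# stream_demo.py -- run outside Colab, e.g. `python stream_demo.py`
import google.generativeai as genai

genai.configure(api_key='...')  # use a real API key
model = genai.GenerativeModel('models/gemini-pro')

# Request a streamed response; the iterator yields chunks as the model produces them.
response = model.generate_content(
    'Generate a long story, more than 200 words, with multiple paragraphs.',
    stream=True,
)

for chunk in response:
    # Print each chunk as soon as it arrives; flush so the terminal shows it immediately.
    print(chunk.text, end='', flush=True)
print()

Run from a terminal, the text should appear incrementally, matching the streaming behavior described above.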

github-actions[bot] commented 1 month ago

Marking this issue as stale since it has been open for 14 days with no activity. This issue will be closed if no further activity occurs.

github-actions[bot] commented 1 month ago

This issue was closed because it has been inactive for 28 days. Please post a new issue if you need further assistance. Thanks!