google-gemini / generative-ai-python

The official Python library for the Google Gemini API
https://pypi.org/project/google-generativeai/
Apache License 2.0

chat.py example not working #595

Open wwparzival opened 1 month ago

wwparzival commented 1 month ago

Description of the bug:

When I run the "chat.py" example, only the first call to "response = chat.send_message(...)" succeeds. Every subsequent "send_message" call in the multi-turn chat fails.

Debug output attached. debug.txt

gmKeshari commented 1 month ago

Hi @wwparzival

It seems a Content object is being passed to the send_message method instead of one of the expected input types. I recommend following this doc.

If you still face issues, please provide a code snippet that reproduces the problem.
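
For reference, here is a minimal sketch of the input shapes send_message is generally meant to accept; the API key and prompts below are placeholders for illustration, not taken from this thread:

import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-1.5-flash")
chat = model.start_chat()

# A plain string is the simplest input for send_message.
print(chat.send_message("Hello").text)

# A list of parts also works; the SDK wraps it into a Content object for you.
print(chat.send_message(["In one sentence:", "why is the sky blue?"]).text)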

wwparzival commented 1 month ago

Hi @gmKeshari,

I used the same code as in the "chat.py" example.

import google.generativeai as genai

model = genai.GenerativeModel("gemini-1.5-flash")
chat = model.start_chat(
    history=[
        {"role": "user", "parts": "Hello"},
        {"role": "model", "parts": "Great to meet you. What would you like to know?"},
    ]
)
response = chat.send_message("I have 2 dogs in my house.")
print(response.text)
response = chat.send_message("How many paws are in my house?")
print(response.text)

When I execute this code, I get the error output I already posted in my previous message.

gmKeshari commented 1 month ago

Hi @wwparzival,

The same code works fine for me. Can you try following this gist file?
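
It may also help to print the chat history after the first reply; assuming the SDK's ChatSession.history attribute, a sketch like this (reusing the chat object from your snippet) shows whether the model turn was stored correctly before the second call:

# Debugging sketch: inspect the stored turns before the second send_message call.
response = chat.send_message("I have 2 dogs in my house.")
print(response.text)

for content in chat.history:
    # Each entry should be a Content with a role ("user" or "model") and its parts.
    print(content.role, [type(part).__name__ for part in content.parts])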

Try upgrading the SDK or temporarily switching to another model.
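
For example (the API key and the alternate model name below are placeholders, not something verified in this thread):

import google.generativeai as genai

print(genai.__version__)                 # check the installed SDK version first
# pip install -U google-generativeai    # upgrade if it is behind the latest release

genai.configure(api_key="YOUR_API_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-1.5-pro")  # alternate model, for comparison only
chat = model.start_chat()
print(chat.send_message("Hello").text)
print(chat.send_message("And a second turn?").text)  # the second turn is where the failure appeared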

Let us know if you are still facing the same issue.