This may be caused by the size of your image. The maximum request payload size supported by gemini-pro-vision is 4 MB.
I'm running into this as well when inputting images that are less than 4 MB. I'm using the gemini-pro-vision model, which throws the following stack trace on inputs under 4 MB:
com.google.ai.client.generativeai.type.ServerException: Request payload size exceeds the limit: 4194304 bytes.
I noticed the sample app uses 768x768 images, which do work.
Hmm that's interesting. Would you be able to provide an example image that causes this?
So all I'm doing is capturing an image using MediaStore's ACTION_IMAGE_CAPTURE intent. This creates a ~6 MB picture on my test device. I'm also focusing on popular 4:3 aspect ratios, so I scale the image down to meet the 4 MB limit while keeping the ratio.
2048x1536 does not work, even though it is only 2.5 MB on the file system.
The largest image I have gotten to work is 1152x864, at 1.8 MB.
This is all with only one image in the content builder and no text prompt. I have tested with .jpg and .bmp; the JPEG compression was:
bmp.compress(Bitmap.CompressFormat.JPEG, 80, stream)
Interestingly, if I take a completely black picture, it does process at the higher resolutions, and payloads larger than 4 MB are still correctly rejected.
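A minimal sketch of the downscale-and-compress approach described above, assuming you want to stay under a byte budget with headroom for the Base64 expansion discussed below; compressToFit and the quality default of 80 are illustrative, not SDK API:

import android.graphics.Bitmap
import java.io.ByteArrayOutputStream

// Hypothetical helper: repeatedly halve the bitmap's dimensions (which
// preserves the aspect ratio) until the JPEG encoding fits under maxBytes.
fun compressToFit(src: Bitmap, maxBytes: Int, quality: Int = 80): ByteArray {
    var bitmap = src
    while (true) {
        val stream = ByteArrayOutputStream()
        bitmap.compress(Bitmap.CompressFormat.JPEG, quality, stream)
        val bytes = stream.toByteArray()
        if (bytes.size <= maxBytes) return bytes
        bitmap = Bitmap.createScaledBitmap(bitmap, bitmap.width / 2, bitmap.height / 2, true)
    }
}

For example, compressToFit(bitmap, 3_000_000) leaves room for the ~35% Base64 expansion under the 4,194,304-byte limit (3 MB x 1.35 is roughly 4.05 MB).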
The ImagePart is provided as a convenience to make sending images easier; it automatically converts the image to PNG format at maximum quality. If you are custom-compressing the image to fit a size limit, it may be better to use BlobPart, so you can manually specify and control the encoding and compression.
Also of note: images are converted to Base64 while being sent to the server, increasing their size by a further ~35%.
However, your example of a file that is 2.5 MB on the file system should work if sent as a BlobPart (since that would be approximately 3.4 MB after encoding).
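A quick plain-Kotlin check of that arithmetic (Base64 maps every 3 bytes to 4 characters, so the raw overhead is ~33%; transport framing pushes it toward the ~35% figure above):

import java.util.Base64

fun main() {
    val raw = ByteArray(2_500_000)                // ~2.5 MB of image bytes
    val encoded = Base64.getEncoder().encode(raw)
    println(encoded.size)                         // 3333336: every 3 bytes become 4
    println(encoded.size.toDouble() / raw.size)   // ~1.33x before any framing overhead
}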
Your code example can be changed like this:

val baos = ByteArrayOutputStream()
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, baos)
val inputContent = content {
    blob("image/jpeg", baos.toByteArray())
    text("Read text from image")
}
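Note that quality-100 JPEG can still exceed the limit for large captures; once you are on the BlobPart path, the quality argument (80 in the snippet earlier in the thread) is the knob that trades payload size against fidelity.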
Thanks for the response, that does answer some questions. I think in that case the Android documentation is a bit misleading: https://ai.google.dev/docs/gemini_api_overview#image_requirements
"Images must be in one of the following image data MIME types:
PNG - image/png JPEG - image/jpeg WEBP - image/webp HEIC - image/heic HEIF - image/heif"
From what I can find, the documentation doesn't describe BlobPart, and the code documentation doesn't differentiate between BlobPart and ImagePart; it only says 'represents binary data' or 'represents image data'. Is there more documentation on this?
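For anyone comparing the two part types, a minimal sketch of the difference described above, using the SDK's content builder DSL (treat the exact builder signatures as an assumption; bitmap and jpegBytes are placeholder values):

// ImagePart: the SDK re-encodes the Bitmap as maximum-quality PNG,
// so the on-disk size of your source file says little about the payload.
val viaImagePart = content {
    image(bitmap)                 // encoding chosen by the SDK
    text("Read text from image")
}

// BlobPart: you pick the MIME type and compression, so the byte count
// you measure (plus Base64 expansion) is what actually gets sent.
val viaBlobPart = content {
    blob("image/jpeg", jpegBytes) // encoding chosen by you
    text("Read text from image")
}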