Hi, I am trying to ask the model to summarize a random document, but the response comes back as a 500 from the Google API. Given the context window limit of 1 million tokens, is there any limit on an individual request or on the length of a single part?
Thanks for your report. There are several layers of software involved here; are you able to reproduce this issue when using the Google AI Go SDK directly? And if so, are you able to reproduce it using curl?
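For the curl check, something like the sketch below would be a useful data point, since it bypasses every SDK layer. This assumes the public `generativelanguage.googleapis.com` REST endpoint; the model name and the inline payload are placeholders, so substitute your attached `gemini-request-body.json` to reproduce the actual failure:

```shell
# Write a minimal placeholder request body (swap in gemini-request-body.json
# to reproduce the real 500).
cat > /tmp/min-request.json <<'EOF'
{
  "contents": [
    { "parts": [ { "text": "Summarize the attached document." } ] }
  ]
}
EOF

# Send it straight to the REST API, printing only the HTTP status code.
# Model name is an example; requires GEMINI_API_KEY to be set.
if [ -n "$GEMINI_API_KEY" ]; then
  curl -sS -o /dev/null -w '%{http_code}\n' \
    -H 'Content-Type: application/json' \
    -d @/tmp/min-request.json \
    "https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key=${GEMINI_API_KEY}"
fi
```

If curl also returns a 500 with the same body, the problem is on the API side rather than in the SDK layers.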
The request body is attached. gemini-request-body.json
Thanks,