Closed: rajdeepint closed this issue 1 month ago
@rajdeepint: the notebook you referred to above is for the Google AI Gemini API, which is different from the Vertex AI Gemini API.

For `response_logprobs`, can you please try it again to see if you still get the "Unknown field" error? This is a preview feature, and it is only available in some models (https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/inference).
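For reference, here is a minimal sketch of requesting logprobs through the Vertex AI SDK with a typed `GenerationConfig`. It assumes a recent google-cloud-aiplatform release that already exposes `response_logprobs`/`logprobs` and a model version that supports the preview feature; the project ID, region, and prompt are placeholders:

```python
# Minimal sketch: request token log probabilities via the Vertex AI SDK.
# Assumes a recent google-cloud-aiplatform release that exposes
# response_logprobs/logprobs on GenerationConfig, and a model that supports
# the preview feature. Project, region, and prompt are placeholders.
import vertexai
from vertexai.generative_models import GenerativeModel, GenerationConfig

vertexai.init(project="my-project-id", location="us-central1")

model = GenerativeModel("gemini-1.5-flash")
response = model.generate_content(
    "Why is the sky blue?",
    generation_config=GenerationConfig(
        temperature=0,
        max_output_tokens=25,
        response_logprobs=True,  # preview feature
        logprobs=3,              # how many top alternative tokens to return per step
    ),
)
# When the model supports the feature, the returned candidate carries the logprobs data.
print(response.candidates[0])
```

Passing a typed `GenerationConfig` instead of a plain dict should also surface an immediate `TypeError` if the installed SDK does not know a field, which makes version mismatches easier to spot.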
Hey @gericdong! I'm getting `Unknown field for GenerationConfig: response_logprobs` in Vertex AI.

This is how I'm setting the model up. I'm using gemini-1.5-flash, which should support the feature according to your documentation:
```python
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-project", location="us-central1")  # added for a complete repro; placeholder project/region

generation_config = {
    "temperature": 0,
    "top_k": 1,
    "max_output_tokens": 25,
    "response_logprobs": True,  # the key reported as "Unknown field"
}

model = GenerativeModel(model_name="gemini-1.5-flash", generation_config=generation_config)

prompt = "Why is the sky blue?"  # added for a complete repro; placeholder prompt
response = model.generate_content(prompt)
```
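As a side note (not from the thread): the "Unknown field for GenerationConfig" message is typically raised client-side while the SDK converts the dict into the GenerationConfig proto, so it can simply mean that the installed google-cloud-aiplatform version predates the field. A quick sketch for checking the installed version before concluding that the API itself changed:

```python
# Sketch: check which google-cloud-aiplatform version is installed.
# If it predates the release that added response_logprobs, the dict key
# is rejected locally during proto conversion rather than by the API.
from importlib.metadata import version

print(version("google-cloud-aiplatform"))
# If the version is old, upgrading the package
# (e.g. `pip install --upgrade google-cloud-aiplatform`) may resolve the error.
```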
Unable to get response_logprobs in response body - unrecognized parameter

### Erroneous Behavior
According to this cookbook, the following should work -

But I receive the following error -

This tells me that the API has changed.

Additionally, my company's internal offering returns the following error if I replicate the `generation_config` line -

Although it works if I replace `generation_config` with other parameters such as -