GoogleCloudPlatform / vertex-ai-samples

Notebooks, code samples, sample apps, and other resources that demonstrate how to use, develop and manage machine learning and generative AI workflows using Google Cloud Vertex AI.
https://cloud.google.com/vertex-ai
Apache License 2.0

Logprobs does not work with Python #3629

Closed rajdeepint closed 1 month ago

rajdeepint commented 1 month ago

Unable to get response_logprobs in response body - unrecognized parameter

Erroneous Behavior

According to this cookbook, the following should work -

import google.generativeai as genai

model_name = "gemini-1.5-flash"  # example value; not shown in the original snippet
model = genai.GenerativeModel(model_name)
test_prompt="Why don't people have tails?"

response = model.generate_content(
    test_prompt,
    generation_config=dict(response_logprobs=True, logprobs=5)
)

But I receive the following error -

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[79], line 1
----> 1 model = genai.GenerativeModel(model_name)
      2 test_prompt="Why don't people have tails?"
      4 response = model.generate_content(
      5     test_prompt,
      6     generation_config=dict(response_logprobs=True, logprobs=5)
      7 )

AttributeError: module 'google.generativeai' has no attribute 'GenerativeModel'

This tells me that the API has changed.
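
A quick check of the installed SDK (a minimal sketch; GenerativeModel only exists in newer google-generativeai releases, so this rules out a stale install) -

import google.generativeai as genai

# Print the installed google-generativeai version and whether it exposes
# GenerativeModel; if it does not, upgrade with: pip install -U google-generativeai
print(genai.__version__)
print(hasattr(genai, "GenerativeModel"))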

Additionally, my company's internal Vertex AI offering also fails when I replicate the generation_config line -

import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="dev-poc-xxxyyy", location="x-y")
model = GenerativeModel(
    "gemini-1.5-flash-001",
)
responses = model.generate_content(
    [prompt],  # prompt and safety_settings are defined elsewhere in my code
    generation_config=dict(response_logprobs=True, logprobs=5),
    safety_settings=safety_settings,
    stream=False
)

I get the following error -

ValueError: Unknown field for GenerationConfig: response_logprobs

It does work, however, if I replace generation_config with other parameters such as -

generation_config = {
    "candidate_count": 1,
    "max_output_tokens": 1024,
    "temperature": 0.2,
    "top_p": 0.8
}
gericdong commented 1 month ago

@rajdeepint: the notebook you referred to above is for the Google AI Gemini API, which is different from the Vertex AI Gemini API.

For response_logprobs, can you please try again and check whether you still see the "Unknown field" error? This is a preview feature, and it is only available in some models (https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/inference).
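
For reference, this is the request shape I would expect to work on Vertex AI (a minimal sketch, assuming a recent google-cloud-aiplatform release that exposes response_logprobs/logprobs on GenerationConfig; the project ID, region, and model version below are placeholders) -

import vertexai
from vertexai.generative_models import GenerationConfig, GenerativeModel

# Placeholders: use your own project ID and region.
vertexai.init(project="your-project-id", location="us-central1")

model = GenerativeModel("gemini-1.5-flash-001")
response = model.generate_content(
    "Why don't people have tails?",
    generation_config=GenerationConfig(
        response_logprobs=True,  # return log probabilities for the chosen tokens
        logprobs=5,              # also return the top 5 alternative tokens per step
    ),
)

# The log probabilities come back on the candidate (logprobsResult in the REST
# response); printing the whole candidate dict avoids guessing the exact attribute.
print(response.to_dict()["candidates"][0])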

marcros commented 2 weeks ago

Hey @gericdong! I'm getting the "Unknown field for GenerationConfig: response_logprobs" error in Vertex AI.

This is how I'm setting up the model. I'm using gemini-1.5-flash, which should support the feature according to your documentation:

import vertexai
from vertexai.generative_models import GenerativeModel

# Assumes vertexai.init(project=..., location=...) has already been called
# and that prompt is a plain string.
generation_config = {
    "temperature": 0,
    "top_k": 1,
    "max_output_tokens": 25,
    "response_logprobs": True
}

model = GenerativeModel(model_name="gemini-1.5-flash", generation_config=generation_config)
response = model.generate_content(prompt)
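
One thing worth ruling out (an assumption on my part, not confirmed in this thread): response_logprobs was only added to GenerationConfig in recent google-cloud-aiplatform releases, and older installs reject it with exactly this "Unknown field" error. A minimal version check -

from google.cloud import aiplatform

# Older releases of google-cloud-aiplatform do not know about response_logprobs;
# upgrade if this prints an old version:
#   pip install -U google-cloud-aiplatform
print(aiplatform.__version__)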