Closed: mdhicks-colpal closed this issue 7 months ago.
If it’s like Text prediction (that’s where I’ve been playing), grounding is exposed through the grounding_source param in the SDK. The Vertex docs are incorrect. See https://github.com/googleapis/python-aiplatform/issues/3077
I believe that's correct, but that param is buried deep in the SDK calls in this case and my attempts to shim it in so far have been fruitless.
When I add a grounding_source argument to chat.send_message, I cannot add other parameters like top_p and top_k; it takes only two arguments, and the grounding config gives this error in parameters. Please suggest a way to add grounding sources as well as pass the other parameters in chat.send_message.
Hi @JashSureja,
top_p and top_k are at the same level as grounding_source for chat.send_message. Please see the method signature here.
Similarly, grounding_source is at the same level as top_p and top_k for TextGenerationModel.predict; see the method signature here.
Hi all,
Regarding send_message_streaming: at the moment, grounding is not supported for streaming. We will post an update in the future on streaming support.
The sample code from the console on grounding is misleading; I have already contacted the team to update it. Other users also reported the misleading sample in issue #3077.
For now, please use the following for grounding on TextGenerationModel. If you see other users misled by the sample code, I would appreciate it if you could point them to this comment. Thank you.
```python
from vertexai.language_models import TextGenerationModel, GroundingSource
from google.colab import auth as google_auth
import vertexai

PROJECT_ID = "your project"
LOCATION = "us-central1"
google_auth.authenticate_user(project_id=PROJECT_ID)
vertexai.init(project=PROJECT_ID, location=LOCATION)

text_model_name = "text-bison@001"
prompt = "what is vertex ai"
text_model = TextGenerationModel.from_pretrained(text_model_name)

data_store_id = "your data store ID"  # example: "google-cloud_1690936150456"
vertex_location = "your data store location"  # example: "global"
grounding_source = GroundingSource.VertexAISearch(
    data_store_id=data_store_id, location=vertex_location
)

es_res = text_model.predict(
    prompt,
    top_k=10,
    top_p=0.9,
    grounding_source=grounding_source,
)
es_res.grounding_metadata, es_res
```
The above code should output something like the following (the exact answer will vary depending on your grounding source):
```
(GroundingMetadata(citations=[GroundingCitation(start_index=0, end_index=107, url='https://cloud.google.com/vertex-ai/docs/start/introduction-unified-platform', title=None, license=None, publication_date=None)], search_queries=['Vertex AI']),
Vertex AI is a machine learning (ML) platform that lets you train and deploy ML models and AI applications. Vertex AI combines data preparation, model training, model serving, and model monitoring into a single platform.)
```
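As a side note, the citations in grounding_metadata carry character offsets into the response text, so you can pair each grounded span with its source URL. A minimal sketch of that post-processing, using plain stand-in classes shaped like the fields printed above (the real objects come back on the SDK's prediction response; this helper is not part of the SDK):

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Stand-in mirroring the GroundingCitation fields shown in the output above.
@dataclass
class Citation:
    start_index: int
    end_index: int
    url: Optional[str]

def cited_spans(text: str, citations: List[Citation]) -> List[Tuple[str, Optional[str]]]:
    """Pair each cited slice of the response text with its source URL."""
    return [(text[c.start_index:c.end_index], c.url) for c in citations]

answer = "Vertex AI is a machine learning (ML) platform."
cites = [Citation(start_index=0, end_index=46,
                  url="https://cloud.google.com/vertex-ai/docs/start/introduction-unified-platform")]
for span, url in cited_spans(answer, cites):
    print(f"{span!r} -> {url}")
```

With a real response you would pass es_res.grounding_metadata.citations and str(es_res) instead of the stub values.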
For chat, the usage is the same:
```python
from vertexai.language_models import ChatModel, GroundingSource
from google.colab import auth as google_auth
import vertexai

PROJECT_ID = "your project"
LOCATION = "us-central1"
google_auth.authenticate_user(project_id=PROJECT_ID)
vertexai.init(project=PROJECT_ID, location=LOCATION)

chat_model_name = "chat-bison@001"
prompt = "what is vertex ai"
chat_model = ChatModel.from_pretrained(chat_model_name)

data_store_id = "your data store ID"  # example: "google-cloud_1690936150456"
vertex_location = "your data store location"  # example: "global"
chat = chat_model.start_chat()
grounding_source = GroundingSource.VertexAISearch(
    data_store_id=data_store_id, location=vertex_location
)

es_res_chat = chat.send_message(
    prompt,
    top_k=10,
    top_p=0.9,
    grounding_source=grounding_source,
)
es_res_chat.grounding_metadata, es_res_chat
```
The above code should output something like the following (it may not be exactly the same, depending on your Vertex AI Search data store):
```
(GroundingMetadata(citations=[GroundingCitation(start_index=0, end_index=107, url='https://cloud.google.com/vertex-ai/docs/start/introduction-unified-platform', title=None, license=None, publication_date=None)], search_queries=['Vertex AI']),
Vertex AI is a machine learning (ML) platform that lets you train and deploy ML models and AI applications. Vertex AI combines data preparation, model training, model serving, and model monitoring into a single unified environment.)
```
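If you want footnote-style markers rather than raw offsets, the same end_index values can be used to splice citation numbers into the answer. A hedged sketch with stub dict data (field names taken from the metadata printed above; this is not an official SDK utility):

```python
def add_footnotes(text: str, citations) -> str:
    """Insert [n] markers after each cited span. We splice from right to
    left so that earlier character offsets remain valid as we insert."""
    out = text
    for n, c in sorted(enumerate(citations, start=1),
                       key=lambda pair: pair[1]["end_index"], reverse=True):
        out = out[:c["end_index"]] + f"[{n}]" + out[c["end_index"]:]
    return out

# Plain dicts shaped like GroundingCitation fields (hypothetical values):
cites = [{"end_index": 9, "url": "https://example.com/a"},
         {"end_index": 20, "url": "https://example.com/b"}]
print(add_footnotes("Vertex AI is a model platform.", cites))
```

Running this prints "Vertex AI[1] is a model[2] platform."; with a real response you would build the dicts (or read attributes directly) from es_res_chat.grounding_metadata.citations.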
Hi all, since there has been no comment after my last update, I'm closing this issue now. Feel free to re-open if you have further questions.
Using the example code for Vertex AI with grounding fails on the latest Python SDK when streaming. Code and stack trace (indicating an unhandled param) below.
If you are still having issues, please be sure to include as much information as possible:
Environment details
google-cloud-aiplatform version: 1.39.0
Steps to reproduce
Code example
Stack trace