Hello, I was wondering if there is a way to use LangChain with the system instructions supported by the new Gemini models (for example, gemini-1.5-flash-001), as shown in this example:
import vertexai
from vertexai.generative_models import GenerativeModel
vertexai.init(project=project_id, location="us-central1")
model = GenerativeModel(
    model_name="gemini-1.5-flash-preview-0514",
    system_instruction=[
        "You are a helpful language translator.",
        "Your mission is to translate text in English to French.",
    ],
)
prompt = """
User input: I like bagels.
Answer:
"""
contents = [prompt]
response = model.generate_content(contents)
print(response.text)
ref: https://cloud.google.com/vertex-ai/generative-ai/docs/samples/generativeaionvertexai-gemini-system-instruction?hl=es-419