Closed: jmugan closed this 4 months ago
Hey @jmugan,
I just did a clean install in a new venv and can't replicate this issue:
python3 -m venv venv
source venv/bin/activate
pip install -U langchain-google-vertexai
python3 240.py
answer='A pound of bricks and a pound of feathers weigh the same.' justification='They both weigh one pound.'
Also, there are two things that could be improved:

1. llm.with_structured_output(AnswerWithJustification).invoke("text") will return an instance of AnswerWithJustification, so response.content will fail, as AnswerWithJustification doesn't have a content field.
2. Use a stable model version such as gemini-1.0-pro-002: https://cloud.google.com/vertex-ai/generative-ai/docs/learn/model-versioning#stable-version

So the code will look like this:
from langchain_google_vertexai import ChatVertexAI
from langchain_core.pydantic_v1 import BaseModel
class AnswerWithJustification(BaseModel):
    '''An answer to the user question along with justification for the answer.'''
    answer: str
    justification: str
llm = ChatVertexAI(model="gemini-1.0-pro-002", temperature=0)
structured_llm = llm.with_structured_output(AnswerWithJustification)
response = structured_llm.invoke("What weighs more a pound of bricks or a pound of feathers")
print(response)
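To illustrate the first point above, here is a minimal sketch of why response.content fails on the returned object. It uses a stdlib dataclass as a stand-in for the pydantic model so it runs without the Vertex AI dependency:

```python
from dataclasses import dataclass

# Stand-in for the pydantic AnswerWithJustification model: with_structured_output
# returns a typed instance like this, not a chat message with a .content field.
@dataclass
class AnswerWithJustification:
    answer: str
    justification: str

resp = AnswerWithJustification(
    answer="A pound of bricks and a pound of feathers weigh the same.",
    justification="They both weigh one pound.",
)
print(resp.answer)               # read the typed fields directly
print(hasattr(resp, "content"))  # False: there is no .content attribute
```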
Thanks for your response. That code gives me the same error. It runs with no problem in a new environment, but I'm trying to use it in an existing environment, so there must be something wrong with the compiled protobuf code. It presumably needs to be regenerated against a newer version of some library, but I can't figure out which one. I don't know why Google loves protobuf so much.
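One way to narrow down which library is mismatched is to print the installed versions of the protobuf stack in both the working and the broken environment and diff them. A small helper sketch (the package list is my guess, not from this thread):

```python
from importlib.metadata import version, PackageNotFoundError

def report_versions(packages):
    """Map each package name to its installed version, or None if absent."""
    out = {}
    for name in packages:
        try:
            out[name] = version(name)
        except PackageNotFoundError:
            out[name] = None
    return out

if __name__ == "__main__":
    # Packages commonly involved in Vertex AI protobuf version mismatches.
    for name, ver in report_versions(
        ["protobuf", "googleapis-common-protos",
         "google-cloud-aiplatform", "langchain-google-vertexai"]
    ).items():
        print(f"{name}: {ver or 'not installed'}")
```

Running this in both environments and comparing the output should point at the package whose version differs.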
Maybe try pip uninstall on the libraries in question (incl. google-cloud-aiplatform) from your env first and then install them again.
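As a sketch of that suggestion (the exact package list is an assumption; adjust it to whatever your environment actually pulls in), the uninstall/reinstall cycle might look like:

```shell
# Remove the protobuf stack so no stale generated code survives,
# then reinstall so all packages resolve to mutually compatible versions.
pip uninstall -y protobuf googleapis-common-protos google-cloud-aiplatform langchain-google-vertexai
pip install -U langchain-google-vertexai
```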
Closing it for now, since it doesn't look like the issue is on the integration side.
I get an error from this code. Probably a version mismatch with protobuf silliness. I did pip install -U on everything I could think of. Any suggestions?