Closed Naznouz closed 1 week ago
@Naznouz Please run pip freeze in your environment and reply with the versions of proto-plus, google-api-core, and protobuf.
pip freeze:
...
proto-plus==1.23.0
protobuf==3.20.3
...
google-api-core @ file:///Users/ec2-user/ci_py311/google-api-core_1678380873961/work
google-api-python-client==2.97.0
google-auth==2.25.2
google-auth-httplib2==0.1.0
google-cloud-aiplatform @ file:///Users/runner/miniforge3/conda-bld/google-cloud-aiplatform_1702532890117/work
google-cloud-bigquery==3.14.0
google-cloud-core @ file:///Users/ec2-user/ci_py311/google-cloud-core_1678391961515/work
google-cloud-resource-manager==1.11.0
google-cloud-storage==2.14.0
google-crc32c @ file:///Users/ec2-user/ci_py311/google-crc32c_1678380922854/work
google-resumable-media @ file:///Users/ec2-user/ci_py311/google-resumable-media_1678391983737/work
googleapis-common-protos @ file:///Users/ec2-user/ci_py311/googleapis-common-protos-feedstock_1678326429456/work
...
I wasn't able to reproduce the issue with:
proto-plus==1.23.0
protobuf==3.20.3
Can you provide the closest PyPI versions for the dependencies that are installed from local paths?
@Naznouz This error is a bit puzzling. Could you please check the type of the raw part:
print(type(response.candidates[0].content.parts[0]._raw_part))
Here is the result. Code used:
import vertexai
from vertexai.preview.generative_models import GenerativeModel  # , Part

def generate():
    """Generates text using the Generative Model."""
    # Initialize Vertex AI
    vertexai.init(project="upgradecode", location="us-central1")
    model = GenerativeModel("gemini-pro")
    responses = model.generate_content(
        """hi. I would like to know the 10 first items of the fibonacci
        sequence.""",
        generation_config={
            "max_output_tokens": 2048,
            "temperature": 0.9,
            "top_p": 1,
        },
        stream=True,
    )
    for i, response in enumerate(responses):
        print("Response `content.parts`:", i)
        print(response.candidates[0].content.parts)
        # print(repr(response.candidates[0].content.parts[0]).split("\"")[1])
        print("Response `content._raw_part`:")
        print(response.candidates[0].content.parts[0]._raw_part)
        print("Response `content.text`:")
        print(response.candidates[0].content.parts[0].text)

generate()
Result:
Response `content.parts`: 0
[text: "1. 0\n2. 1\n3. 1 \n"
]
Response `content._raw_part`:
text: "1. 0\n2. 1\n3. 1 \n"
Response `content.text`:
Traceback (most recent call last):
File "/Users/nizarayed/Documents/002-git/dolibarr-invoice/gcloud_ai.py", line 32, in <module>
generate()
File "/Users/nizarayed/Documents/002-git/dolibarr-invoice/gcloud_ai.py", line 29, in generate
print(response.candidates[0].content.parts[0].text)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/nizarayed/anaconda3/lib/python3.11/site-packages/vertexai/generative_models/_generative_models.py", line 1508, in text
if "text" not in self._raw_part:
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: argument of type 'Part' is not iterable
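The traceback makes sense if the object behind _raw_part does not define __contains__: Python's `in` operator falls back to iteration, and a non-iterable object raises exactly this TypeError. The following is a minimal sketch with hypothetical stand-in classes (not the real protobuf or proto-plus types) that reproduces the mechanics:

```python
class RawPart:
    """Mimics a generated google.protobuf message: no __contains__, no __iter__."""
    text = "1. 0\n2. 1\n3. 1 \n"

class WrappedPart:
    """Mimics a proto-plus message, which defines __contains__ for set fields."""
    text = "1. 0\n2. 1\n3. 1 \n"
    def __contains__(self, key):
        return key == "text"

# With no __contains__ and no __iter__, the membership test raises the
# same TypeError seen in the traceback above.
try:
    "text" in RawPart()
except TypeError as e:
    print(e)  # argument of type 'RawPart' is not iterable

# The proto-plus-style wrapper supports the membership test directly.
print("text" in WrappedPart())  # True
```

This suggests the failing environments somehow hand the SDK a raw message rather than the proto-plus wrapper the `.text` property expects.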
Maybe some additional information would help: I'm using a conda environment, and here is the list of proto-related libraries:
$ pip freeze | grep proto
googleapis-common-protos @ file:///Users/ec2-user/ci_py311/googleapis-common-protos-feedstock_1678326429456/work
proto-plus==1.23.0
protobuf==3.20.3
$ conda list proto
# packages in environment at /xxx/xxx/anaconda3:
#
# Name Version Build Channel
googleapis-common-protos 1.60.0 pypi_0 pypi
googleapis-common-protos-grpc 1.56.4 py311hecd8cb5_0
libprotobuf 3.20.3 hfff2838_0
proto-plus 1.23.0 pypi_0 pypi
protobuf 4.24.1 pypi_0 pypi
I have made another test using a virtual env and it was fine. So maybe the issue is related to the conda package google-cloud-aiplatform (macOS):
$ conda list google
# packages in environment at /xxx/xxx/anaconda3:
#
# Name Version Build Channel
google-api-core 2.11.1 pypi_0 pypi
google-api-core-grpc 2.10.1 py311hecd8cb5_0
google-api-python-client 2.97.0 pypi_0 pypi
google-auth 2.25.2 pypi_0 pypi
google-auth-httplib2 0.1.0 pypi_0 pypi
google-cloud-aiplatform 1.38.0 pypi_0 pypi
google-cloud-bigquery 3.14.0 pypi_0 pypi
google-cloud-bigquery-core 3.14.1 pyhd8ed1ab_1 conda-forge
google-cloud-core 2.4.1 pypi_0 pypi
google-cloud-resource-manager 1.11.0 pypi_0 pypi
google-cloud-storage 2.14.0 pypi_0 pypi
google-crc32c 1.5.0 py311h6c40b1e_0
google-resumable-media 2.7.0 pypi_0 pypi
googleapis-common-protos 1.60.0 pypi_0 pypi
googleapis-common-protos-grpc 1.56.4 py311hecd8cb5_0
grpc-google-iam-v1 0.13.0 pyhd8ed1ab_0 conda-forge
Do you still have any environment where the issue reproduces? I'd be happy to investigate and fix it.
Part._raw_part is (or at least should be) a proto message object, and such objects have supported __contains__ ever since the proto-plus library was created, so I'm not sure how this error can occur.
I get the same error by just executing the Python code from the playground in my Google Cloud Shell.
Just tried Colab:
google-cloud-aiplatform==1.39.0
googleapis-common-protos==1.62.0
proto-plus==1.23.0
protobuf==3.20.3
@Naznouz What does print(type(response.candidates[0].content.parts[0]._raw_part)) print?
@chrissie303 What does print(type(response.candidates[0].content.parts[0]._raw_part)) print?
> @Naznouz What does print(type(response.candidates[0].content.parts[0]._raw_part)) print?
<class 'Part'>
I believe it worked on Colab because Colab is not based on a conda environment, unlike my case. In addition, what @chrissie303 mentioned relates to Google Cloud Shell. @chrissie303: could you please tell us what these shell commands print?
$ pip freeze | grep google
and
$ pip freeze | grep proto
Same problem here. I have Anaconda, so it is probably related to that.
pip freeze | grep google
gives
google-api-core @ file:///opt/conda/conda-bld/google-api-core_1663638620529/work
google-api-python-client==2.7.0
google-auth==2.21.0
google-auth-httplib2==0.1.0
google-auth-oauthlib==1.0.0
google-cloud-aiplatform==1.42.1
google-cloud-bigquery==3.17.2
google-cloud-core @ file:///croot/google-cloud-core_1666887720149/work
google-cloud-resource-manager==1.6.3
google-cloud-storage @ file:///croot/google-cloud-storage_1668473937338/work
google-crc32c @ file:///croot/google-crc32c_1667946620152/work
google-pasta==0.2.0
google-resumable-media @ file:///croot/google-resumable-media_1668208190857/work
google-search-results==2.4.1
googleapis-common-protos @ file:///opt/conda/conda-bld/googleapis-common-protos-feedstock_1660256159732/work
grpc-google-iam-v1==0.13.0
and
pip freeze | grep proto
gives
proto-plus==1.23.0
protobuf==4.23.3
wsproto==1.2.0
I had to do response.candidates[0].content.parts[0]._raw_part.text to get the response.
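The workaround above can be wrapped in a small helper that falls back to the raw part only when the wrapper's .text property fails. This is a defensive sketch using a hypothetical StubPart stand-in (not the real vertexai Part class), so the failing membership test can be simulated without the SDK installed:

```python
class StubRaw:
    """Stand-in for the raw part; has .text but no __contains__/__iter__."""
    text = "1. 0\n2. 1\n3. 1 \n"

class StubPart:
    """Stand-in for vertexai's Part, reproducing the failing .text property."""
    _raw_part = StubRaw()

    @property
    def text(self):
        # Simulates the SDK's membership check that raises TypeError
        # when _raw_part is not iterable.
        if "text" not in self._raw_part:
            return ""
        return self._raw_part.text

def safe_text(part):
    """Prefer the normal .text property; fall back to the raw part's field."""
    try:
        return part.text
    except TypeError:
        # Matches the workaround from this thread: read the field directly.
        return part._raw_part.text

print(safe_text(StubPart()))
```

This keeps the normal code path for healthy environments while still returning text where the bug bites.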
Check the link https://github.com/google-gemini/generative-ai-python/issues/196#issuecomment-2081481592; I was able to solve that issue. You cannot directly access the text from the response.
I hit the same thing: the check if "text" not in self._raw_part raises "TypeError: argument of type 'Part' is not iterable" in the Cloud Shell editor. Can you fix this issue?
I've tried to reproduce this using Anaconda and could not repro:
$ conda create -n issue_3129 python==3.11
$ conda activate issue_3129
$ conda install google-cloud-aiplatform==1.42.1
$ python3 -c '
from vertexai.generative_models import GenerativeModel
model = GenerativeModel("gemini-pro")
response = model.generate_content("what is life?")
print(response.text)
'
"What is life?" is a profound question that has been pondered by philosophers and scientists for centuries. There is no single answer that everyone agrees on, as the concept of life is complex and multifaceted. However, here are some different perspectives on the nature of life:
Packages:
google-cloud-aiplatform 1.42.1 pyhd8ed1ab_0 conda-forge
libprotobuf 4.25.3 h08a7969_0 conda-forge
proto-plus 1.23.0 pyhd8ed1ab_0 conda-forge
protobuf 4.25.3 pypi_0 pypi
@jmugan @nityasri @Naznouz
Thank you for your patience. Can you please test what the following prints on your systems?
print(type(response.candidates[0].content.parts[0]._raw_part).__module__)
print(type(response.candidates[0].content.parts[0]._raw_part).__contains__)
print(response.candidates[0].content.parts[0]._raw_part.__contains__)
print(response.candidates[0].content.parts[0]._raw_part.__contains__("text"))
Here is what it prints for me:
>>> print(type(response.candidates[0].content.parts[0]._raw_part).__module__)
google.cloud.aiplatform_v1beta1.types.content
>>> print(type(response.candidates[0].content.parts[0]._raw_part).__contains__)
<function Message.__contains__ at 0x7f06d2529440>
>>> print(response.candidates[0].content.parts[0]._raw_part.__contains__)
<bound method Message.__contains__ of text: "Life is a complex and multifaceted concept...
>>> print(response.candidates[0].content.parts[0]._raw_part.__contains__("text"))
True
@Ark-kun I don't have the latest code base, and I get this response for the print commands:
print(type(response.candidates[0].content.parts[0]._raw_part).__module__)
print(type(response.candidates[0].content.parts[0]._raw_part), dir(response.candidates[0].content.parts[0]._raw_part))
it prints
None
<class 'Part'> ['ByteSize', 'Clear', 'ClearExtension', 'ClearField', 'CopyFrom', 'DESCRIPTOR', 'DiscardUnknownFields', 'Extensions', 'FindInitializationErrors', 'FromString', 'HasExtension', 'HasField', 'IsInitialized', 'ListFields', 'MergeFrom', 'MergeFromString', 'ParseFromString', 'RegisterExtension', 'SerializePartialToString', 'SerializeToString', 'SetInParent', 'UnknownFields', 'WhichOneof', '_CheckCalledFromGeneratedFile', '_ListFieldsItemKey', '_SetListener', '__class__', '__deepcopy__', '__delattr__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattribute__', '__getstate__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__setstate__', '__sizeof__', '__slots__', '__str__', '__subclasshook__', '__unicode__']
and
print(type(response.candidates[0].content.parts[0]._raw_part).__contains__)
raises
AttributeError: __contains__
Hi, I had the same issue and fixed it by creating a brand new conda env with a fresh pip install of vertexai; the library versions were then aligned and response.text worked. My assumption is that older versions of protobuf and other sub-libraries cause this issue, so everything should be on the latest version.
I'm not sure what's happening here. At least in @nikhil2406's case, it looks like the actual class of _raw_part is somehow a derivative of google.protobuf.Message instead of proto.Message. This should not be possible, since GAPIC always uses the proto-plus wrapper library. Also, proto-plus messages have always had a __contains__ method.
Here are the expected types:
part: vertexai.generative_models.Part
part._raw_part: google.cloud.aiplatform_v1beta1.types.content.Part (derived from proto.Message from the proto-plus library)
part._raw_part._pb: google.protobuf.Message
part._raw_part should never be an instance of google.protobuf.Message. I've tried to repro in many clean environments (including conda), but was never able to reproduce the issue.
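The expected wrapper chain can be sketched with stand-in classes (the names below are hypothetical illustrations; the real classes live in vertexai, proto-plus, and protobuf):

```python
class RawPb:
    """Stands in for the raw google.protobuf message (part._raw_part._pb)."""
    text = "hello"

class ProtoPlusPart:
    """Stands in for the proto-plus wrapper (part._raw_part)."""
    def __init__(self):
        self._pb = RawPb()
    def __contains__(self, key):
        # proto-plus reports membership for fields present on the raw message.
        return hasattr(self._pb, key)

class SdkPart:
    """Stands in for vertexai.generative_models.Part."""
    def __init__(self):
        self._raw_part = ProtoPlusPart()
    @property
    def text(self):
        # The SDK's .text property relies on the wrapper's __contains__.
        if "text" not in self._raw_part:
            raise AttributeError("text not set")
        return self._raw_part._pb.text

part = SdkPart()
print("text" in part._raw_part)  # True: the proto-plus layer answers membership
print(part.text)                 # hello: resolved through the full chain
```

The reported failures are consistent with the middle layer being skipped, so that _raw_part is the raw protobuf message (which has no __contains__) rather than the proto-plus wrapper.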
@parthea Do you have any idea what could be happening here?
@Ark-kun I also haven't been able to reproduce this issue. It sounds like it is specific to anaconda only. If anyone is able to share a Dockerfile
with a minimal reproduction, that would be a huge help!
I reproduced this issue on my workstation. I installed the SDK using pip3 install google-cloud-aiplatform. The root cause is that the protobuf C++ extension (pyext) shipped by Conda does not work with GAPIC. The short-term solution is to run the following commands to upgrade Conda's protobuf package:
conda uninstall protobuf
pip install --force-reinstall "protobuf>=4.25.3,<5.0"
or
conda install "libprotobuf>=4.25.3,<5.0"
pip install --force-reinstall "protobuf>=4.25.3,<5.0"
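After reinstalling, it may help to confirm which protobuf backend Python is actually using. A quick check (assuming protobuf is importable; the old conda C++ pyext is the suspected culprit, while 'upb' or 'python' should be fine):

```python
# Report the active protobuf implementation: 'upb', 'cpp', or 'python'.
try:
    from google.protobuf.internal import api_implementation
    print(api_implementation.Type())
except ImportError:
    print("protobuf is not installed")
```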
Can you please clarify whether you also see this problem on protobuf 4.x when you set the PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python environment variable?
Please can you clarify if you also see this problem on protobuf 4.x when you set the PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python environment variable?
Is "protobuf 4.x" a typo? protobuf 4.x already uses upb, and "text" in part works in 4.x. The error is only seen in protobuf 3.x.
When we execute the code from the test playground for Gemini in Vertex AI Studio, there is a type error. Actually, _raw_part is not a dict; it is of type Part, an extension of proto.Message. So the text property of the class Part is not available as a keyword... I have found no way to extract text from a streaming response other than using the repr function and trimming it.
Environment details
OS: macOS Sonoma 14.2
Python version: 3.11.5
pip version: 23.2.1
google-cloud-aiplatform version:
Steps to reproduce
gcloud
Code example
Stack trace