statmike / vertex-ai-mlops

Google Cloud Platform Vertex AI end-to-end workflows for machine learning operations
Apache License 2.0

Getting an error when creating a custom job from a local script for the 05a notebook #57

Open Jay2201 opened 6 months ago

Jay2201 commented 6 months ago

Hi Mike,

I am getting an error while creating a custom job from a local script in 05a - Vertex AI Custom Model - TensorFlow - Custom Job With Python File.

Getting this error:

  File "/root/.local/lib/python3.7/site-packages/google/cloud/aiplatform/compat/__init__.py", line 18, in <module>
    from google.cloud.aiplatform.compat import services
  File "/root/.local/lib/python3.7/site-packages/google/cloud/aiplatform/compat/services/__init__.py", line 18, in <module>
    from google.cloud.aiplatform_v1beta1.services.dataset_service import (
  File "/root/.local/lib/python3.7/site-packages/google/cloud/aiplatform_v1beta1/__init__.py", line 21, in <module>
    from .services.dataset_service import DatasetServiceClient
  File "/root/.local/lib/python3.7/site-packages/google/cloud/aiplatform_v1beta1/services/dataset_service/__init__.py", line 16, in <module>
    from .client import DatasetServiceClient
  File "/root/.local/lib/python3.7/site-packages/google/cloud/aiplatform_v1beta1/services/dataset_service/client.py", line 51, in <module>
    from google.cloud.aiplatform_v1beta1.services.dataset_service import pagers
  File "/root/.local/lib/python3.7/site-packages/google/cloud/aiplatform_v1beta1/services/dataset_service/pagers.py", line 27, in <module>
    from google.cloud.aiplatform_v1beta1.types import annotation
  File "/root/.local/lib/python3.7/site-packages/google/cloud/aiplatform_v1beta1/types/__init__.py", line 184, in <module>
    from .feature_online_store_admin_service import (
  File "/root/.local/lib/python3.7/site-packages/google/cloud/aiplatform_v1beta1/types/feature_online_store_admin_service.py", line 26, in <module>
    from google.cloud.aiplatform_v1beta1.types import (
  File "/root/.local/lib/python3.7/site-packages/google/cloud/aiplatform_v1beta1/types/feature_view_sync.py", line 24, in <module>
    from google.type import interval_pb2  # type: ignore
ImportError: cannot import name 'interval_pb2' from 'google.type' (/opt/conda/lib/python3.7/site-packages/google/type/__init__.py)

statmike commented 6 months ago

What version of google.cloud.aiplatform are you using?

from google.cloud import aiplatform
aiplatform.__version__
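
Since interval_pb2 normally ships with googleapis-common-protos, it may also help to check the related proto packages. A quick diagnostic sketch using pkg_resources (the same helper used later in the notebook):

from pkg_resources import get_distribution

# print versions of the packages involved in the import error;
# interval_pb2 is provided by googleapis-common-protos (google/type/)
for pkg in ['google-cloud-aiplatform', 'googleapis-common-protos', 'protobuf']:
    print(pkg, get_distribution(pkg).version)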
Jay2201 commented 6 months ago

I am using 1.36.4.

statmike commented 6 months ago

@Jay2201 I figured it out. Thank you so much for bringing this to my attention.

The training and serving containers need to be updated because the google.cloud.aiplatform package started requiring Python 3.10. I will make this update across the entire series of notebooks and push them as I test. If you want it to work quickly, change these two variables under setup:

TRAIN_IMAGE = 'us-docker.pkg.dev/vertex-ai/training/tf-cpu.2-11.py310:latest'
DEPLOY_IMAGE = 'us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-11:latest'

and add db-dtypes to the requirements list in the CustomJob call:

# note: pkg_resources must be imported for the protobuf version pin below
customJob = aiplatform.CustomJob.from_local_script(
    display_name = f'{SERIES}_{EXPERIMENT}_{TIMESTAMP}',
    script_path = SCRIPT_PATH,
    container_uri = TRAIN_IMAGE,
    args = CMDARGS,
    requirements = ['tensorflow_io', f'google-cloud-aiplatform>={aiplatform.__version__}', 'db-dtypes', f"protobuf=={pkg_resources.get_distribution('protobuf').version}"],
    replica_count = 1,
    machine_type = TRAIN_COMPUTE,
    accelerator_count = 0,
    base_output_dir = f"{URI}/models/{TIMESTAMP}",
    staging_bucket = f"{URI}/models/{TIMESTAMP}",
    labels = {'series' : f'{SERIES}', 'experiment' : f'{EXPERIMENT}', 'experiment_name' : f'{EXPERIMENT_NAME}', 'run_name' : f'{RUN_NAME}'}
)
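
Roughly, the downstream steps would then look something like this (a sketch; the artifact_uri path is an assumption based on the base_output_dir above and the default AIP_MODEL_DIR 'model' subdirectory your training script writes to):

# run the custom job (blocks until the training job completes)
customJob.run()

# upload the trained model for serving with the updated prediction image;
# artifact_uri assumes the script saved the model under base_output_dir/model
model = aiplatform.Model.upload(
    display_name = f'{SERIES}_{EXPERIMENT}_{TIMESTAMP}',
    serving_container_image_uri = DEPLOY_IMAGE,
    artifact_uri = f"{URI}/models/{TIMESTAMP}/model"
)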
Jay2201 commented 6 months ago

Yeah, this helped. Thanks a lot, Mike.

statmike commented 6 months ago

Thank you for confirming, @Jay2201. I am going to reopen this until I have completely addressed the impact across the 05 series. So far I have the 05 notebook and 05a-05f updated. I have also been making a number of other updates along the way to make the notebooks clearer and more helpful.