GoogleCloudPlatform / data-science-on-gcp

Source code accompanying book: Data Science on the Google Cloud Platform, Valliappa Lakshmanan, O'Reilly 2017
Apache License 2.0

Crash during Deployment of model on Vertex AI #154

Open shinchri opened 2 years ago

shinchri commented 2 years ago

In Chapter 9, in the "Deploy model to Vertex AI" section of flights_model_tf2.ipynb, the deployment crashes when you execute the first cell:

...
# upload model
gcloud beta ai models upload --region=$REGION --display-name=$MODEL_NAME \
     --container-image-uri=us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.${TF_VERSION}:latest \
     --artifact-uri=$EXPORT_PATH
MODEL_ID=$(gcloud ai models list --region=$REGION --format='value(MODEL_ID)' --filter=display_name=${MODEL_NAME})
echo "MODEL_ID=$MODEL_ID"

# deploy model to endpoint
gcloud ai endpoints deploy-model $ENDPOINT_ID \
  --region=$REGION \
  --model=$MODEL_ID \
  --display-name=$MODEL_NAME \
  --machine-type=n1-standard-2 \
  --min-replica-count=1 \
  --max-replica-count=1 \
  --traffic-split=0=100

When I check Vertex AI Endpoints, an endpoint was created, but something else seems to have gone wrong.

Output:

gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-222758/
Creating Endpoint for flights-20220803-223154
ENDPOINT_ID=974809417000157184
MODEL_ID=

followed by a very long error (it was too long, so I pasted only part of it):

Using endpoint [https://us-central1-aiplatform.googleapis.com/]
WARNING: The following filter keys were not present in any resource : display_name
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Waiting for operation [7706081518493368320]...
.....done.
Created Vertex AI endpoint: projects/591020730428/locations/us-central1/endpoints/974809417000157184.
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
ERROR: gcloud crashed (InvalidDataFromServerError): Error decoding response "{
  "models": [
    {
      "name": "projects/591020730428/locations/us-central1/models/1316788319564070912",
      "displayName": "flights-20220803-223002",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:30:12.377079Z",
      "updateTime": "2022-08-03T22:30:14.993220Z",
      "etag": "AMEw9yOIRZqfqqO_ngaA77Jw8Fs9E_kcI8tkqAIsTzFViX-aIrRbHfc0d2HRBihT32rp",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
...

If you would like to report this issue, please run the following command:
  gcloud feedback

To check gcloud for common problems, please run the following command:
  gcloud info --run-diagnostics
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
ERROR: (gcloud.ai.endpoints.deploy-model) could not parse resource []
---------------------------------------------------------------------------
CalledProcessError                        Traceback (most recent call last)
/tmp/ipykernel_1/3503756464.py in <module>
----> 1 get_ipython().run_cell_magic('bash', '', '# note TF_VERSION and ENDPOINT_NAME set in 1st cell\n# TF_VERSION=2-6\n# ENDPOINT_NAME=flights\n\nTIMESTAMP=$(date +%Y%m%d-%H%M%S)\nMODEL_NAME=${ENDPOINT_NAME}-${TIMESTAMP}\nEXPORT_PATH=$(gsutil ls ${OUTDIR}/export | tail -1)\necho $EXPORT_PATH\n\nif [[ $(gcloud ai endpoints list --region=$REGION \\\n        --format=\'value(DISPLAY_NAME)\' --filter=display_name=${ENDPOINT_NAME}) ]]; then\n    echo "Endpoint for $MODEL_NAME already exists"\nelse\n    # create model\n    echo "Creating Endpoint for $MODEL_NAME"\n    gcloud ai endpoints create --region=${REGION} --display-name=${ENDPOINT_NAME}\nfi\n\nENDPOINT_ID=$(gcloud ai endpoints list --region=$REGION \\\n              --format=\'value(ENDPOINT_ID)\' --filter=display_name=${ENDPOINT_NAME})\necho "ENDPOINT_ID=$ENDPOINT_ID"\n\n# delete any existing models with this name\nfor MODEL_ID in $(gcloud ai models list --region=$REGION --format=\'value(MODEL_ID)\' --filter=display_name=${MODEL_NAME}); do\n    echo "Deleting existing $MODEL_NAME ... $MODEL_ID "\n    gcloud ai models delete --region=$REGION $MODEL_ID\ndone\n\n# upload model\ngcloud beta ai models upload --region=$REGION --display-name=$MODEL_NAME \\\n     --container-image-uri=us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.${TF_VERSION}:latest \\\n     --artifact-uri=$EXPORT_PATH\nMODEL_ID=$(gcloud ai models list --region=$REGION --format=\'value(MODEL_ID)\' --filter=display_name=${MODEL_NAME})\necho "MODEL_ID=$MODEL_ID"\n\n# deploy model to endpoint\ngcloud ai endpoints deploy-model $ENDPOINT_ID \\\n  --region=$REGION \\\n  --model=$MODEL_ID \\\n  --display-name=$MODEL_NAME \\\n  --machine-type=n1-standard-2 \\\n  --min-replica-count=1 \\\n  --max-replica-count=1 \\\n  --traffic-split=0=100\n')

/opt/conda/lib/python3.7/site-packages/IPython/core/interactiveshell.py in run_cell_magic(self, magic_name, line, cell)
   2470             with self.builtin_trap:
   2471                 args = (magic_arg_s, cell)
-> 2472                 result = fn(*args, **kwargs)
   2473             return result
   2474 

/opt/conda/lib/python3.7/site-packages/IPython/core/magics/script.py in named_script_magic(line, cell)
    140             else:
    141                 line = script
--> 142             return self.shebang(line, cell)
    143 
    144         # write a basic docstring:

/opt/conda/lib/python3.7/site-packages/decorator.py in fun(*args, **kw)
    230             if not kwsyntax:
    231                 args, kw = fix(args, kw, sig)
--> 232             return caller(func, *(extras + args), **kw)
    233     fun.__name__ = func.__name__
    234     fun.__doc__ = func.__doc__

/opt/conda/lib/python3.7/site-packages/IPython/core/magic.py in <lambda>(f, *a, **k)
    185     # but it's overkill for just that one bit of state.
    186     def magic_deco(arg):
--> 187         call = lambda f, *a, **k: f(*a, **k)
    188 
    189         if callable(arg):

/opt/conda/lib/python3.7/site-packages/IPython/core/magics/script.py in shebang(self, line, cell)
    243             sys.stderr.flush()
    244         if args.raise_error and p.returncode!=0:
--> 245             raise CalledProcessError(p.returncode, cell, output=out, stderr=err)
    246 
    247     def _run_script(self, p, cell, to_close):

CalledProcessError: Command 'b'# note TF_VERSION and ENDPOINT_NAME set in 1st cell\n# TF_VERSION=2-6\n# ENDPOINT_NAME=flights\n\nTIMESTAMP=$(date +%Y%m%d-%H%M%S)\nMODEL_NAME=${ENDPOINT_NAME}-${TIMESTAMP}\nEXPORT_PATH=$(gsutil ls ${OUTDIR}/export | tail -1)\necho $EXPORT_PATH\n\nif [[ $(gcloud ai endpoints list --region=$REGION \\\n        --format=\'value(DISPLAY_NAME)\' --filter=display_name=${ENDPOINT_NAME}) ]]; then\n    echo "Endpoint for $MODEL_NAME already exists"\nelse\n    # create model\n    echo "Creating Endpoint for $MODEL_NAME"\n    gcloud ai endpoints create --region=${REGION} --display-name=${ENDPOINT_NAME}\nfi\n\nENDPOINT_ID=$(gcloud ai endpoints list --region=$REGION \\\n              --format=\'value(ENDPOINT_ID)\' --filter=display_name=${ENDPOINT_NAME})\necho "ENDPOINT_ID=$ENDPOINT_ID"\n\n# delete any existing models with this name\nfor MODEL_ID in $(gcloud ai models list --region=$REGION --format=\'value(MODEL_ID)\' --filter=display_name=${MODEL_NAME}); do\n    echo "Deleting existing $MODEL_NAME ... $MODEL_ID "\n    gcloud ai models delete --region=$REGION $MODEL_ID\ndone\n\n# upload model\ngcloud beta ai models upload --region=$REGION --display-name=$MODEL_NAME \\\n     --container-image-uri=us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.${TF_VERSION}:latest \\\n     --artifact-uri=$EXPORT_PATH\nMODEL_ID=$(gcloud ai models list --region=$REGION --format=\'value(MODEL_ID)\' --filter=display_name=${MODEL_NAME})\necho "MODEL_ID=$MODEL_ID"\n\n# deploy model to endpoint\ngcloud ai endpoints deploy-model $ENDPOINT_ID \\\n  --region=$REGION \\\n  --model=$MODEL_ID \\\n  --display-name=$MODEL_NAME \\\n  --machine-type=n1-standard-2 \\\n  --min-replica-count=1 \\\n  --max-replica-count=1 \\\n  --traffic-split=0=100\n'' returned non-zero exit status 1.
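Note that `MODEL_ID=` in the output above is empty, so `deploy-model` ends up being called with an empty `--model` argument, which matches the final `could not parse resource []` error. A minimal sketch of a more defensive lookup, assuming the underlying resource field names (`displayName`, `name`) rather than the list command's table column aliases:

```shell
# Sketch (assumption): look up the model via its resource fields and
# extract the numeric ID from the trailing segment of the full
# resource name, then fail fast instead of deploying with "".
NAME="projects/591020730428/locations/us-central1/models/1316788319564070912"
# e.g. NAME=$(gcloud ai models list --region=$REGION \
#        --filter="displayName=${MODEL_NAME}" --format="value(name)")
MODEL_ID="${NAME##*/}"   # keep only the ID after the last slash
if [ -z "$MODEL_ID" ]; then
    echo "No model found for ${MODEL_NAME}; aborting deploy" >&2
    exit 1
fi
echo "MODEL_ID=$MODEL_ID"
```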
lakshmanok commented 2 years ago

You seem to have a mismatch in TensorFlow versions between the notebook and the Docker image.

I see both 2-6 and 2-9 referenced in the error log. Could you check?

thanks, Lak

shinchri commented 2 years ago

@lakshmanok The 2-6 was part of a comment inside the bash cell. The version I am using is 2.9.0-rc2.

After os.environ['TF_VERSION']='2-' + tf.__version__[2:3], TF_VERSION is 2-9. I also tried hard-coding the version to "2-6" or "2.9.0-rc2", but neither worked.

I am only using one version, so I don't know where the other versions are coming from.
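For what it's worth, `tf.__version__[2:3]` keeps only a single character of the minor version, and the hard-coded `2.9.0-rc2` produces an image tag (`tf2-cpu.2.9.0-rc2`, visible in the log below) that likely doesn't correspond to any published prediction container. A sketch of deriving the `MAJOR-MINOR` tag more robustly (the `TF_FULL` value here is an assumed example):

```shell
# Sketch (assumption): turn a full TensorFlow version string into the
# MAJOR-MINOR form the prediction containers are tagged with,
# dropping the patch level and any pre-release suffix.
TF_FULL="2.9.0-rc2"   # e.g. $(python -c 'import tensorflow as tf; print(tf.__version__)')
TF_VERSION=$(echo "$TF_FULL" | cut -d. -f1-2 | tr . -)
echo "TF_VERSION=$TF_VERSION"
```

Unlike the `[2:3]` slice, this also handles two-digit minor versions such as 2.10.x.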

The complete error message is:

Using endpoint [https://us-central1-aiplatform.googleapis.com/]
WARNING: The following filter keys were not present in any resource : display_name
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Waiting for operation [476678216656879616]...
.....done.
Created Vertex AI endpoint: projects/591020730428/locations/us-central1/endpoints/2473382193007689728.
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
ERROR: gcloud crashed (InvalidDataFromServerError): Error decoding response "{
  "models": [
    {
      "name": "projects/591020730428/locations/us-central1/models/9103512075287658496",
      "displayName": "flights-20220804-012641",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2.9.0-rc2:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-04T01:26:50.402637Z",
      "updateTime": "2022-08-04T01:26:52.926706Z",
      "etag": "AMEw9yM4rd6hO-VtoplqIeRMGz4t2Qa_JhGFUrlvxwDoGkPy8LWau6DkyWWf0VBAms9Y",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-012017/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-04T01:26:50.402637Z",
      "versionUpdateTime": "2022-08-04T01:26:52.926706Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/842784458783326208",
      "displayName": "flights-20220804-012530",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-6:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES",
        "SHARED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-04T01:25:39.057762Z",
      "updateTime": "2022-08-04T01:25:41.645650Z",
      "etag": "AMEw9yOJYffsNNc6kJ8mW3bmtVQhhrTYoYJIngsLlvxhdhjclPNdo-cDTJyrsPqI0W0B",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-012017/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-04T01:25:39.057762Z",
      "versionUpdateTime": "2022-08-04T01:25:41.645650Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/8432475730809454592",
      "displayName": "flights-20220804-012417",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-6:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES",
        "SHARED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-04T01:24:26.339107Z",
      "updateTime": "2022-08-04T01:24:28.949204Z",
      "etag": "AMEw9yOBQ4swa06T9UIAMDEO9iI7SKKcSi26CgPQ3pjBh8CHgd36cNk1OZD1-gg0MUY1",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-012017/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-04T01:24:26.339107Z",
      "versionUpdateTime": "2022-08-04T01:24:28.949204Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/8810778099508576256",
      "displayName": "flights-20220804-012255",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-6:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES",
        "SHARED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-04T01:23:04.426138Z",
      "updateTime": "2022-08-04T01:23:06.954014Z",
      "etag": "AMEw9yPd43l0ruUmuukBzwK4j8mfL312w_QfK_XeQQCj77lwK3xmApR3i0RRYxT74pH9",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-012017/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-04T01:23:04.426138Z",
      "versionUpdateTime": "2022-08-04T01:23:06.954014Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/3820789712382066688",
      "displayName": "flights-20220804-012133",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-6:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES",
        "SHARED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-04T01:21:39.996004Z",
      "updateTime": "2022-08-04T01:21:42.602720Z",
      "etag": "AMEw9yO2F-RCkQUJE4gkcZ6EYkiHday9IsNsBJEl7yCpumOoN2nj4M0RURvD7dFgFg9y",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-012017/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-04T01:21:39.996004Z",
      "versionUpdateTime": "2022-08-04T01:21:42.602720Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/738075767446962176",
      "displayName": "flights-20220804-012101",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-04T01:21:08.743501Z",
      "updateTime": "2022-08-04T01:21:11.421906Z",
      "etag": "AMEw9yNtBvF2fcwdOSdeqc9D1iPWHKikNFJMgQaDaicyW_YlxcbQodkjobhyRVubZF0E",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-012017/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-04T01:21:08.743501Z",
      "versionUpdateTime": "2022-08-04T01:21:11.421906Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/3622631328777764864",
      "displayName": "flights-20220803-223154",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:32:04.481570Z",
      "updateTime": "2022-08-03T22:32:07.084890Z",
      "etag": "AMEw9yM1z5M_3p1RJ3E-Xdm0q8zAALfdkLk0xr8ydSFn_TaVEFb3bCcERIXpKu1S6Ag=",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-222758/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:32:04.481570Z",
      "versionUpdateTime": "2022-08-03T22:32:07.084890Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/1316788319564070912",
      "displayName": "flights-20220803-223002",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:30:12.377079Z",
      "updateTime": "2022-08-03T22:30:14.993220Z",
      "etag": "AMEw9yOMfSljOWQw5aAZ7WKrLwmFQDgLqkiBBoMXg4s3MyG_5x9Qnv4BhoB5sM3dAxjy",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-222758/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:30:12.377079Z",
      "versionUpdateTime": "2022-08-03T22:30:14.993220Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/4988910865731289088",
      "displayName": "flights-20220803-222904",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:29:11.348035Z",
      "updateTime": "2022-08-03T22:29:13.961294Z",
      "etag": "AMEw9yMsc4_EAziRd8lPcZyeor2W90ShMn0WfFLP4_Axane_7HhdLArIKJGN6kKAdZdD",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-222758/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:29:11.348035Z",
      "versionUpdateTime": "2022-08-03T22:29:13.961294Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/4445664160679723008",
      "displayName": "flights-20220803-222609",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:26:16.737076Z",
      "updateTime": "2022-08-03T22:26:19.654852Z",
      "etag": "AMEw9yO6zVO9G5Vo8QM5n3Ao3s3n6qCCTklmLWlSIwgFOaD-PXqGbfoi5h1DodIlAlox",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-212637/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:26:16.737076Z",
      "versionUpdateTime": "2022-08-03T22:26:19.654852Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/7088151242039361536",
      "displayName": "flights-20220803-222539",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:25:47.111845Z",
      "updateTime": "2022-08-03T22:25:49.675406Z",
      "etag": "AMEw9yOM6BuUFV_OLwkVdELeNw3i1HHVIMB0nqKJ0ZZV6P9xbpkiGy-xbYNQ4B87l5Ys",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-212637/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:25:47.111845Z",
      "versionUpdateTime": "2022-08-03T22:25:49.675406Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/377224847303901184",
      "displayName": "flights-20220803-222459",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:25:07.431618Z",
      "updateTime": "2022-08-03T22:25:10.129002Z",
      "etag": "AMEw9yNW87y8hoSOnoReSw0BiZoaUAIv7tp1ph1d1MRd3vAUQkLsIbR77bmTAOchw_U=",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-212637/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:25:07.431618Z",
      "versionUpdateTime": "2022-08-03T22:25:10.129002Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/3915365304556847104",
      "displayName": "flights-20220803-222330",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-6:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES",
        "SHARED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:23:37.884370Z",
      "updateTime": "2022-08-03T22:23:40.524363Z",
      "etag": "AMEw9yMVm_CE9bftw-qceZk8LeHaNhdLx-cgTA5PjZ3e07_BFqwRwgPd3ty_3xlnE82v",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-212637/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:23:37.884370Z",
      "versionUpdateTime": "2022-08-03T22:23:40.524363Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/6414863097747472384",
      "displayName": "flights-20220803-221346",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:14:03.490710Z",
      "updateTime": "2022-08-03T22:14:05.659596Z",
      "etag": "AMEw9yPFHHccVFPzhIe7ZluZ7E39sSJF4ViqODAl1jpewlOLy-UJjQcFTIbNk3yiJtmT",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-212637/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:14:03.490710Z",
      "versionUpdateTime": "2022-08-03T22:14:05.659596Z"
    }
  ]
}
" as type GoogleCloudAiplatformV1ListModelsResponse: Repeated values for field supportedDeploymentResourcesTypes may not be None

If you would like to report this issue, please run the following command:
  gcloud feedback

To check gcloud for common problems, please run the following command:
  gcloud info --run-diagnostics
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Waiting for operation [4491637284457676800]...
.................done.
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
ERROR: gcloud crashed (InvalidDataFromServerError): Error decoding response "{
  "models": [
    {
      "name": "projects/591020730428/locations/us-central1/models/96312820546666496",
      "displayName": "flights-20220804-013301",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-04T01:33:10.171939Z",
      "updateTime": "2022-08-04T01:33:12.776819Z",
      "etag": "AMEw9yOCbuBK63h94z4a6QxtkSxu8bUmvfYbplO6UrrL58i1_DsVtLqEQIUFFo9-_Ghi",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-013242/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-04T01:33:10.171939Z",
      "versionUpdateTime": "2022-08-04T01:33:12.776819Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/9103512075287658496",
      "displayName": "flights-20220804-012641",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2.9.0-rc2:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-04T01:26:50.402637Z",
      "updateTime": "2022-08-04T01:26:52.926706Z",
      "etag": "AMEw9yOjYl7oco3RqtEusWtpce6--5bI4DAsHRdIUbpcPhvTmF58PhSFiHMzUKXU609g",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-012017/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-04T01:26:50.402637Z",
      "versionUpdateTime": "2022-08-04T01:26:52.926706Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/842784458783326208",
      "displayName": "flights-20220804-012530",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-6:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES",
        "SHARED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-04T01:25:39.057762Z",
      "updateTime": "2022-08-04T01:25:41.645650Z",
      "etag": "AMEw9yMuhhglqjf7bs-WXLBDQKQaLq0Xk8rURA2WEWaB1TcHxJKgmGVJ3Kv6kidotPp7",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-012017/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-04T01:25:39.057762Z",
      "versionUpdateTime": "2022-08-04T01:25:41.645650Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/8432475730809454592",
      "displayName": "flights-20220804-012417",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-6:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES",
        "SHARED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-04T01:24:26.339107Z",
      "updateTime": "2022-08-04T01:24:28.949204Z",
      "etag": "AMEw9yNVmNnvgY3isGkEb-vY66FKRPB9sRusJxvqPiu_5lBVJOtsmvlpJAOAPm8pSs2r",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-012017/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-04T01:24:26.339107Z",
      "versionUpdateTime": "2022-08-04T01:24:28.949204Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/8810778099508576256",
      "displayName": "flights-20220804-012255",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-6:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES",
        "SHARED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-04T01:23:04.426138Z",
      "updateTime": "2022-08-04T01:23:06.954014Z",
      "etag": "AMEw9yPOlQQsPbSDqF2hR4FQJ0KTR4iqsrCgvT9LJN3MWwlJo4ACSzoC4Mkmjoj1ZpaC",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-012017/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-04T01:23:04.426138Z",
      "versionUpdateTime": "2022-08-04T01:23:06.954014Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/3820789712382066688",
      "displayName": "flights-20220804-012133",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-6:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES",
        "SHARED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-04T01:21:39.996004Z",
      "updateTime": "2022-08-04T01:21:42.602720Z",
      "etag": "AMEw9yNd5w4pV05qq0hymVKu4QY3R6i3tyLvtEgAfgR9hWUJ3qQEpTeoyDMJN_5k37oK",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-012017/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-04T01:21:39.996004Z",
      "versionUpdateTime": "2022-08-04T01:21:42.602720Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/738075767446962176",
      "displayName": "flights-20220804-012101",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-04T01:21:08.743501Z",
      "updateTime": "2022-08-04T01:21:11.421906Z",
      "etag": "AMEw9yOIDT_twcwlEAk9SJ_uDC44JhNzYTiu6e-Mg0gOimzUpknc6kWBmqxzAhW4-7E7",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-012017/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-04T01:21:08.743501Z",
      "versionUpdateTime": "2022-08-04T01:21:11.421906Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/3622631328777764864",
      "displayName": "flights-20220803-223154",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:32:04.481570Z",
      "updateTime": "2022-08-03T22:32:07.084890Z",
      "etag": "AMEw9yN62Xe0wnLagCf7fTaHzG6rbo8EKpB0lIqKO3JVzlDJN1rEb9zLH4eHsp-0lJk=",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-222758/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:32:04.481570Z",
      "versionUpdateTime": "2022-08-03T22:32:07.084890Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/1316788319564070912",
      "displayName": "flights-20220803-223002",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:30:12.377079Z",
      "updateTime": "2022-08-03T22:30:14.993220Z",
      "etag": "AMEw9yO6ubkb9WbTSAi1K1DfBMVHAt-pKJ8UCkSij-gW8ExrWsKLHhaokVD7dZV07U2E",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-222758/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:30:12.377079Z",
      "versionUpdateTime": "2022-08-03T22:30:14.993220Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/4988910865731289088",
      "displayName": "flights-20220803-222904",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:29:11.348035Z",
      "updateTime": "2022-08-03T22:29:13.961294Z",
      "etag": "AMEw9yMD4O81h3HVrMUSzoRku4BaN8Td8EDwUBsp9bGZUu9H9vb_Aae7QUrV99vinWTT",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-222758/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:29:11.348035Z",
      "versionUpdateTime": "2022-08-03T22:29:13.961294Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/4445664160679723008",
      "displayName": "flights-20220803-222609",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:26:16.737076Z",
      "updateTime": "2022-08-03T22:26:19.654852Z",
      "etag": "AMEw9yMxqTeQvMxLe8ADkN61tZHxrecnyOQZq5rFSRQ-MPMw7hL8xXuSUZn6biiFb7If",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-212637/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:26:16.737076Z",
      "versionUpdateTime": "2022-08-03T22:26:19.654852Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/7088151242039361536",
      "displayName": "flights-20220803-222539",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:25:47.111845Z",
      "updateTime": "2022-08-03T22:25:49.675406Z",
      "etag": "AMEw9yNCj4O7ymtGrFBkBNmY6hNQ5aFLH_jRELpdtPM68eyYaKXaeSBztEkCK5XoDKcO",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-212637/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:25:47.111845Z",
      "versionUpdateTime": "2022-08-03T22:25:49.675406Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/377224847303901184",
      "displayName": "flights-20220803-222459",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:25:07.431618Z",
      "updateTime": "2022-08-03T22:25:10.129002Z",
      "etag": "AMEw9yMpqzewL95pvf4uUXqfV1C2Tyzg-lI-0CS1l23dnXGgCuCA7zL_hgJ2TigeIWQ=",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-212637/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:25:07.431618Z",
      "versionUpdateTime": "2022-08-03T22:25:10.129002Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/3915365304556847104",
      "displayName": "flights-20220803-222330",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-6:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES",
        "SHARED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:23:37.884370Z",
      "updateTime": "2022-08-03T22:23:40.524363Z",
      "etag": "AMEw9yOPNa9CqLX8zmkSrxSuUb14pUCiQwXFbEqWCjEfrwUJ-IJfmgKw-HU4e7iiN7Av",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-212637/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:23:37.884370Z",
      "versionUpdateTime": "2022-08-03T22:23:40.524363Z"
    },
    {
      "name": "projects/591020730428/locations/us-central1/models/6414863097747472384",
      "displayName": "flights-20220803-221346",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-03T22:14:03.490710Z",
      "updateTime": "2022-08-03T22:14:05.659596Z",
      "etag": "AMEw9yOAGZ7o-sld7iqDg1utX90aP0nOV96Eoz3J60GAFoUO_JorDGQ3eg9Iur6swTJe",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220803-212637/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-03T22:14:03.490710Z",
      "versionUpdateTime": "2022-08-03T22:14:05.659596Z"
    }
  ]
}
" as type GoogleCloudAiplatformV1ListModelsResponse: Repeated values for field supportedDeploymentResourcesTypes may not be None

If you would like to report this issue, please run the following command:
  gcloud feedback

To check gcloud for common problems, please run the following command:
  gcloud info --run-diagnostics
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
ERROR: (gcloud.ai.endpoints.deploy-model) could not parse resource []
---------------------------------------------------------------------------
CalledProcessError                        Traceback (most recent call last)
/tmp/ipykernel_1/2211222481.py in 
----> 1 get_ipython().run_cell_magic('bash', '', '\nTIMESTAMP=$(date +%Y%m%d-%H%M%S)\nMODEL_NAME=${ENDPOINT_NAME}-${TIMESTAMP}\nEXPORT_PATH=$(gsutil ls ${OUTDIR}/export | tail -1)\necho $EXPORT_PATH\n\nif [[ $(gcloud ai endpoints list --region=$REGION \\\n        --format=\'value(DISPLAY_NAME)\' --filter=display_name=${ENDPOINT_NAME}) ]]; then\n    echo "Endpoint for $MODEL_NAME already exists"\nelse\n    # create model\n    echo "Creating Endpoint for $MODEL_NAME"\n    gcloud ai endpoints create --region=${REGION} --display-name=${ENDPOINT_NAME}\nfi\n\nENDPOINT_ID=$(gcloud ai endpoints list --region=$REGION \\\n              --format=\'value(ENDPOINT_ID)\' --filter=display_name=${ENDPOINT_NAME})\necho "ENDPOINT_ID=$ENDPOINT_ID"\n\n# delete any existing models with this name\nfor MODEL_ID in $(gcloud ai models list --region=$REGION --format=\'value(MODEL_ID)\' --filter=display_name=${MODEL_NAME}); do\n    echo "Deleting existing $MODEL_NAME ... $MODEL_ID "\n    gcloud ai models delete --region=$REGION $MODEL_ID\ndone\n\n# upload model\ngcloud beta ai models upload --region=$REGION --display-name=$MODEL_NAME \\\n     --container-image-uri=us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.${TF_VERSION}:latest \\\n     --artifact-uri=$EXPORT_PATH\nMODEL_ID=$(gcloud ai models list --region=$REGION --format=\'value(MODEL_ID)\' --filter=display_name=${MODEL_NAME})\necho "MODEL_ID=$MODEL_ID"\n\n# deploy model to endpoint\ngcloud ai endpoints deploy-model $ENDPOINT_ID \\\n  --region=$REGION \\\n  --model=$MODEL_ID \\\n  --display-name=$MODEL_NAME \\\n  --machine-type=n1-standard-2 \\\n  --min-replica-count=1 \\\n  --max-replica-count=1 \\\n  --traffic-split=0=100\n')

/opt/conda/lib/python3.7/site-packages/IPython/core/interactiveshell.py in run_cell_magic(self, magic_name, line, cell)
   2470             with self.builtin_trap:
   2471                 args = (magic_arg_s, cell)
-> 2472                 result = fn(*args, **kwargs)
   2473             return result
   2474 

/opt/conda/lib/python3.7/site-packages/IPython/core/magics/script.py in named_script_magic(line, cell)
    140             else:
    141                 line = script
--> 142             return self.shebang(line, cell)
    143 
    144         # write a basic docstring:

/opt/conda/lib/python3.7/site-packages/decorator.py in fun(*args, **kw)
    230             if not kwsyntax:
    231                 args, kw = fix(args, kw, sig)
--> 232             return caller(func, *(extras + args), **kw)
    233     fun.__name__ = func.__name__
    234     fun.__doc__ = func.__doc__

/opt/conda/lib/python3.7/site-packages/IPython/core/magic.py in (f, *a, **k)
    185     # but it's overkill for just that one bit of state.
    186     def magic_deco(arg):
--> 187         call = lambda f, *a, **k: f(*a, **k)
    188 
    189         if callable(arg):

/opt/conda/lib/python3.7/site-packages/IPython/core/magics/script.py in shebang(self, line, cell)
    243             sys.stderr.flush()
    244         if args.raise_error and p.returncode!=0:
--> 245             raise CalledProcessError(p.returncode, cell, output=out, stderr=err)
    246 
    247     def _run_script(self, p, cell, to_close):

CalledProcessError: Command 'b'\nTIMESTAMP=$(date +%Y%m%d-%H%M%S)\nMODEL_NAME=${ENDPOINT_NAME}-${TIMESTAMP}\nEXPORT_PATH=$(gsutil ls ${OUTDIR}/export | tail -1)\necho $EXPORT_PATH\n\nif [[ $(gcloud ai endpoints list --region=$REGION \\\n        --format=\'value(DISPLAY_NAME)\' --filter=display_name=${ENDPOINT_NAME}) ]]; then\n    echo "Endpoint for $MODEL_NAME already exists"\nelse\n    # create model\n    echo "Creating Endpoint for $MODEL_NAME"\n    gcloud ai endpoints create --region=${REGION} --display-name=${ENDPOINT_NAME}\nfi\n\nENDPOINT_ID=$(gcloud ai endpoints list --region=$REGION \\\n              --format=\'value(ENDPOINT_ID)\' --filter=display_name=${ENDPOINT_NAME})\necho "ENDPOINT_ID=$ENDPOINT_ID"\n\n# delete any existing models with this name\nfor MODEL_ID in $(gcloud ai models list --region=$REGION --format=\'value(MODEL_ID)\' --filter=display_name=${MODEL_NAME}); do\n    echo "Deleting existing $MODEL_NAME ... $MODEL_ID "\n    gcloud ai models delete --region=$REGION $MODEL_ID\ndone\n\n# upload model\ngcloud beta ai models upload --region=$REGION --display-name=$MODEL_NAME \\\n     --container-image-uri=us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.${TF_VERSION}:latest \\\n     --artifact-uri=$EXPORT_PATH\nMODEL_ID=$(gcloud ai models list --region=$REGION --format=\'value(MODEL_ID)\' --filter=display_name=${MODEL_NAME})\necho "MODEL_ID=$MODEL_ID"\n\n# deploy model to endpoint\ngcloud ai endpoints deploy-model $ENDPOINT_ID \\\n  --region=$REGION \\\n  --model=$MODEL_ID \\\n  --display-name=$MODEL_NAME \\\n  --machine-type=n1-standard-2 \\\n  --min-replica-count=1 \\\n  --max-replica-count=1 \\\n  --traffic-split=0=100\n'' returned non-zero exit status 1.

shinchri commented 2 years ago

@lakshmanok I found out why I kept getting different versions: the code that deletes any existing models doesn't actually work. The lookup `$(gcloud ai models list --region=$REGION --format='value(MODEL_ID)' --filter=display_name=${MODEL_NAME})` doesn't return anything, so nothing gets deleted, and I had to manually delete the registered models (and there were many).
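For what it's worth, the empty result may come down to field names: in the JSON dumps above, each model carries `name` and `displayName` keys (there is no `MODEL_ID` field), and the log even warns `filter keys were not present in any resource : display_name`. Below is a minimal sketch of a cleanup loop written against those keys instead. The region and display name are placeholders, and I have not verified this against every gcloud release:

```shell
#!/bin/sh
# Hypothetical cleanup loop. Assumptions: the List response fields are
# "name" (full resource path) and "displayName", as seen in the error dump;
# REGION and MODEL_NAME are placeholders for the notebook's values.
REGION="us-central1"
MODEL_NAME="flights-20220804-012101"

# Build the list command with the filter quoted so the shell passes it intact.
LIST_CMD="gcloud ai models list --region=${REGION} --filter=\"displayName=${MODEL_NAME}\" --format=\"value(name)\""
echo "$LIST_CMD"

# The actual loop (commented out so the sketch runs without gcloud installed):
# for MODEL in $(eval "$LIST_CMD"); do
#   echo "Deleting existing $MODEL_NAME ... $MODEL"
#   gcloud ai models delete --region="$REGION" --quiet "$MODEL"
# done
```

Deleting by the full resource `name` also sidesteps the question of what the `MODEL_ID` pseudo-column maps to in a given gcloud version.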

Once I cleaned those up, I stopped getting multiple models, but the deployment still fails. Here is the error output I get now (I tried both 2-6 and 2-9; both give the same error message, just with a different image version):

Using endpoint [https://us-central1-aiplatform.googleapis.com/]
WARNING: The following filter keys were not present in any resource : display_name
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Waiting for operation [1000784623292121088]...
.....done.
Created Vertex AI endpoint: projects/591020730428/locations/us-central1/endpoints/7596789719095050240.
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
WARNING: The following filter keys were not present in any resource : display_name
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Waiting for operation [5612470641719508992]...
....................done.
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
ERROR: (gcloud.ai.endpoints.deploy-model) INVALID_ARGUMENT: Invalid image "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest" for deployment. Please use a Model with a valid image.
---------------------------------------------------------------------------
CalledProcessError                        Traceback (most recent call last)
/tmp/ipykernel_1/1002185504.py in <module>
----> 1 get_ipython().run_cell_magic('bash', '', '\nTIMESTAMP=$(date +%Y%m%d-%H%M%S)\nMODEL_NAME=${ENDPOINT_NAME}-${TIMESTAMP}\nEXPORT_PATH=$(gsutil ls ${OUTDIR}/export | tail -1)\necho $EXPORT_PATH\n\nif [[ $(gcloud ai endpoints list --region=$REGION \\\n        --format=\'value(DISPLAY_NAME)\' --filter=display_name=${ENDPOINT_NAME}) ]]; then\n    echo "Endpoint for $MODEL_NAME already exists"\nelse\n    # create model\n    echo "Creating Endpoint for $MODEL_NAME"\n    gcloud ai endpoints create --region=${REGION} --display-name=${ENDPOINT_NAME}\nfi\n\nENDPOINT_ID=$(gcloud ai endpoints list --region=$REGION \\\n              --format=\'value(ENDPOINT_ID)\' --filter=display_name=${ENDPOINT_NAME})\necho "ENDPOINT_ID=$ENDPOINT_ID"\n\n# delete any existing models with this name\nfor MODEL_ID in $(gcloud ai models list --region=$REGION --format=\'value(MODEL_ID)\' --filter=display_name=${MODEL_NAME}); do\n    echo "Deleting existing $MODEL_NAME ... $MODEL_ID "\n    gcloud ai models delete --region=$REGION $MODEL_ID\ndone\n\n# upload model\ngcloud beta ai models upload --region=$REGION --display-name=$MODEL_NAME \\\n     --container-image-uri=us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.${TF_VERSION}:latest \\\n     --artifact-uri=$EXPORT_PATH\nMODEL_ID=$(gcloud ai models list --region=$REGION --format=\'value(MODEL_ID)\' --filter=display_name=${MODEL_NAME})\necho "MODEL_ID=$MODEL_ID"\n\n# deploy model to endpoint\ngcloud ai endpoints deploy-model $ENDPOINT_ID \\\n  --region=$REGION \\\n  --model=$MODEL_ID \\\n  --display-name=$MODEL_NAME \\\n  --machine-type=n1-standard-2 \\\n  --min-replica-count=1 \\\n  --max-replica-count=1 \\\n  --traffic-split=0=100\n\n')

/opt/conda/lib/python3.7/site-packages/IPython/core/interactiveshell.py in run_cell_magic(self, magic_name, line, cell)
   2470             with self.builtin_trap:
   2471                 args = (magic_arg_s, cell)
-> 2472                 result = fn(*args, **kwargs)
   2473             return result
   2474 

/opt/conda/lib/python3.7/site-packages/IPython/core/magics/script.py in named_script_magic(line, cell)
    140             else:
    141                 line = script
--> 142             return self.shebang(line, cell)
    143 
    144         # write a basic docstring:

/opt/conda/lib/python3.7/site-packages/decorator.py in fun(*args, **kw)
    230             if not kwsyntax:
    231                 args, kw = fix(args, kw, sig)
--> 232             return caller(func, *(extras + args), **kw)
    233     fun.__name__ = func.__name__
    234     fun.__doc__ = func.__doc__

/opt/conda/lib/python3.7/site-packages/IPython/core/magic.py in <lambda>(f, *a, **k)
    185     # but it's overkill for just that one bit of state.
    186     def magic_deco(arg):
--> 187         call = lambda f, *a, **k: f(*a, **k)
    188 
    189         if callable(arg):

/opt/conda/lib/python3.7/site-packages/IPython/core/magics/script.py in shebang(self, line, cell)
    243             sys.stderr.flush()
    244         if args.raise_error and p.returncode!=0:
--> 245             raise CalledProcessError(p.returncode, cell, output=out, stderr=err)
    246 
    247     def _run_script(self, p, cell, to_close):

CalledProcessError: Command 'b'\nTIMESTAMP=$(date +%Y%m%d-%H%M%S)\nMODEL_NAME=${ENDPOINT_NAME}-${TIMESTAMP}\nEXPORT_PATH=$(gsutil ls ${OUTDIR}/export | tail -1)\necho $EXPORT_PATH\n\nif [[ $(gcloud ai endpoints list --region=$REGION \\\n        --format=\'value(DISPLAY_NAME)\' --filter=display_name=${ENDPOINT_NAME}) ]]; then\n    echo "Endpoint for $MODEL_NAME already exists"\nelse\n    # create model\n    echo "Creating Endpoint for $MODEL_NAME"\n    gcloud ai endpoints create --region=${REGION} --display-name=${ENDPOINT_NAME}\nfi\n\nENDPOINT_ID=$(gcloud ai endpoints list --region=$REGION \\\n              --format=\'value(ENDPOINT_ID)\' --filter=display_name=${ENDPOINT_NAME})\necho "ENDPOINT_ID=$ENDPOINT_ID"\n\n# delete any existing models with this name\nfor MODEL_ID in $(gcloud ai models list --region=$REGION --format=\'value(MODEL_ID)\' --filter=display_name=${MODEL_NAME}); do\n    echo "Deleting existing $MODEL_NAME ... $MODEL_ID "\n    gcloud ai models delete --region=$REGION $MODEL_ID\ndone\n\n# upload model\ngcloud beta ai models upload --region=$REGION --display-name=$MODEL_NAME \\\n     --container-image-uri=us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.${TF_VERSION}:latest \\\n     --artifact-uri=$EXPORT_PATH\nMODEL_ID=$(gcloud ai models list --region=$REGION --format=\'value(MODEL_ID)\' --filter=display_name=${MODEL_NAME})\necho "MODEL_ID=$MODEL_ID"\n\n# deploy model to endpoint\ngcloud ai endpoints deploy-model $ENDPOINT_ID \\\n  --region=$REGION \\\n  --model=$MODEL_ID \\\n  --display-name=$MODEL_NAME \\\n  --machine-type=n1-standard-2 \\\n  --min-replica-count=1 \\\n  --max-replica-count=1 \\\n  --traffic-split=0=100\n\n'' returned non-zero exit status 1.

The problem seems to be the Docker image, but I'm not sure where the image is coming from.

Thanks, Chris

lakshmanok commented 2 years ago

These are the supported Docker images. Is the 2.9 image name that the script generates correct?

thanks, Lak

On Wed, Aug 3, 2022, 7:35 PM Chris Shin @.***> wrote:

@lakshmanok I found out why I kept getting different versions. For some reason the code to delete any existing models doesn't work, so I had to manually delete the models that were registered (and there were many).

Once I fixed that I stopped getting multiple models, but I still had a problem. Here is the error that I get now (I tried both 2-6 and 2-9; they give the same error message, just with a different version):

Using endpoint [https://us-central1-aiplatform.googleapis.com/]
WARNING: The following filter keys were not present in any resource : display_name
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Waiting for operation [1000784623292121088]...
.....done.
Created Vertex AI endpoint: projects/591020730428/locations/us-central1/endpoints/7596789719095050240.
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
WARNING: The following filter keys were not present in any resource : display_name
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Waiting for operation [5612470641719508992]...
....................done.
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
ERROR: (gcloud.ai.endpoints.deploy-model) INVALID_ARGUMENT: Invalid image "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest" for deployment. Please use a Model with a valid image.

This was followed by the same CalledProcessError traceback as in the original report above.

shinchri commented 2 years ago

@lakshmanok Do you mean the model?

The name of the Model is "flights-20220804-023315" and the endpoint is named "flights".

I tried to deploy by going to the Model Registry within Vertex AI and manually deploying the model, but it still returns the same error (this one actually uses a different image, but it still doesn't work):

Failed to deploy model "flights-20220804-023315" to endpoint "flights".
Invalid image "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-9:latest" for deployment. Please use a Model with a valid image.

Response code: 400
Status: INVALID_ARGUMENT

Tracking number: c5109049753637665
lakshmanok commented 2 years ago

Chris,

Please see this page for tensorflow versions that are supported by vertex AI prediction:

https://cloud.google.com/vertex-ai/docs/predictions/pre-built-containers

Notice that 2.9 is not (yet?) supported. This is usually a matter of time. Perhaps a couple of weeks.

For now, the way to resolve this problem is to train and deploy using 2.8 or earlier. You can do this by running this chapter in a Vertex AI notebook that is built for TensorFlow 2.8.

thanks, Lak


shinchri commented 2 years ago

@lakshmanok I tried using TensorFlow 2.8, but it still seems to cause an issue.

This is what I tried: instead of using the TensorFlow 2 (Local) kernel, I changed it to Python (Local), since TensorFlow 2 only seems to be available as version 2.9.

At the very top of the notebook I install TensorFlow 2.8 using !pip install "tensorflow>=2.8,<2.9", which downloads 2.8.0.
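Since the container image URI later in the notebook is built from TF_VERSION, it is worth confirming that the tag matches the pip-installed version. A minimal sketch of the relationship (PIP_TF_VERSION here is a hypothetical stand-in for the version you installed; the notebook sets TF_VERSION elsewhere):

```shell
# Convert a pip-style version like "2.8.0" into the "major-minor" form that the
# pre-built prediction container tags use (e.g. tf2-cpu.2-8).
PIP_TF_VERSION="2.8.0"
TF_VERSION=$(echo "${PIP_TF_VERSION}" | awk -F. '{print $1 "-" $2}')
echo "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.${TF_VERSION}:latest"
```

If the echoed URI does not match a tag on the pre-built containers page, the deploy step will reject it as an invalid image.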

Everything works until I get to this part:

%%bash

TIMESTAMP=$(date +%Y%m%d-%H%M%S)
MODEL_NAME=${ENDPOINT_NAME}-${TIMESTAMP}
EXPORT_PATH=$(gsutil ls ${OUTDIR}/export | tail -1)
echo $EXPORT_PATH

if [[ $(gcloud ai endpoints list --region=$REGION \
        --format='value(DISPLAY_NAME)' --filter=display_name=${ENDPOINT_NAME}) ]]; then
    echo "Endpoint for $MODEL_NAME already exists"
else
    # create model
    echo "Creating Endpoint for $MODEL_NAME"
    gcloud ai endpoints create --region=${REGION} --display-name=${ENDPOINT_NAME}
fi

ENDPOINT_ID=$(gcloud ai endpoints list --region=$REGION \
              --format='value(ENDPOINT_ID)' --filter=display_name=${ENDPOINT_NAME})
echo "ENDPOINT_ID=$ENDPOINT_ID"

# delete any existing models with this name
for MODEL_ID in $(gcloud ai models list --region=$REGION --format='value(MODEL_ID)' --filter=display_name=${MODEL_NAME}); do
    echo "Deleting existing $MODEL_NAME ... $MODEL_ID "
    gcloud ai models delete --region=$REGION $MODEL_ID
done

# upload model
gcloud beta ai models upload --region=$REGION --display-name=$MODEL_NAME \
     --container-image-uri=us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.${TF_VERSION}:latest \
     --artifact-uri=$EXPORT_PATH
MODEL_ID=$(gcloud ai models list --region=$REGION --format='value(MODEL_ID)' --filter=display_name=${MODEL_NAME})
echo "MODEL_ID=$MODEL_ID"

# deploy model to endpoint
gcloud ai endpoints deploy-model $ENDPOINT_ID \
  --region=$REGION \
  --model=$MODEL_ID \
  --display-name=$MODEL_NAME \
  --machine-type=n1-standard-2 \
  --min-replica-count=1 \
  --max-replica-count=1 \
  --traffic-split=0=100

The output is:

gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-141057/
Creating Endpoint for flights-20220804-141148
ENDPOINT_ID=5065203778559410176
MODEL_ID=

Followed by:

Using endpoint [https://us-central1-aiplatform.googleapis.com/]
WARNING: The following filter keys were not present in any resource : display_name
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Waiting for operation [6652731786897915904]...
.....done.
Created Vertex AI endpoint: projects/591020730428/locations/us-central1/endpoints/5065203778559410176.
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
WARNING: The following filter keys were not present in any resource : display_name
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
Waiting for operation [8710876816606232576]...
.....................done.
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
ERROR: gcloud crashed (InvalidDataFromServerError): Error decoding response "{
  "models": [
    {
      "name": "projects/591020730428/locations/us-central1/models/4091005690024296448",
      "displayName": "flights-20220804-141148",
      "predictSchemata": {},
      "containerSpec": {
        "imageUri": "us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-8:latest"
      },
      "supportedDeploymentResourcesTypes": [
        "DEDICATED_RESOURCES",
        "SHARED_RESOURCES"
      ],
      "supportedInputStorageFormats": [
        "jsonl",
        "bigquery",
        "csv",
        "tf-record",
        "tf-record-gzip",
        "file-list"
      ],
      "supportedOutputStorageFormats": [
        "jsonl",
        "bigquery"
      ],
      "createTime": "2022-08-04T14:11:58.599736Z",
      "updateTime": "2022-08-04T14:12:01.166117Z",
      "etag": "AMEw9yNjmCwSyM4YLnftSHzzrI6Q3FWiIjTuQLd54oKZ-wGaHroIIX5Z5jB8l-Pauxk=",
      "supportedExportFormats": [
        {
          "id": "custom-trained",
          "exportableContents": [
            "ARTIFACT"
          ]
        }
      ],
      "artifactUri": "gs://tribbute-ml-central/ch9/trained_model/export/flights_20220804-141057/",
      "versionId": "1",
      "versionAliases": [
        "default"
      ],
      "versionCreateTime": "2022-08-04T14:11:58.599736Z",
      "versionUpdateTime": "2022-08-04T14:12:01.166117Z"
    }
  ]
}
" as type GoogleCloudAiplatformV1ListModelsResponse: Repeated values for field supportedDeploymentResourcesTypes may not be None

If you would like to report this issue, please run the following command:
  gcloud feedback

To check gcloud for common problems, please run the following command:
  gcloud info --run-diagnostics
Using endpoint [https://us-central1-aiplatform.googleapis.com/]
ERROR: (gcloud.ai.endpoints.deploy-model) could not parse resource []
This was followed by the same CalledProcessError traceback as in the original report above.

For some reason, the MODEL_ID is not retrieved. I went to check whether the model was created, and flights-20220804-141148 (the correct model) had indeed been created.
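A defensive check one could add after the lookup (a sketch, not part of the book's notebook): if the MODEL_ID lookup comes back empty, skip the deploy step with a clear message instead of letting gcloud fail later with "could not parse resource []".

```shell
# Stand-in for: $(gcloud ai models list ... --format='value(MODEL_ID)'), which in
# this failure mode returns an empty string.
MODEL_ID=""
if [ -z "${MODEL_ID}" ]; then
    # Fail fast with an actionable message instead of passing an empty --model.
    echo "MODEL_ID lookup failed; skipping deploy-model"
else
    echo "deploying model ${MODEL_ID}"
fi
```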

Thanks, Chris

shinchri commented 2 years ago

While I was looking for a solution, I found that other people seem to be having the same issue with gcloud ai models list: https://issuetracker.google.com/issues/237019264

It seems like one of them solved the issue by updating gcloud to the most recent version, so I tried this at the top of the notebook:

%%bash
pip install --upgrade gcloud

However, that still didn't seem to work for me.

So far the only workaround seems to be manually deploying the model to the endpoint. You can do that by first creating the endpoint and uploading the model. Then go to the Vertex AI Model Registry, open the actions dropdown menu next to the model, and select "Deploy to Endpoint". You can track progress by going to Endpoints and selecting the endpoint.
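A scripted alternative that sidesteps the broken lookup, sketched under the assumption that `gcloud ai models list --uri` still prints full resource URIs even when gcloud's value(MODEL_ID) formatter crashes while decoding the response: the numeric model ID is just the last path segment of the URI, so it can be extracted with shell parameter expansion.

```shell
# Hard-coded example URI (taken from the error output above); in the notebook this
# would come from: gcloud ai models list --region=$REGION --uri | tail -1
MODEL_URI="projects/591020730428/locations/us-central1/models/4091005690024296448"
MODEL_ID="${MODEL_URI##*/}"   # strip everything up to and including the last '/'
echo "MODEL_ID=${MODEL_ID}"
```

The extracted ID can then be passed to gcloud ai endpoints deploy-model as before.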

lakshmanok commented 2 years ago

https://cloud.google.com/vertex-ai/docs/training/pre-built-containers

thanks, Lak


jgammerman commented 1 year ago

I'm getting the same error as @shinchri . Tried using his workaround of manually deploying the model to a new endpoint using the GUI but unfortunately it still doesn't work.

lakshmanok commented 1 year ago

James,

Could you check the TensorFlow version you are using, and the list of supported versions at https://cloud.google.com/vertex-ai/docs/predictions/pre-built-containers

thanks Lak


jgammerman commented 1 year ago

Thanks! I downgraded my TF version from 2.11 to 2.9 and that fixed the error :)

Interestingly the returned predictions don't sum to 1:

{
  "predictions": [
    [
      0.582659423
    ],
    [
      0.973581493
    ]
  ],
  "deployedModelId": "2397282794226057216",
  "model": "projects/506913857436/locations/us-central1/models/372200079164964864",
  "modelDisplayName": "flights-v2-20230213-141845",
  "modelVersionId": "1"
}

whereas yours were 0.22877 and 0.76613 respectively, which sum to ~1. Any idea why that might be @lakshmanok?

For anyone who needs to fix the deployment error: I opened a terminal and ran pip uninstall tensorflow to uninstall TF (run pip show tensorflow to check it worked), then ran pip install tensorflow==2.9.0 --user.

Restart the kernel, change your endpoint name (I changed mine from flights to flights-v2) and the notebook should work now.

lakshmanok commented 1 year ago

Lol. That's an extremely unlikely coincidence. Note that the predictions are for two different inputs, so there is no reason whatsoever that they should sum to 1.

{"instances": [
  {"dep_hour": 2, "is_weekday": 1, "dep_delay": 40, "taxi_out": 17, "distance": 41,
   "carrier": "AS", "dep_airport_lat": 58.42527778, "dep_airport_lon": -135.7075,
   "arr_airport_lat": 58.35472222, "arr_airport_lon": -134.57472222,
   "origin": "GST", "dest": "JNU"},
  {"dep_hour": 22, "is_weekday": 0, "dep_delay": -7, "taxi_out": 7, "distance": 201,
   "carrier": "HA", "dep_airport_lat": 21.97611111, "dep_airport_lon": -159.33888889,
   "arr_airport_lat": 20.89861111, "arr_airport_lon": -156.43055556,
   "origin": "LIH", "dest": "OGG"}
]}
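
To see why the two numbers are unrelated: each prediction is an independent per-input probability (a sigmoid over a single logit in a binary classifier like this one), not two classes of one softmax. A minimal illustration with hypothetical logits chosen to land near the values above:

```python
import math

def sigmoid(logit: float) -> float:
    """Map a single logit to an independent probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-logit))

# Two unrelated inputs produce two unrelated probabilities
# (the logit values here are hypothetical, for illustration only).
p1 = sigmoid(0.33)  # roughly 0.58
p2 = sigmoid(3.6)   # roughly 0.97
print(p1, p2, p1 + p2)  # the sum has no reason to be 1.0
```

Only a softmax over the classes of a *single* input is constrained to sum to 1; sigmoids over separate inputs are not.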

jgammerman commented 1 year ago

Ah yes, good point!