Closed: hirayukis closed this issue 1 year ago
Hi @hirayukis, what was the cause/resolution of this error?
Hi @arpitjasa-db, thank you for your reply.
Strictly speaking, it hasn't been resolved yet.
I added `TF_CLI_ARGS: -parallelism=1` to the GitHub workflow.
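For reference, this is roughly where the variable lives in the workflow file; the placement shown is illustrative and only the `TF_CLI_ARGS` entry is the actual change:

```yaml
# Illustrative placement in the GitHub Actions workflow; the only real change is
# the TF_CLI_ARGS entry, which tells Terraform to create resources one at a time
# instead of in parallel.
env:
  TF_CLI_ARGS: -parallelism=1
```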
Sometimes that makes the problem go away (no 429 error), but errors like the following still show up in the logs:
time=2023-05-25T07:06:24.972Z level=INFO source=state_pull.go:38 msg="Remote state file does not exist" mutator=deploy mutator=deferred mutator=terraform:state-pull
time=2023-05-25T07:06:24.972Z level=DEBUG source=mutator.go:27 msg=Apply mutator=deploy mutator=deferred mutator=terraform.Apply
Starting resource deployment
time=2023-05-25T07:06:26.876Z level=ERROR source=mutator.go:30 msg="Error: terraform apply: exit status 1\n\nError: failed to read schema for databricks_permissions.mlflow_model_model in registry.terraform.io/databricks/databricks: failed to instantiate provider \"registry.terraform.io/databricks/databricks\" to obtain schema: Unrecognized remote plugin message: \n\nThis usually means that the plugin is either invalid or simply\nneeds to be recompiled to support the latest protocol.\n\n" mutator=deploy mutator=deferred mutator=terraform.Apply
time=2023-05-25T07:06:26.876Z level=DEBUG source=mutator.go:27 msg=Apply mutator=deploy mutator=deferred mutator=lock:release
time=2023-05-25T07:06:26.876Z level=INFO source=release.go:34 msg="Releasing deployment lock" mutator=deploy mutator=deferred mutator=lock:release
time=2023-05-25T07:06:27.068Z level=DEBUG source=client.go:255 msg="GET /api/2.0/workspace-files/Users/***/.bundle/my-mlops-project/test/state/deploy.lock\n< HTTP/2.0 200 OK\n< {\n< \"AcquisitionTime\": \"2023-05-25T07:06:21.177480483Z\",\n< \"ID\": \"f5c50a2e-4469-4828-a737-60ff9c330da9\",\n< \"IsForced\": false,\n< \"User\": \"***\"\n< }" mutator=deploy mutator=deferred mutator=lock:release sdk=true
time=2023-05-25T07:06:27.232Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace/delete\n> {\n> \"path\": \"/Users/***/.bundle/my-mlops-project/test/state/deploy.lock\"\n> }\n< HTTP/2.0 200 OK\n< {}" mutator=deploy mutator=deferred mutator=lock:release sdk=true
time=2023-05-25T07:06:27.232Z level=ERROR source=mutator.go:30 msg="Error: terraform apply: exit status 1\n\nError: failed to read schema for databricks_permissions.mlflow_model_model in registry.terraform.io/databricks/databricks: failed to instantiate provider \"registry.terraform.io/databricks/databricks\" to obtain schema: Unrecognized remote plugin message: \n\nThis usually means that the plugin is either invalid or simply\nneeds to be recompiled to support the latest protocol.\n\n" mutator=deploy mutator=deferred
Error: terraform apply: exit status 1
Error: failed to read schema for databricks_permissions.mlflow_model_model in registry.terraform.io/databricks/databricks: failed to instantiate provider "registry.terraform.io/databricks/databricks" to obtain schema: Unrecognized remote plugin message:
This usually means that the plugin is either invalid or simply
needs to be recompiled to support the latest protocol.
time=2023-05-25T07:06:27.232Z level=ERROR source=root.go:96 msg="failed execution" exit_code=1 error="terraform apply: exit status 1\n\nError: failed to read schema for databricks_permissions.mlflow_model_model in registry.terraform.io/databricks/databricks: failed to instantiate provider \"registry.terraform.io/databricks/databricks\" to obtain schema: Unrecognized remote plugin message: \n\nThis usually means that the plugin is either invalid or simply\nneeds to be recompiled to support the latest protocol.\n\n"
Error: Process completed with exit code 1.
Hi @hirayukis, a 429 error generally refers to rate limiting. It seems you may be running into some rate limits?
Yes, maybe.
I use Azure Databricks. There are rate limits, but I can't control them from my side.
https://docs.databricks.com/resources/limits.html#limits-api-rate-limits
https://learn.microsoft.com/ja-jp/azure/databricks/resources/limits#limits-api-rate-limits
Is this happening only in my environment? 😔
@hirayukis how often are you running into rate limits? Are there any other users in your workspace who could be heavily using the APIs?
@arpitjasa-db
I'm not sure, since it depends on what `databricks bundle deploy` does; it is invoked
at this line: https://github.com/databricks/mlops-stack/blob/49d5615954d80641ed446a6608af816d9ae4d16c/%7B%7Bcookiecutter.root_dir__update_if_you_intend_to_use_monorepo%7D%7D/.github/workflows/%7B%7Bcookiecutter.project_name%7D%7D-run-tests-fs.yml#L57
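For context, that step is roughly of the following shape; the step name and working directory are my guesses, and the real definition is in the linked file:

```yaml
# Rough sketch of the deploy step (step name and working directory are
# illustrative; the actual step is at the linked line in the mlops-stack workflow).
- name: Deploy bundle to test environment
  run: databricks bundle deploy -e test --log-level DEBUG
  working-directory: ./my_mlops_project
```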
No one else is using this workspace.
But judging by the logs, a single deploy makes about 52 requests, counting the `POST /api/2.0/workspace-files/****` calls.
Run databricks bundle deploy -e test --log-level DEBUG
time=2023-05-25T15:28:41.919Z level=INFO source=root.go:41 msg=start version=0.100.1 args="databricks, bundle, deploy, -e, test, --log-level, DEBUG"
time=2023-05-25T15:28:41.921Z level=DEBUG source=mutator.go:27 msg=Apply mutator=DefineDefaultInclude
time=2023-05-25T15:28:41.921Z level=DEBUG source=mutator.go:27 msg=Apply mutator=ProcessRootIncludes
time=2023-05-25T15:28:41.921Z level=DEBUG source=mutator.go:27 msg=Apply mutator=ProcessRootIncludes mutator=ProcessInclude(databricks-resources/ml-artifacts-resource.yml)
time=2023-05-25T15:28:41.922Z level=DEBUG source=mutator.go:27 msg=Apply mutator=ProcessRootIncludes mutator=ProcessInclude(databricks-resources/model-workflow-resource.yml)
time=2023-05-25T15:28:41.922Z level=DEBUG source=mutator.go:27 msg=Apply mutator=ProcessRootIncludes mutator=ProcessInclude(databricks-resources/batch-inference-workflow-resource.yml)
time=2023-05-25T15:28:41.923Z level=DEBUG source=mutator.go:27 msg=Apply mutator=ProcessRootIncludes mutator=ProcessInclude(databricks-resources/monitoring-workflow-resource.yml)
time=2023-05-25T15:28:41.923Z level=DEBUG source=mutator.go:27 msg=Apply mutator=DefineDefaultEnvironment(default)
time=2023-05-25T15:28:41.923Z level=DEBUG source=mutator.go:27 msg=Apply mutator=LoadGitDetails
time=2023-05-25T15:28:41.923Z level=DEBUG source=mutator.go:27 msg=Apply mutator=SelectEnvironment(test)
time=2023-05-25T15:28:41.923Z level=DEBUG source=mutator.go:27 msg=Apply mutator=initialize
time=2023-05-25T15:28:41.923Z level=INFO source=phase.go:30 msg="Phase: initialize" mutator=initialize
time=2023-05-25T15:28:41.923Z level=DEBUG source=mutator.go:27 msg=Apply mutator=initialize mutator=PopulateCurrentUser
time=2023-05-25T15:28:41.923Z level=INFO source=auth_azure_client_secret.go:53 msg="Generating AAD token for Service Principal (***)" mutator=initialize mutator=PopulateCurrentUser sdk=true
time=2023-05-25T15:28:43.809Z level=DEBUG source=client.go:255 msg="GET /api/2.0/preview/scim/v2/Me\n< HTTP/2.0 200 OK\n< {\n< \"active\": true,\n< \"displayName\": \"study-service-principal\",\n< \"emails\": [\n< {\n< \"primary\": true,\n< \"type\": \"work\",\n< \"value\": \"***\"\n< }\n< ],\n< \"entitlements\": [\n< {\n< \"value\": \"workspace-access\"\n< },\n< {\n< \"value\": \"databricks-sql-access\"\n< },\n< {\n< \"value\": \"allow-cluster-create\"\n< }\n< ],\n< \"externalId\": \"58bc4b48-bb74-4e19-b84a-9a19111014a2\",\n< \"groups\": [\n< {\n< \"$ref\": \"Groups/636044987838930\",\n< \"display\": \"admins\",\n< \"type\": \"direct\",\n< \"value\": \"636044987838930\"\n< }\n< ],\n< \"id\": \"2861949997501535\",\n< \"name\": {\n< \"givenName\": \"study-service-principal\"\n< },\n< \"schemas\": [\n< \"urn:ietf:params:scim:schemas:core:2.0:User\",\n< \"urn:ietf:params:scim:schemas:extension:workspace:2.0:User\"\n< ],\n< \"userName\": \"***\"\n< }" mutator=initialize mutator=PopulateCurrentUser sdk=true
time=2023-05-25T15:28:43.809Z level=DEBUG source=mutator.go:27 msg=Apply mutator=initialize mutator=DefineDefaultWorkspaceRoot
time=2023-05-25T15:28:43.809Z level=DEBUG source=mutator.go:27 msg=Apply mutator=initialize mutator=ExpandWorkspaceRoot
time=2023-05-25T15:28:43.809Z level=DEBUG source=mutator.go:27 msg=Apply mutator=initialize mutator=DefaultWorkspacePaths
time=2023-05-25T15:28:43.809Z level=DEBUG source=mutator.go:27 msg=Apply mutator=initialize mutator=SetVariables
time=2023-05-25T15:28:43.809Z level=DEBUG source=mutator.go:27 msg=Apply mutator=initialize mutator=Interpolate
time=2023-05-25T15:28:43.810Z level=DEBUG source=mutator.go:27 msg=Apply mutator=initialize mutator=TranslatePaths
time=2023-05-25T15:28:43.810Z level=DEBUG source=mutator.go:27 msg=Apply mutator=initialize mutator=terraform.Initialize
time=2023-05-25T15:28:43.810Z level=DEBUG source=init.go:53 msg="Using Terraform at /home/runner/work/databricks-study/databricks-study/my_mlops_project/.databricks/bundle/test/bin/terraform" mutator=initialize mutator=terraform.Initialize
time=2023-05-25T15:28:43.810Z level=DEBUG source=init.go:151 msg="Environment variables for Terraform: TMPDIR, DATABRICKS_HOST, ARM_CLIENT_SECRET, ARM_CLIENT_ID, ARM_TENANT_ID, ARM_ENVIRONMENT, DATABRICKS_AUTH_TYPE, HOME" mutator=initialize mutator=terraform.Initialize
time=2023-05-25T15:28:43.810Z level=DEBUG source=mutator.go:27 msg=Apply mutator=build
time=2023-05-25T15:28:43.810Z level=INFO source=phase.go:30 msg="Phase: build" mutator=build
time=2023-05-25T15:28:43.810Z level=DEBUG source=mutator.go:27 msg=Apply mutator=build mutator=artifacts.BuildAll
time=2023-05-25T15:28:43.810Z level=DEBUG source=mutator.go:27 msg=Apply mutator=build mutator=Interpolate
time=2023-05-25T15:28:43.811Z level=DEBUG source=mutator.go:27 msg=Apply mutator=deploy
time=2023-05-25T15:28:43.811Z level=INFO source=phase.go:30 msg="Phase: deploy" mutator=deploy
time=2023-05-25T15:28:43.811Z level=DEBUG source=mutator.go:27 msg=Apply mutator=deploy mutator=deferred
time=2023-05-25T15:28:43.811Z level=DEBUG source=mutator.go:27 msg=Apply mutator=deploy mutator=deferred mutator=lock:acquire
time=2023-05-25T15:28:43.811Z level=INFO source=acquire.go:46 msg="Acquiring deployment lock (force: false)" mutator=deploy mutator=deferred mutator=lock:acquire
time=2023-05-25T15:28:44.520Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/state/deploy.lock?overwrite=false\n> {\n> \"AcquisitionTime\": \"2023-05-25T15:28:43.811495862Z\",\n> \"ID\": \"b967d8f9-4953-4e78-a558-3e75040e796f\",\n> \"IsForced\": false,\n> \"User\": \"***\"\n> }\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=lock:acquire sdk=true
time=2023-05-25T15:28:44.866Z level=DEBUG source=client.go:255 msg="GET /api/2.0/workspace-files/Users/***/.bundle/my-mlops-project/test/state/deploy.lock\n< HTTP/2.0 200 OK\n< {\n< \"AcquisitionTime\": \"2023-05-25T15:28:43.811495862Z\",\n< \"ID\": \"b967d8f9-4953-4e78-a558-3e75040e796f\",\n< \"IsForced\": false,\n< \"User\": \"***\"\n< }" mutator=deploy mutator=deferred mutator=lock:acquire sdk=true
time=2023-05-25T15:28:44.866Z level=DEBUG source=mutator.go:27 msg=Apply mutator=deploy mutator=deferred mutator=files.Upload
Starting upload of bundle files
time=2023-05-25T15:28:45.086Z level=DEBUG source=client.go:255 msg="GET /api/2.0/preview/scim/v2/Me\n< HTTP/2.0 200 OK\n< {\n< \"active\": true,\n< \"displayName\": \"study-service-principal\",\n< \"emails\": [\n< {\n< \"primary\": true,\n< \"type\": \"work\",\n< \"value\": \"***\"\n< }\n< ],\n< \"entitlements\": [\n< {\n< \"value\": \"workspace-access\"\n< },\n< {\n< \"value\": \"databricks-sql-access\"\n< },\n< {\n< \"value\": \"allow-cluster-create\"\n< }\n< ],\n< \"externalId\": \"58bc4b48-bb74-4e19-b84a-9a19111014a2\",\n< \"groups\": [\n< {\n< \"$ref\": \"Groups/636044987838930\",\n< \"display\": \"admins\",\n< \"type\": \"direct\",\n< \"value\": \"636044987838930\"\n< }\n< ],\n< \"id\": \"2861949997501535\",\n< \"name\": {\n< \"givenName\": \"study-service-principal\"\n< },\n< \"schemas\": [\n< \"urn:ietf:params:scim:schemas:core:2.0:User\",\n< \"urn:ietf:params:scim:schemas:extension:workspace:2.0:User\"\n< ],\n< \"userName\": \"***\"\n< }" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.225Z level=DEBUG source=client.go:255 msg="GET /api/2.0/workspace/get-status?path=/Users/***/.bundle/my-mlops-project/test/files\n< HTTP/2.0 200 OK\n< {\n< \"object_id\": 254075285984279,\n< \"object_type\": \"DIRECTORY\",\n< \"path\": \"/Users/***/.bundle/my-mlops-project/test/files\"\n< }" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.225Z level=DEBUG source=path.go:104 msg="Path /Users/***/.bundle/my-mlops-project/test/files has type directory (ID: 254075285984279)" mutator=deploy mutator=deferred mutator=files.Upload
time=2023-05-25T15:28:45.723Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/deployment/batch_inference/predict.py?overwrite=true\n[non-JSON document of 1016 bytes]. import mlflow\nfrom pyspark.sql.functions import struct, lit, to_timestamp\n\n\ndef predict_batch(\n ... (920 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.745Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/deployment/batch_inference/README.md?overwrite=true\n[non-JSON document of 571 bytes]. # Batch Inference\nTo set up batch inference job via scheduled Databricks workflow, please refer ... (475 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.809Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/databricks-resources/monitoring-workflow-resource.yml?overwrite=true\n[non-JSON document of 45 bytes]. # TODO: Add data monitoring support for mlops\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.810Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/databricks-resources/batch-inference-workflow-resource.yml?overwrite=true\n[non-JSON document of 1466 bytes]. new_cluster: &new_cluster\n new_cluster:\n num_workers: 3\n spark_version: 12.2.x-cpu-ml-sca... (1370 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.810Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/databricks-resources/model-workflow-resource.yml?overwrite=true\n[non-JSON document of 4765 bytes]. new_cluster: &new_cluster\n new_cluster:\n num_workers: 3\n spark_version: 12.2.x-cpu-ml-sca... (4669 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.811Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/deployment/batch_inference/Prepare data.py?overwrite=true\n[non-JSON document of 263 bytes]. # Databricks notebook source\ndf = spark.table(\n \"delta.`dbfs:/databricks-datasets/nyctaxi-wit... (167 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.812Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/__init__.py?overwrite=true\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.814Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/pytest.ini?overwrite=true\n[non-JSON document of 187 bytes]. # Configure pytest to detect local modules in the current directory\n# See https://docs.pytest.or... (91 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.861Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/bundle.yml?overwrite=true\n[non-JSON document of 1369 bytes]. variables:\n experiment_name:\n description: Experiment name for the model training.\n defau... (1273 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.869Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/README.md?overwrite=true\n[non-JSON document of 226 bytes]. # my-mlops-project\n\nThis directory contains python code, notebooks and ML resource configs relat... (130 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.948Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/deployment/model_deployment/deploy.py?overwrite=true\n[non-JSON document of 1334 bytes]. import sys\nimport pathlib\n\nsys.path.append(str(pathlib.Path(__file__).parent.parent.parent.resol... (1238 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.948Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/databricks-resources/README.md?overwrite=true\n[non-JSON document of 16502 bytes]. # Databricks ML Resource Configurations\n[(back to main README)](../../README.md)\n\n## Table of co... (16406 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.949Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/requirements.txt?overwrite=true\n[non-JSON document of 124 bytes]. mlflow==2.2.2\nnumpy>=1.23.0\npandas>=1.4.3\nscikit-learn>=1.1.1\nmatplotlib>=3.5.2\nJinja2==3.0.3\npy... (28 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.949Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/tests/training/ingest_test.py?overwrite=true\n[non-JSON document of 587 bytes]. import pytest\nimport os\nimport tempfile\nimport pandas as pd\nfrom pandas import DataFrame\nfrom my... (491 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.949Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/deployment/batch_inference/notebooks/BatchInference.py?overwrite=true\n[non-JSON document of 3454 bytes]. # Databricks notebook source\n###################################################################... (3358 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.949Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/monitoring/README.md?overwrite=true\n[non-JSON document of 141 bytes]. # Monitoring\n\nDatabricks Data Monitoring is currently in Private Preview. \n\nPlease contact a Dat... (45 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.950Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/tests/__init__.py?overwrite=true\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.951Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/databricks-resources/ml-artifacts-resource.yml?overwrite=true\n[non-JSON document of 1904 bytes]. # Environment specific values\nenvironments:\n dev:\n resources:\n models:\n model:\n ... (1808 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:45.951Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/tests/training/__init__.py?overwrite=true\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.010Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/deployment/model_deployment/notebooks/ModelDeployment.py?overwrite=true\n[non-JSON document of 2525 bytes]. # Databricks notebook source\n###################################################################... (2429 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.214Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/tests/training/split_test.py?overwrite=true\n[non-JSON document of 1105 bytes]. import pytest\nimport os\nimport pandas as pd\nfrom pandas import DataFrame\nfrom my_mlops_project.t... (1009 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.232Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/tests/training/test_notebooks.py?overwrite=true\n[non-JSON document of 309 bytes]. import pathlib\n\n\ndef test_notebook_format():\n # Verify that all Databricks notebooks have the... (213 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.274Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/tests/training/test_sample.parquet?overwrite=true\n[non-JSON document of 5613 bytes]. PAR1\x15\x04\x15\xa0\x01\x15\xa6\x01L\x15\x14\x15\x04\x12\x00\x00P\xf0O@t[\xbc\xad+\x05\x00@\xa7\xa1\xf5\xaa+\x05\x00\x80&\x94%!\x1f(MISSING)+\x05\x00\xc0\x96#^\x97+\x05\x00\x00c\x14gm,\x05\x00\xc0?1\f\x9c+\x05\x00@\xaa\xde\x05\x14,\x05\x00@\xe8\x86\x1d\x11,\x05\x00\x80\xe1kY\xdb*\x05\x00\xc0... (5517 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.277Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/tests/training/train_test.py?overwrite=true\n[non-JSON document of 427 bytes]. from my_mlops_project.training.steps.train import estimator_fn\nfrom sklearn.utils.estimator_chec... (331 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.294Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/__init__.py?overwrite=true\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.318Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/README.md?overwrite=true\n[non-JSON document of 7322 bytes]. # ML Developer Guide using MLflow Recipes\n\n[(back to main README)](../../README.md)\n\n## Table of... (7226 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.328Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/tests/training/transform_test.py?overwrite=true\n[non-JSON document of 279 bytes]. from my_mlops_project.training.steps.transform import transformer_fn\n\n\ndef test_tranform_fn_retu... (183 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.410Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/notebooks/Train.py?overwrite=true\n[non-JSON document of 4792 bytes]. # Databricks notebook source\n###################################################################... (4696 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.413Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/profiles/databricks-staging.yaml?overwrite=true\n[non-JSON document of 2161 bytes]. experiment:\n # The name of the experiment to use during training or model validation in staging... (2065 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.413Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/profiles/databricks-dev.yaml?overwrite=true\n[non-JSON document of 2062 bytes]. # This profile contains config overrides for model development on Databricks\nexperiment:\n # The... (1966 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.416Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/recipe.yaml?overwrite=true\n[non-JSON document of 2751 bytes]. # `recipe.yaml` is the main configuration file for an MLflow Recipe.\n# Required recipe parameter... (2655 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.419Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/steps/split.py?overwrite=true\n[non-JSON document of 1380 bytes]. \"\"\"\nThis module defines the following routines used by the 'split' step of the regression recipe... (1284 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.430Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/steps/custom_metrics.py?overwrite=true\n[non-JSON document of 1937 bytes]. \"\"\"\nThis module defines custom metric functions that are invoked during the 'train' and 'evaluat... (1841 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.431Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/profiles/local.yaml?overwrite=true\n[non-JSON document of 1631 bytes]. experiment:\n name: \"/Shared/my-mlops-project\"\n tracking_uri: \"sqlite:///mlruns.db\"\n artifact_... (1535 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.432Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/profiles/databricks-prod.yaml?overwrite=true\n[non-JSON document of 1968 bytes]. experiment:\n # The name of the experiment to use during training or model validation in prod en... (1872 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.441Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/steps/ingest.py?overwrite=true\n[non-JSON document of 1565 bytes]. \"\"\"\nThis module defines the following routines used by the 'ingest' step of the regression recip... (1469 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.441Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/profiles/databricks-test.yaml?overwrite=true\n[non-JSON document of 2131 bytes]. experiment:\n # The name of the experiment to use during training or model validation in test en... (2035 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.441Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/steps/__init__.py?overwrite=true\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.489Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/steps/train.py?overwrite=true\n[non-JSON document of 558 bytes]. \"\"\"\nThis module defines the following routines used by the 'train' step of the regression recipe... (462 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.552Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/steps/transform.py?overwrite=true\n[non-JSON document of 2272 bytes]. \"\"\"\nThis module defines the following routines used by the 'transform' step of the regression re... (2176 more bytes)\n< HTTP/2.0 429 Too Many Requests (Error: Current request has to be retried)\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.568Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/utils.py?overwrite=true\n[non-JSON document of 949 bytes]. \"\"\"\nThis module contains utils shared between different notebooks\n\"\"\"\nimport json\nimport mlflow\n... (853 more bytes)\n< HTTP/2.0 429 Too Many Requests (Error: Current request has to be retried)\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.621Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/validation/notebooks/ModelValidation.py?overwrite=true\n[non-JSON document of 12405 bytes]. # Databricks notebook source\n###################################################################... (12309 more bytes)\n< HTTP/2.0 429 Too Many Requests (Error: Current request has to be retried)\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.621Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/validation/README.md?overwrite=true\n[non-JSON document of 188 bytes]. # Model Validation\nTo enable model validation as part of scheduled databricks workflow, please r... (92 more bytes)\n< HTTP/2.0 429 Too Many Requests (Error: Current request has to be retried)\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.630Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/validation/validation.py?overwrite=true\n[non-JSON document of 2053 bytes]. import numpy as np\nfrom mlflow.models import make_metric, MetricThreshold\n\n# Custom metrics to b... (1957 more bytes)\n< HTTP/2.0 429 Too Many Requests (Error: Current request has to be retried)\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:46.936Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/data/sample.parquet?overwrite=true\n[non-JSON document of 232389 bytes]. PAR1\x15\x04\x15\xc0\xe0\t\x15\xac\xde\bL\x15\x88\x9c\x01\x15\x04\x12\x00\x00\xa0\xf0\x04\xb0@t[\xbc\xad+\x05\x00@\xa7\xa1\xf5\xaa+\x05\x00\x80&\x94%!\x1f(MISSING)+\x05\x00\xc0\x96#^\x97+\x05\x00\x00c\x14gm,\x05\x00\xc0?1\f\x9c\x01(l\xaa\xde\x05\x14,\x05\x00@\xe8\x86\x1d\x11,\x05\x00\x80\xe1kY\xdb... (232293 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:48.190Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/training/steps/transform.py?overwrite=true\n[non-JSON document of 2272 bytes]. \"\"\"\nThis module defines the following routines used by the 'transform' step of the regression re... (2176 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:48.205Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/validation/README.md?overwrite=true\n[non-JSON document of 188 bytes]. # Model Validation\nTo enable model validation as part of scheduled databricks workflow, please r... (92 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:48.278Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/utils.py?overwrite=true\n[non-JSON document of 949 bytes]. \"\"\"\nThis module contains utils shared between different notebooks\n\"\"\"\nimport json\nimport mlflow\n... (853 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:48.623Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/validation/validation.py?overwrite=true\n[non-JSON document of 2053 bytes]. import numpy as np\nfrom mlflow.models import make_metric, MetricThreshold\n\n# Custom metrics to b... (1957 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
time=2023-05-25T15:28:48.944Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace-files/import-file/Users/***/.bundle/my-mlops-project/test/files/validation/notebooks/ModelValidation.py?overwrite=true\n[non-JSON document of 12405 bytes]. # Databricks notebook source\n###################################################################... (12309 more bytes)\n< HTTP/2.0 200 OK\n" mutator=deploy mutator=deferred mutator=files.Upload sdk=true
Uploaded bundle files at /Users/***/.bundle/my-mlops-project/test/files!
time=2023-05-25T15:28:48.945Z level=DEBUG source=mutator.go:27 msg=Apply mutator=deploy mutator=deferred mutator=artifacts.UploadAll
time=2023-05-25T15:28:48.945Z level=DEBUG source=mutator.go:27 msg=Apply mutator=deploy mutator=deferred mutator=Interpolate
time=2023-05-25T15:28:48.945Z level=DEBUG source=mutator.go:27 msg=Apply mutator=deploy mutator=deferred mutator=terraform.Write
time=2023-05-25T15:28:48.958Z level=DEBUG source=mutator.go:27 msg=Apply mutator=deploy mutator=deferred mutator=terraform:state-pull
time=2023-05-25T15:28:48.958Z level=INFO source=state_pull.go:33 msg="Opening remote state file" mutator=deploy mutator=deferred mutator=terraform:state-pull
time=2023-05-25T15:28:49.366Z level=DEBUG source=client.go:255 msg="GET /api/2.0/workspace-files/Users/***/.bundle/my-mlops-project/test/state/terraform.tfstate\n< HTTP/2.0 404 Not Found\n< {\n< \"message\": \"File not found.\"\n< }" mutator=deploy mutator=deferred mutator=terraform:state-pull sdk=true
time=2023-05-25T15:28:49.366Z level=INFO source=state_pull.go:38 msg="Remote state file does not exist" mutator=deploy mutator=deferred mutator=terraform:state-pull
time=2023-05-25T15:28:49.366Z level=DEBUG source=mutator.go:27 msg=Apply mutator=deploy mutator=deferred mutator=terraform.Apply
Starting resource deployment
time=2023-05-25T15:28:51.053Z level=ERROR source=mutator.go:30 msg="Error: terraform apply: exit status 1\n\nError: failed to read schema for databricks_job.batch_inference_job in registry.terraform.io/databricks/databricks: failed to instantiate provider \"registry.terraform.io/databricks/databricks\" to obtain schema: Unrecognized remote plugin message: \n\nThis usually means that the plugin is either invalid or simply\nneeds to be recompiled to support the latest protocol.\n\n" mutator=deploy mutator=deferred mutator=terraform.Apply
time=2023-05-25T15:28:51.053Z level=DEBUG source=mutator.go:27 msg=Apply mutator=deploy mutator=deferred mutator=lock:release
time=2023-05-25T15:28:51.053Z level=INFO source=release.go:34 msg="Releasing deployment lock" mutator=deploy mutator=deferred mutator=lock:release
time=2023-05-25T15:28:51.262Z level=DEBUG source=client.go:255 msg="GET /api/2.0/workspace-files/Users/***/.bundle/my-mlops-project/test/state/deploy.lock\n< HTTP/2.0 200 OK\n< {\n< \"AcquisitionTime\": \"2023-05-25T15:28:43.811495862Z\",\n< \"ID\": \"b967d8f9-4953-4e78-a558-3e75040e796f\",\n< \"IsForced\": false,\n< \"User\": \"***\"\n< }" mutator=deploy mutator=deferred mutator=lock:release sdk=true
time=2023-05-25T15:28:51.413Z level=DEBUG source=client.go:255 msg="POST /api/2.0/workspace/delete\n> {\n> \"path\": \"/Users/***/.bundle/my-mlops-project/test/state/deploy.lock\"\n> }\n< HTTP/2.0 200 OK\n< {}" mutator=deploy mutator=deferred mutator=lock:release sdk=true
time=2023-05-25T15:28:51.413Z level=ERROR source=mutator.go:30 msg="Error: terraform apply: exit status 1\n\nError: failed to read schema for databricks_job.batch_inference_job in registry.terraform.io/databricks/databricks: failed to instantiate provider \"registry.terraform.io/databricks/databricks\" to obtain schema: Unrecognized remote plugin message: \n\nThis usually means that the plugin is either invalid or simply\nneeds to be recompiled to support the latest protocol.\n\n" mutator=deploy mutator=deferred
Error: terraform apply: exit status 1
Error: failed to read schema for databricks_job.batch_inference_job in registry.terraform.io/databricks/databricks: failed to instantiate provider "registry.terraform.io/databricks/databricks" to obtain schema: Unrecognized remote plugin message:
This usually means that the plugin is either invalid or simply
needs to be recompiled to support the latest protocol.
time=2023-05-25T15:28:51.413Z level=ERROR source=root.go:96 msg="failed execution" exit_code=1 error="terraform apply: exit status 1\n\nError: failed to read schema for databricks_job.batch_inference_job in registry.terraform.io/databricks/databricks: failed to instantiate provider \"registry.terraform.io/databricks/databricks\" to obtain schema: Unrecognized remote plugin message: \n\nThis usually means that the plugin is either invalid or simply\nneeds to be recompiled to support the latest protocol.\n\n"
Error: Process completed with exit code 1.
Additionally, I found similar issues:
https://github.com/databricks/terraform-provider-databricks/issues/41
https://github.com/databricks/terraform-provider-databricks/issues/52
@hirayukis can you try running it again and see if you run into any more issues? We recently fixed something on our side to prevent this from happening.
@arpitjasa-db Thanks for your response. I reran the job and it was successful! Thank you so much, now I can use it in my pipeline!
@hirayukis glad to hear it!
I set up a CI/CD configuration and tried to run a pipeline.
However, it fails with the following error at deployment time.
Run databricks bundle deploy -e test --log-level DEBUG