databrickslabs / ucx

Automated migrations to Unity Catalog

Test failure: `test_table_migration_job_refreshes_migration_status[regular-migrate-tables]` #1758

Closed: github-actions[bot] closed this issue 5 months ago

github-actions[bot] commented 5 months ago
❌ test_table_migration_job_refreshes_migration_status[regular-migrate-tables]: AssertionError: No migration statuses found (11m49.565s)

```
AssertionError: No migration statuses found
assert 0 > 0
 +  where 0 = len([])
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added make_dbfs_data_copy fixture: dbfs:/mnt/TEST_MOUNT_NAME/a/b/AcCN
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema hive_metastore.migrate_5uomj: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_5uomj
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.migrate_5uomj', metastore_id=None, name='migrate_5uomj', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_5uomj.ucx_ttwp6: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_5uomj/ucx_ttwp6
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_5uomj.ucx_ttwp6', metastore_id=None, name='ucx_ttwp6', owner=None, pipeline_id=None, properties=None, row_filter=None, schema_name='migrate_5uomj', sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/migrate_5uomj/ucx_ttwp6', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_5uomj.ucx_tvmvo: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_5uomj/ucx_tvmvo
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_5uomj.ucx_tvmvo', metastore_id=None, name='ucx_tvmvo', owner=None, pipeline_id=None, properties=None, row_filter=None, schema_name='migrate_5uomj', sql_path=None, storage_credential_name=None, storage_location='dbfs:/tmp/ucx_test_WHPM', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_5uomj.ucx_tjoxc: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_5uomj/ucx_tjoxc
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_5uomj.ucx_tjoxc', metastore_id=None, name='ucx_tjoxc', owner=None, pipeline_id=None, properties=None, row_filter=None, schema_name='migrate_5uomj', sql_path=None, storage_credential_name=None, storage_location='dbfs:/mnt/TEST_MOUNT_NAME/a/b/AcCN', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_5uomj.ucx_thyr0: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_5uomj/ucx_thyr0
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_5uomj.ucx_thyr0', metastore_id=None, name='ucx_thyr0', owner=None, pipeline_id=None, properties=None, row_filter=None, schema_name='migrate_5uomj', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_5uomj.ucx_ttwp6', view_dependencies=None)
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_5uomj.ucx_thfby: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_5uomj/ucx_thfby
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_5uomj.ucx_thfby', metastore_id=None, name='ucx_thfby', owner=None, pipeline_id=None, properties=None, row_filter=None, schema_name='migrate_5uomj', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_5uomj.ucx_thyr0', view_dependencies=None)
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added catalog fixture: CatalogInfo(browse_only=False, catalog_type=, comment='', connection_name=None, created_at=1716534209516, created_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cvkcn', isolation_mode=, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='ucx_cvkcn', options=None, owner='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', properties=None, provider_name=None, provisioning_info=None, securable_kind=, securable_type='CATALOG', share_name=None, storage_location=None, storage_root=None, updated_at=1716534209516, updated_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d')
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema ucx_cvkcn.migrate_5uomj: https://DATABRICKS_HOST/explore/data/ucx_cvkcn/migrate_5uomj
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='ucx_cvkcn', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cvkcn.migrate_5uomj', metastore_id=None, name='migrate_5uomj', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema hive_metastore.ucx_sbyde: https://DATABRICKS_HOST/explore/data/hive_metastore/ucx_sbyde
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.ucx_sbyde', metastore_id=None, name='ucx_sbyde', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
[gw5] linux -- Python 3.10.14 /home/runner/work/ucx/ucx/.venv/bin/python
07:03 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Bjix/config.yml) doesn't exist.
07:03 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
07:03 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
07:03 INFO [databricks.labs.ucx.install] Fetching installations...
07:03 WARNING [databricks.labs.ucx.install] Existing installation at /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Bjix is corrupted. Skipping...
07:03 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
07:03 DEBUG [tests.integration.conftest] Waiting for clusters to start...
07:12 DEBUG [tests.integration.conftest] Waiting for clusters to start...
07:12 INFO [databricks.labs.ucx.install] Installing UCX v0.23.2+3420240524071221
07:12 INFO [databricks.labs.ucx.install] Creating ucx schemas...
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=experimental-workflow-linter
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-experimental
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
07:12 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Bjix/README for the next steps.
07:12 DEBUG [databricks.labs.ucx.installer.workflows] starting migrate-tables job: https://DATABRICKS_HOST#job/680003538481989
07:14 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 make_storage_dir fixtures
07:14 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 1 make_dbfs_data_copy fixtures
07:14 DEBUG [databricks.labs.ucx.mixins.fixtures] removing make_dbfs_data_copy fixture: dbfs:/mnt/TEST_MOUNT_NAME/a/b/AcCN
07:14 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 cluster fixtures
07:14 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 1 catalog fixtures
07:14 DEBUG [databricks.labs.ucx.mixins.fixtures] removing catalog fixture: CatalogInfo(browse_only=False, catalog_type=, comment='', connection_name=None, created_at=1716534209516, created_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cvkcn', isolation_mode=, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='ucx_cvkcn', options=None, owner='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', properties=None, provider_name=None, provisioning_info=None, securable_kind=, securable_type='CATALOG', share_name=None, storage_location=None, storage_root=None, updated_at=1716534209516, updated_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d')
07:14 DEBUG [databricks.labs.ucx.mixins.fixtures] ignoring error while catalog CatalogInfo(browse_only=False, catalog_type=, comment='', connection_name=None, created_at=1716534209516, created_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cvkcn', isolation_mode=, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='ucx_cvkcn', options=None, owner='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', properties=None, provider_name=None, provisioning_info=None, securable_kind=, securable_type='CATALOG', share_name=None, storage_location=None, storage_root=None, updated_at=1716534209516, updated_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d') teardown: Catalog 'ucx_cvkcn' does not exist.
07:14 INFO [databricks.labs.ucx.install] Deleting UCX v0.23.2+3420240524071221 from https://DATABRICKS_HOST
07:14 INFO [databricks.labs.ucx.install] Deleting inventory database ucx_sbyde
07:15 INFO [databricks.labs.ucx.install] Deleting jobs
07:15 INFO [databricks.labs.ucx.install] Deleting scan-tables-in-mounts-experimental job_id=670941295351171.
07:15 INFO [databricks.labs.ucx.install] Deleting assessment job_id=593298695817616.
07:15 INFO [databricks.labs.ucx.install] Deleting failing job_id=779076396130646.
07:15 INFO [databricks.labs.ucx.install] Deleting migrate-external-tables-ctas job_id=70857046857409.
07:15 INFO [databricks.labs.ucx.install] Deleting experimental-workflow-linter job_id=408225218089696.
07:15 INFO [databricks.labs.ucx.install] Deleting migrate-groups job_id=1099409507444274.
07:15 INFO [databricks.labs.ucx.install] Deleting migrate-groups-experimental job_id=463758709817145.
07:15 INFO [databricks.labs.ucx.install] Deleting migrate-tables-in-mounts-experimental job_id=734606162781156.
07:15 INFO [databricks.labs.ucx.install] Deleting migrate-external-hiveserde-tables-in-place-experimental job_id=669624303554219.
07:15 INFO [databricks.labs.ucx.install] Deleting migrate-tables job_id=680003538481989.
07:15 INFO [databricks.labs.ucx.install] Deleting validate-groups-permissions job_id=79809875146642.
07:15 INFO [databricks.labs.ucx.install] Deleting remove-workspace-local-backup-groups job_id=929571631437303.
07:15 INFO [databricks.labs.ucx.install] Deleting cluster policy
07:15 ERROR [databricks.labs.ucx.install] UCX Policy already deleted
07:15 INFO [databricks.labs.ucx.install] Deleting secret scope
07:15 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
07:15 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 workspace user fixtures
07:15 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 account group fixtures
07:15 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 workspace group fixtures
07:15 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 table fixtures
07:15 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 5 table fixtures
07:15 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_5uomj.ucx_ttwp6', metastore_id=None, name='ucx_ttwp6', owner=None, pipeline_id=None, properties=None, row_filter=None, schema_name='migrate_5uomj', sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/migrate_5uomj/ucx_ttwp6', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:15 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_5uomj.ucx_tvmvo', metastore_id=None, name='ucx_tvmvo', owner=None, pipeline_id=None, properties=None, row_filter=None, schema_name='migrate_5uomj', sql_path=None, storage_credential_name=None, storage_location='dbfs:/tmp/ucx_test_WHPM', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:15 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_5uomj.ucx_tjoxc', metastore_id=None, name='ucx_tjoxc', owner=None, pipeline_id=None, properties=None, row_filter=None, schema_name='migrate_5uomj', sql_path=None, storage_credential_name=None, storage_location='dbfs:/mnt/TEST_MOUNT_NAME/a/b/AcCN', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:15 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_5uomj.ucx_thyr0', metastore_id=None, name='ucx_thyr0', owner=None, pipeline_id=None, properties=None, row_filter=None, schema_name='migrate_5uomj', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=,
```
updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_5uomj.ucx_ttwp6', view_dependencies=None) 07:15 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_5uomj.ucx_thfby', metastore_id=None, name='ucx_thfby', owner=None, pipeline_id=None, properties=None, row_filter=None, schema_name='migrate_5uomj', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_5uomj.ucx_thyr0', view_dependencies=None) 07:15 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 3 schema fixtures 07:15 DEBUG [databricks.labs.ucx.mixins.fixtures] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.migrate_5uomj', metastore_id=None, name='migrate_5uomj', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) 07:15 DEBUG [databricks.labs.ucx.mixins.fixtures] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='ucx_cvkcn', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cvkcn.migrate_5uomj', metastore_id=None, name='migrate_5uomj', owner=None, properties=None, schema_id=None, storage_location=None, 
storage_root=None, updated_at=None, updated_by=None) 07:15 DEBUG [databricks.labs.ucx.mixins.fixtures] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.ucx_sbyde', metastore_id=None, name='ucx_sbyde', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) [gw5] linux -- Python 3.10.14 /home/runner/work/ucx/ucx/.venv/bin/python ```
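For context on what this test checks: after the migrate-tables workflow finishes, the test reads back the migration statuses and asserts the list is non-empty (`assert 0 > 0` above means it came back empty); the second failure below instead hits a five-minute polling deadline. A minimal sketch of that wait-then-assert pattern follows. The helper name and its parameters are hypothetical, not the actual ucx test code:

```python
import time


def wait_for_statuses(fetch_statuses, timeout=300.0, poll_interval=10.0,
                      sleep=time.sleep, clock=time.monotonic):
    """Poll fetch_statuses() until it returns a non-empty list.

    Raises TimeoutError after `timeout` seconds, mirroring the
    "Timed out after 0:05:00" failure mode in this issue. The `sleep`
    and `clock` parameters are injectable so the loop can be unit-tested
    without real waiting.
    """
    deadline = clock() + timeout
    while True:
        statuses = list(fetch_statuses())
        if statuses:
            # The integration test then asserts len(statuses) > 0;
            # "No migration statuses found" means this never happened.
            return statuses
        if clock() >= deadline:
            raise TimeoutError(f"Timed out after {timeout} seconds")
        sleep(poll_interval)
```

Either failure in this issue reduces to this loop observing an empty migration-status snapshot: once until the deadline (TimeoutError), or at the final assertion (AssertionError).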

Running from nightly #70

github-actions[bot] commented 5 months ago
❌ test_table_migration_job_refreshes_migration_status[regular-migrate-tables]: TimeoutError: Timed out after 0:05:00 (15m11.924s) ```
TimeoutError: Timed out after 0:05:00
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added make_dbfs_data_copy fixture: dbfs:/mnt/TEST_MOUNT_NAME/a/b/Tfu3
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema hive_metastore.migrate_gknc2: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_gknc2
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.migrate_gknc2', metastore_id=None, name='migrate_gknc2', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_gknc2.ucx_tveso: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_gknc2/ucx_tveso
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_gknc2.ucx_tveso', metastore_id=None, name='ucx_tveso', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052808'}, row_filter=None, schema_name='migrate_gknc2', sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/migrate_gknc2/ucx_tveso', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_gknc2.ucx_ttt0w: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_gknc2/ucx_ttt0w
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_gknc2.ucx_ttt0w', metastore_id=None, name='ucx_ttt0w', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052808'}, row_filter=None, schema_name='migrate_gknc2', sql_path=None, storage_credential_name=None, storage_location='dbfs:/tmp/ucx_test_kyHn', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_gknc2.ucx_tozui: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_gknc2/ucx_tozui
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_gknc2.ucx_tozui', metastore_id=None, name='ucx_tozui', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052808'}, row_filter=None, schema_name='migrate_gknc2', sql_path=None, storage_credential_name=None, storage_location='dbfs:/mnt/TEST_MOUNT_NAME/a/b/Tfu3', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_gknc2.ucx_tinih: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_gknc2/ucx_tinih
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_gknc2.ucx_tinih', metastore_id=None, name='ucx_tinih', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052808'}, row_filter=None, schema_name='migrate_gknc2', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_gknc2.ucx_tveso', view_dependencies=None)
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_gknc2.ucx_tvfd6: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_gknc2/ucx_tvfd6
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_gknc2.ucx_tvfd6', metastore_id=None, name='ucx_tvfd6', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052808'}, row_filter=None, schema_name='migrate_gknc2', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_gknc2.ucx_tinih', view_dependencies=None)
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added catalog fixture: CatalogInfo(browse_only=False, catalog_type=, comment='', connection_name=None, created_at=1716879810444, created_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cxjcm', isolation_mode=, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='ucx_cxjcm', options=None, owner='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', properties=None, provider_name=None, provisioning_info=None, securable_kind=, securable_type='CATALOG', share_name=None, storage_location=None, storage_root=None, updated_at=1716879810444, updated_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d')
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema ucx_cxjcm.migrate_gknc2: https://DATABRICKS_HOST/explore/data/ucx_cxjcm/migrate_gknc2
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='ucx_cxjcm', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cxjcm.migrate_gknc2', metastore_id=None, name='migrate_gknc2', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema hive_metastore.ucx_snt1f: https://DATABRICKS_HOST/explore/data/hive_metastore/ucx_snt1f
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None,
full_name='hive_metastore.ucx_snt1f', metastore_id=None, name='ucx_snt1f', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
[gw6] linux -- Python 3.10.14 /home/runner/work/ucx/ucx/.venv/bin/python
07:03 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.CUkT/config.yml) doesn't exist.
07:03 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
07:03 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
07:03 INFO [databricks.labs.ucx.install] Fetching installations...
07:03 WARNING [databricks.labs.ucx.install] Existing installation at /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.CUkT is corrupted. Skipping...
07:03 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
07:03 DEBUG [tests.integration.conftest] Waiting for clusters to start...
07:13 DEBUG [tests.integration.conftest] Waiting for clusters to start...
07:13 INFO [databricks.labs.ucx.install] Installing UCX v0.24.1+220240528071351
07:13 INFO [databricks.labs.ucx.install] Creating ucx schemas...
07:14 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
07:14 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
07:14 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=experimental-workflow-linter
07:14 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
07:14 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
07:14 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
07:14 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
07:14 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
07:14 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
07:14 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
07:14 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
07:14 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
07:14 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-experimental
07:14 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.CUkT/README for the next steps.
07:14 DEBUG [databricks.labs.ucx.installer.workflows] starting migrate-tables job: https://DATABRICKS_HOST#job/475113542578518
07:14 DEBUG [databricks.labs.ucx.installer.workflows] starting migrate-tables job: https://DATABRICKS_HOST#job/475113542578518 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 make_storage_dir fixtures 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 1 make_dbfs_data_copy fixtures 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] removing make_dbfs_data_copy fixture: dbfs:/mnt/TEST_MOUNT_NAME/a/b/Tfu3 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 cluster fixtures 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 1 catalog fixtures 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] removing catalog fixture: CatalogInfo(browse_only=False, catalog_type=, comment='', connection_name=None, created_at=1716879810444, created_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cxjcm', isolation_mode=, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='ucx_cxjcm', options=None, owner='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', properties=None, provider_name=None, provisioning_info=None, securable_kind=, securable_type='CATALOG', share_name=None, storage_location=None, storage_root=None, updated_at=1716879810444, updated_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d') 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] ignoring error while catalog CatalogInfo(browse_only=False, catalog_type=, comment='', connection_name=None, created_at=1716879810444, created_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cxjcm', isolation_mode=, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='ucx_cxjcm', options=None, owner='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', properties=None, provider_name=None, provisioning_info=None, securable_kind=, securable_type='CATALOG', share_name=None, storage_location=None, storage_root=None, updated_at=1716879810444, 
updated_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d') teardown: Catalog 'ucx_cxjcm' does not exist. 07:18 INFO [databricks.labs.ucx.install] Deleting UCX v0.24.1+220240528071351 from https://DATABRICKS_HOST 07:18 INFO [databricks.labs.ucx.install] Deleting inventory database ucx_snt1f 07:18 INFO [databricks.labs.ucx.install] Deleting jobs 07:18 INFO [databricks.labs.ucx.install] Deleting migrate-tables job_id=475113542578518. 07:18 INFO [databricks.labs.ucx.install] Deleting remove-workspace-local-backup-groups job_id=433538973573675. 07:18 INFO [databricks.labs.ucx.install] Deleting experimental-workflow-linter job_id=266063595735836. 07:18 INFO [databricks.labs.ucx.install] Deleting migrate-data-reconciliation job_id=457337416506276. 07:18 INFO [databricks.labs.ucx.install] Deleting assessment job_id=1033182039975373. 07:18 INFO [databricks.labs.ucx.install] Deleting migrate-tables-in-mounts-experimental job_id=1058593181788079. 07:18 INFO [databricks.labs.ucx.install] Deleting migrate-groups job_id=516390867353513. 07:18 INFO [databricks.labs.ucx.install] Deleting failing job_id=545417343379431. 07:18 INFO [databricks.labs.ucx.install] Deleting validate-groups-permissions job_id=408697137370809. 07:18 INFO [databricks.labs.ucx.install] Deleting migrate-external-hiveserde-tables-in-place-experimental job_id=247472386583768. 07:18 INFO [databricks.labs.ucx.install] Deleting scan-tables-in-mounts-experimental job_id=284476640916771. 07:18 INFO [databricks.labs.ucx.install] Deleting migrate-external-tables-ctas job_id=922131577793587. 07:18 INFO [databricks.labs.ucx.install] Deleting migrate-groups-experimental job_id=462693950985443. 
07:18 INFO [databricks.labs.ucx.install] Deleting cluster policy 07:18 ERROR [databricks.labs.ucx.install] UCX Policy already deleted 07:18 INFO [databricks.labs.ucx.install] Deleting secret scope 07:18 INFO [databricks.labs.ucx.install] UnInstalling UCX complete 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 workspace user fixtures 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 account group fixtures 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 workspace group fixtures 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 table fixtures 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 5 table fixtures 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_gknc2.ucx_tveso', metastore_id=None, name='ucx_tveso', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052808'}, row_filter=None, schema_name='migrate_gknc2', sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/migrate_gknc2/ucx_tveso', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, 
full_name='hive_metastore.migrate_gknc2.ucx_ttt0w', metastore_id=None, name='ucx_ttt0w', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052808'}, row_filter=None, schema_name='migrate_gknc2', sql_path=None, storage_credential_name=None, storage_location='dbfs:/tmp/ucx_test_kyHn', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_gknc2.ucx_tozui', metastore_id=None, name='ucx_tozui', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052808'}, row_filter=None, schema_name='migrate_gknc2', sql_path=None, storage_credential_name=None, storage_location='dbfs:/mnt/TEST_MOUNT_NAME/a/b/Tfu3', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_gknc2.ucx_tinih', metastore_id=None, name='ucx_tinih', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052808'}, row_filter=None, schema_name='migrate_gknc2', sql_path=None, storage_credential_name=None, 
storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_gknc2.ucx_tveso', view_dependencies=None) 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_gknc2.ucx_tvfd6', metastore_id=None, name='ucx_tvfd6', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052808'}, row_filter=None, schema_name='migrate_gknc2', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_gknc2.ucx_tinih', view_dependencies=None) 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 3 schema fixtures 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.migrate_gknc2', metastore_id=None, name='migrate_gknc2', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='ucx_cxjcm', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cxjcm.migrate_gknc2', metastore_id=None, 
name='migrate_gknc2', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) 07:18 DEBUG [databricks.labs.ucx.mixins.fixtures] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.ucx_snt1f', metastore_id=None, name='ucx_snt1f', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) [gw6] linux -- Python 3.10.14 /home/runner/work/ucx/ucx/.venv/bin/python ```

Running from nightly #74
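For context, the assertion in the log above reduces to checking that the migrate-tables workflow recorded at least one migration status. A minimal sketch of that pattern (the function name and the plain list are stand-ins, not the actual UCX test code, which reads statuses from the inventory database):

```python
# Sketch of the failing check: the test expects the migrate-tables job
# to have produced at least one migration status, but the result was empty,
# so `assert 0 > 0` failed with "No migration statuses found".
def check_migration_statuses(statuses: list) -> None:
    assert len(statuses) > 0, "No migration statuses found"

try:
    check_migration_statuses([])  # what the test observed: an empty result
except AssertionError as err:
    print(err)  # prints: No migration statuses found
```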

ericvergnaud commented 5 months ago

This seems to be caused by the watchdog tampering with the backend. A tentative fix is in progress; see https://github.com/databrickslabs/watchdog/pull/39

github-actions[bot] commented 5 months ago
❌ test_table_migration_job_refreshes_migration_status[regular-migrate-tables]: databricks.labs.blueprint.parallel.ManyError: Detected 4 failures: Unknown: migrate_dbfs_root_delta_tables: Py4JError: An error occurred while calling o431.mounts. Trace: (14m17.558s) ``` databricks.labs.blueprint.parallel.ManyError: Detected 4 failures: Unknown: migrate_dbfs_root_delta_tables: Py4JError: An error occurred while calling o431.mounts. Trace: py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473) at py4j.Gateway.invoke(Gateway.java:305) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199) at py4j.ClientServerConnection.run(ClientServerConnection.java:119) at java.lang.Thread.run(Thread.java:750), Unknown: migrate_dbfs_root_non_delta_tables: Py4JError: An error occurred while calling o431.mounts. 
Trace: py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473) at py4j.Gateway.invoke(Gateway.java:305) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199) at py4j.ClientServerConnection.run(ClientServerConnection.java:119) at java.lang.Thread.run(Thread.java:750), Unknown: migrate_external_tables_sync: Py4JError: An error occurred while calling o431.mounts. Trace: py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473) at py4j.Gateway.invoke(Gateway.java:305) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199) at py4j.ClientServerConnection.run(ClientServerConnection.java:119) at java.lang.Thread.run(Thread.java:750), Unknown: parse_logs: run failed with error message Cluster 'DATABRICKS_CLUSTER_ID' was terminated. Reason: USER_REQUEST (SUCCESS). Parameters: username:0a330eb5-dd51-4d97-b6e4-c474356b1d5d. 
07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added make_dbfs_data_copy fixture: dbfs:/mnt/TEST_MOUNT_NAME/a/b/ceyw 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema hive_metastore.migrate_jnvkp: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.migrate_jnvkp', metastore_id=None, name='migrate_jnvkp', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_jnvkp.ucx_tpi07: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_tpi07 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_tpi07', metastore_id=None, name='ucx_tpi07', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/migrate_jnvkp/ucx_tpi07', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_jnvkp.ucx_t3h9e: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_t3h9e 07:03 DEBUG 
[databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_t3h9e', metastore_id=None, name='ucx_t3h9e', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location='dbfs:/tmp/ucx_test_gbYm', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_jnvkp.ucx_twraa: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_twraa 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_twraa', metastore_id=None, name='ucx_twraa', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location='dbfs:/mnt/TEST_MOUNT_NAME/a/b/ceyw', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_jnvkp.ucx_t14l3: 
https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_t14l3 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_t14l3', metastore_id=None, name='ucx_t14l3', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_jnvkp.ucx_tpi07', view_dependencies=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_jnvkp.ucx_t7mog: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_t7mog 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_t7mog', metastore_id=None, name='ucx_t7mog', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM 
hive_metastore.migrate_jnvkp.ucx_t14l3', view_dependencies=None) 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added catalog fixture: CatalogInfo(browse_only=False, catalog_type=, comment='', connection_name=None, created_at=1716966219661, created_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_csp3b', isolation_mode=, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='ucx_csp3b', options=None, owner='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', properties=None, provider_name=None, provisioning_info=None, securable_kind=, securable_type='CATALOG', share_name=None, storage_location=None, storage_root=None, updated_at=1716966219661, updated_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d') 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema ucx_csp3b.migrate_jnvkp: https://DATABRICKS_HOST/explore/data/ucx_csp3b/migrate_jnvkp 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='ucx_csp3b', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_csp3b.migrate_jnvkp', metastore_id=None, name='migrate_jnvkp', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema hive_metastore.ucx_soysi: https://DATABRICKS_HOST/explore/data/hive_metastore/ucx_soysi 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.ucx_soysi', metastore_id=None, name='ucx_soysi', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, 
updated_at=None, updated_by=None) [gw4] linux -- Python 3.10.14 /home/runner/work/ucx/ucx/.venv/bin/python 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added make_dbfs_data_copy fixture: dbfs:/mnt/TEST_MOUNT_NAME/a/b/ceyw 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema hive_metastore.migrate_jnvkp: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.migrate_jnvkp', metastore_id=None, name='migrate_jnvkp', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_jnvkp.ucx_tpi07: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_tpi07 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_tpi07', metastore_id=None, name='ucx_tpi07', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/migrate_jnvkp/ucx_tpi07', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table 
hive_metastore.migrate_jnvkp.ucx_t3h9e: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_t3h9e 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_t3h9e', metastore_id=None, name='ucx_t3h9e', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location='dbfs:/tmp/ucx_test_gbYm', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_jnvkp.ucx_twraa: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_twraa 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_twraa', metastore_id=None, name='ucx_twraa', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location='dbfs:/mnt/TEST_MOUNT_NAME/a/b/ceyw', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, 
view_dependencies=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_jnvkp.ucx_t14l3: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_t14l3 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_t14l3', metastore_id=None, name='ucx_t14l3', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_jnvkp.ucx_tpi07', view_dependencies=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_jnvkp.ucx_t7mog: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_t7mog 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_t7mog', metastore_id=None, name='ucx_t7mog', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, 
table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_jnvkp.ucx_t14l3', view_dependencies=None) 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added catalog fixture: CatalogInfo(browse_only=False, catalog_type=, comment='', connection_name=None, created_at=1716966219661, created_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_csp3b', isolation_mode=, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='ucx_csp3b', options=None, owner='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', properties=None, provider_name=None, provisioning_info=None, securable_kind=, securable_type='CATALOG', share_name=None, storage_location=None, storage_root=None, updated_at=1716966219661, updated_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d') 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema ucx_csp3b.migrate_jnvkp: https://DATABRICKS_HOST/explore/data/ucx_csp3b/migrate_jnvkp 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='ucx_csp3b', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_csp3b.migrate_jnvkp', metastore_id=None, name='migrate_jnvkp', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema hive_metastore.ucx_soysi: https://DATABRICKS_HOST/explore/data/hive_metastore/ucx_soysi 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.ucx_soysi', metastore_id=None, name='ucx_soysi', 
owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
07:03 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/config.yml) doesn't exist.
07:03 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
07:03 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
07:03 INFO [databricks.labs.ucx.install] Fetching installations...
07:04 WARNING [databricks.labs.ucx.install] Existing installation at /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI is corrupted. Skipping...
07:04 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
07:04 DEBUG [tests.integration.conftest] Waiting for clusters to start...
07:12 DEBUG [tests.integration.conftest] Waiting for clusters to start...
07:12 INFO [databricks.labs.ucx.install] Installing UCX v0.24.1+520240529071241
07:12 INFO [databricks.labs.ucx.install] Creating ucx schemas...
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-experimental
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=experimental-workflow-linter
07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
07:12 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/README for the next steps.
07:12 DEBUG [databricks.labs.ucx.installer.workflows] starting migrate-tables job: https://DATABRICKS_HOST#job/353371736950474 07:17 INFO [databricks.labs.ucx.installer.workflows] ---------- REMOTE LOGS -------------- 07:17 INFO [databricks.labs.ucx:migrate_external_tables_sync] UCX v0.24.1+520240529071241 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/logs/migrate-tables/run-786487235418832-0/migrate_external_tables_sync.log 07:17 DEBUG [databricks.sdk:migrate_external_tables_sync] GET /api/2.0/clusters/list < 200 OK < { < "clusters": [ < { < "autotermination_minutes": 60, < "CLOUD_ENV_attributes": { < "availability": "SPOT_WITH_FALLBACK_AZURE", < "first_on_demand": 2147483647, < "spot_bid_max_price": -1.0 < }, < "cluster_cores": 8.0, < "cluster_id": "TEST_EXT_HMS_CLUSTER_ID", < "cluster_memory_mb": 32768, < "cluster_name": "External Metastore", < "cluster_source": "UI", < "creator_user_name": "serge.smertin@databricks.com", < "data_security_mode": "USER_ISOLATION", < "TEST_SCHEMA_tags": { < "Budget": "opex.sales.labs", < "ClusterId": "TEST_EXT_HMS_CLUSTER_ID", < "ClusterName": "External Metastore", < "Creator": "serge.smertin@databricks.com", < "DatabricksInstanceGroupId": "-6693343645136663331", < "DatabricksInstancePoolCreatorId": "4183391249163402", < "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID", < "Owner": "labs-oss@databricks.com", < "Vendor": "Databricks" < }, < "disk_spec": {}, < "driver": { < "host_private_ip": "10.139.0.17", < "instance_id": "416a094689384e9b8eb4ac2d90434719", < "node_attributes": { < "is_spot": false < }, < "node_id": "081f050584b74828aa502ace88dce94f", < "private_ip": "10.139.64.17", < "public_dns": "104.209.188.220", < "start_timestamp": 1716966580624 < }, < "driver_healthy": true, < "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID", < "driver_instance_source": { < "instance_pool_id": "TEST_INSTANCE_POOL_ID" < }, < "driver_node_type_id": "Standard_D4s_v3", < 
"effective_spark_version": "15.1.x-scala2.12", < "enable_elastic_disk": true, < "enable_local_disk_encryption": false, < "executors": [ < { < "host_private_ip": "10.139.0.24", < "instance_id": "e8b9112441f04b02a29889d7a0b4b261", < "node_attributes": { < "is_spot": false < }, < "node_id": "9e0ed3e7b912431e84b9b9edb2651ae0", < "private_ip": "10.139.64.24", < "public_dns": "104.209.177.152", < "start_timestamp": 1716966580581 < } < ], < "init_scripts_safe_mode": false, < "instance_pool_id": "TEST_INSTANCE_POOL_ID", < "instance_source": { < "instance_pool_id": "TEST_INSTANCE_POOL_ID" < }, < "jdbc_port": 10000, < "last_activity_time": 1716966783501, < "last_restarted_time": 1716966763471, < "last_state_loss_time": 1716966763410, < "node_type_id": "Standard_D4s_v3", < "num_workers": 1, < "pinned_by_user_name": "4183391249163402", < "policy_id": "000E138775A879A0", < "spark_conf": { < "datanucleus.autoCreateSchema": "true", < "datanucleus.fixedDatastore": "true", < "spark.hadoop.javax.jdo.option.ConnectionDriverName": "com.microsoft.sqlserver.jdbc.SQLServerDriver", < "spark.hadoop.javax.jdo.option.ConnectionPassword": "{{secrets/external_metastore/external_metastore_password}}", < "spark.hadoop.javax.jdo.option.ConnectionURL": "{{secrets/external_metastore/external_metastore_url}}", < "spark.hadoop.javax.jdo.option.ConnectionUserName": "{{secrets/external_metastore/external_metastore_user}}", < "spark.hadoop.metastore.catalog.TEST_SCHEMA": "hive", < "spark.sql.hive.metastore.jars": "maven", < "spark.sql.hive.metastore.version": "3.1.0" < }, < "spark_context_id": 7713603744641915067, < "spark_version": "15.1.x-scala2.12", < "start_time": 1713528456326, < "state": "RUNNING", < "state_message": "" < }, < "... 
(8 additional elements)" < ] < } 07:17 DEBUG [databricks.sdk:migrate_external_tables_sync] GET /api/2.1/unity-catalog/external-locations < 200 OK < { < "external_locations": [ < { < "created_at": 1712317369786, < "created_by": "serge.smertin@databricks.com", < "credential_id": "462bd121-a3ff-4f51-899f-236868f3d2ab", < "credential_name": "TEST_STORAGE_CREDENTIAL", < "full_name": "TEST_A_LOCATION", < "id": "98c25265-fb0f-4b15-a727-63855b7f78a7", < "isolation_mode": "ISOLATION_MODE_OPEN", < "metastore_id": "8952c1e3-b265-4adf-98c3-6f755e2e1453", < "name": "TEST_A_LOCATION", < "owner": "labs.scope.account-admin", < "read_only": false, < "securable_kind": "EXTERNAL_LOCATION_STANDARD", < "securable_type": "EXTERNAL_LOCATION", < "updated_at": 1712566812808, < "updated_by": "serge.smertin@databricks.com", < "url": "TEST_MOUNT_CONTAINER/a" < }, < "... (3 additional elements)" < ] < } 07:17 DEBUG [databricks.labs.blueprint.installation:migrate_external_tables_sync] Loading list from CLOUD_ENV_storage_account_info.csv 07:17 DEBUG [databricks.sdk:migrate_external_tables_sync] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/CLOUD_ENV_storage_account_info.csv&direct_download=true < 404 Not Found < { < "error_code": "RESOURCE_DOES_NOT_EXIST", < "message": "Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/CLOUD_ENV_storage_account_info.csv) doesn't ... 
(6 more bytes)" < } 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.grants] fetching grants inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.grants 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.grants] crawling new batch for grants 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.tables] fetching tables inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.tables 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.tables] crawling new batch for tables 07:17 DEBUG [databricks.labs.ucx.hive_metastore.tables:migrate_external_tables_sync] [hive_metastore.migrate_jnvkp] listing tables 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SHOW TABLES FROM hive_metastore.migrate_jnvkp 07:17 WARNING [databricks.labs.ucx.hive_metastore.tables:migrate_external_tables_sync] Schema hive_metastore.migrate_jnvkp no longer existed 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.tables] found 0 new records for tables 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.tables (catalog STRING NOT NULL, database ST... 
(222 more bytes) 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.udfs] fetching udfs inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.udfs 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.udfs] crawling new batch for udfs 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][execute] USE CATALOG hive_metastore; 07:17 DEBUG [databricks.labs.ucx.hive_metastore.udfs:migrate_external_tables_sync] [hive_metastore.migrate_jnvkp] listing udfs 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SHOW USER FUNCTIONS FROM hive_metastore.migrate_jnvkp; 07:17 WARNING [databricks.labs.ucx.hive_metastore.udfs:migrate_external_tables_sync] Schema hive_metastore.migrate_jnvkp no longer existed 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.udfs] found 0 new records for udfs 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.udfs (catalog STRING NOT NULL, database STRI... 
(288 more bytes) 07:17 DEBUG [databricks.labs.blueprint.parallel:migrate_external_tables_sync] Starting 4 tasks in 8 threads 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SHOW GRANTS ON CATALOG hive_metastore 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SHOW GRANTS ON ANY FILE 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SHOW GRANTS ON ANONYMOUS FUNCTION 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SHOW GRANTS ON DATABASE hive_metastore.migrate_jnvkp 07:17 ERROR [databricks.labs.ucx.hive_metastore.grants:migrate_external_tables_sync] Couldn't fetch grants for object DATABASE hive_metastore.migrate_jnvkp: (org.apache.spark.SparkSecurityException) Database(migrate_jnvkp,Some(hive_metastore)) does not exist. JVM stacktrace: org.apache.spark.SparkSecurityException at com.databricks.sql.acl.AclCommand.$anonfun$mapIfExists$1(commands.scala:79) at scala.Option.getOrElse(Option.scala:189) at com.databricks.sql.acl.AclCommand.mapIfExists(commands.scala:79) at com.databricks.sql.acl.AclCommand.mapIfExists$(commands.scala:75) at com.databricks.sql.acl.ShowPermissionsCommand.mapIfExists(commands.scala:226) at com.databricks.sql.acl.ShowPermissionsCommand.run(commands.scala:244) at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$2(commands.scala:84) at org.apache.spark.sql.execution.SparkPlan.runCommandWithAetherOff(SparkPlan.scala:180) at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:191) at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:84) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94) at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:81) at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:80) at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:94) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$5(QueryExecution.scala:375) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$4(QueryExecution.scala:375) at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:166) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$3(QueryExecution.scala:375) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$9(SQLExecution.scala:386) at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:669) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:275) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175) at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:162) at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:606) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:371) at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1098) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:367) at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:318) at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:364) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:340) at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:477) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:83) at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:477) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:40) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:343) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:339) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40) at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:453) at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:340) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:400) at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:340) at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:277) at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:274) at org.apache.spark.sql.Dataset.(Dataset.scala:310) at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:131) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175) at 
org.apache.spark.sql.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1182) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94) at org.apache.spark.sql.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1182) at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:122) at org.apache.spark.sql.SparkSession.$anonfun$sql$4(SparkSession.scala:954) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175) at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:942) at org.apache.spark.sql.connect.planner.SparkConnectPlanner.handleSqlCommand(SparkConnectPlanner.scala:2700) at org.apache.spark.sql.connect.planner.SparkConnectPlanner.process(SparkConnectPlanner.scala:2653) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.handleCommand(ExecuteThreadRunner.scala:307) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.$anonfun$executeInternal$1(ExecuteThreadRunner.scala:233) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.$anonfun$executeInternal$1$adapted(ExecuteThreadRunner.scala:169) at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withSession$2(SessionHolder.scala:350) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175) at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withSession$1(SessionHolder.scala:350) at org.apache.spark.JobArtifactSet$.withActiveJobArtifactState(JobArtifactSet.scala:97) at org.apache.spark.sql.artifact.ArtifactManager.$anonfun$withResources$1(ArtifactManager.scala:84) at org.apache.spark.util.Utils$.withContextClassLoader(Utils.scala:234) at org.apache.spark.sql.artifact.ArtifactManager.withResources(ArtifactManager.scala:83) at org.apache.spark.sql.connect.service.SessionHolder.withSession(SessionHolder.scala:349) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.executeInternal(ExecuteThreadRunner.scala:169) at 
org.apache.spark.sql.connect.execution.ExecuteThreadRunner.org$apache$spark$sql$connect$execution$ExecuteThreadRunner$$execute(ExecuteThreadRunner.scala:119) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner$ExecutionThread.$anonfun$run$1(ExecuteThreadRunner.scala:511) at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:45) at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:103) at com.databricks.unity.HandleImpl.$anonfun$runWithAndClose$1(UCSHandle.scala:108) at scala.util.Using$.resource(Using.scala:269) at com.databricks.unity.HandleImpl.runWithAndClose(UCSHandle.scala:107) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner$ExecutionThread.run(ExecuteThreadRunner.scala:510) 07:17 INFO [databricks.labs.ucx:migrate_dbfs_root_non_delta_tables] UCX v0.24.1+520240529071241 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/logs/migrate-tables/run-786487235418832-0/migrate_dbfs_root_non_delta_tables.log 07:17 DEBUG [databricks.sdk:migrate_dbfs_root_non_delta_tables] GET /api/2.0/clusters/list < 200 OK < { < "clusters": [ < { < "autotermination_minutes": 60, < "CLOUD_ENV_attributes": { < "availability": "SPOT_WITH_FALLBACK_AZURE", < "first_on_demand": 2147483647, < "spot_bid_max_price": -1.0 < }, < "cluster_cores": 8.0, < "cluster_id": "TEST_EXT_HMS_CLUSTER_ID", < "cluster_memory_mb": 32768, < "cluster_name": "External Metastore", < "cluster_source": "UI", < "creator_user_name": "serge.smertin@databricks.com", < "data_security_mode": "USER_ISOLATION", < "TEST_SCHEMA_tags": { < "Budget": "opex.sales.labs", < "ClusterId": "TEST_EXT_HMS_CLUSTER_ID", < "ClusterName": "External Metastore", < "Creator": "serge.smertin@databricks.com", < "DatabricksInstanceGroupId": "-6693343645136663331", < "DatabricksInstancePoolCreatorId": "4183391249163402", < "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID", < "Owner": "labs-oss@databricks.com", < "Vendor": "Databricks" < }, < 
"disk_spec": {}, < "driver": { < "host_private_ip": "10.139.0.17", < "instance_id": "416a094689384e9b8eb4ac2d90434719", < "node_attributes": { < "is_spot": false < }, < "node_id": "081f050584b74828aa502ace88dce94f", < "private_ip": "10.139.64.17", < "public_dns": "104.209.188.220", < "start_timestamp": 1716966580624 < }, < "driver_healthy": true, < "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID", < "driver_instance_source": { < "instance_pool_id": "TEST_INSTANCE_POOL_ID" < }, < "driver_node_type_id": "Standard_D4s_v3", < "effective_spark_version": "15.1.x-scala2.12", < "enable_elastic_disk": true, < "enable_local_disk_encryption": false, < "executors": [ < { < "host_private_ip": "10.139.0.24", < "instance_id": "e8b9112441f04b02a29889d7a0b4b261", < "node_attributes": { < "is_spot": false < }, < "node_id": "9e0ed3e7b912431e84b9b9edb2651ae0", < "private_ip": "10.139.64.24", < "public_dns": "104.209.177.152", < "start_timestamp": 1716966580581 < } < ], < "init_scripts_safe_mode": false, < "instance_pool_id": "TEST_INSTANCE_POOL_ID", < "instance_source": { < "instance_pool_id": "TEST_INSTANCE_POOL_ID" < }, < "jdbc_port": 10000, < "last_activity_time": 1716966783501, < "last_restarted_time": 1716966763471, < "last_state_loss_time": 1716966763410, < "node_type_id": "Standard_D4s_v3", < "num_workers": 1, < "pinned_by_user_name": "4183391249163402", < "policy_id": "000E138775A879A0", < "spark_conf": { < "datanucleus.autoCreateSchema": "true", < "datanucleus.fixedDatastore": "true", < "spark.hadoop.javax.jdo.option.ConnectionDriverName": "com.microsoft.sqlserver.jdbc.SQLServerDriver", < "spark.hadoop.javax.jdo.option.ConnectionPassword": "{{secrets/external_metastore/external_metastore_password}}", < "spark.hadoop.javax.jdo.option.ConnectionURL": "{{secrets/external_metastore/external_metastore_url}}", < "spark.hadoop.javax.jdo.option.ConnectionUserName": "{{secrets/external_metastore/external_metastore_user}}", < "spark.hadoop.metastore.catalog.TEST_SCHEMA": "hive", < 
"spark.sql.hive.metastore.jars": "maven", < "spark.sql.hive.metastore.version": "3.1.0" < }, < "spark_context_id": 7713603744641915067, < "spark_version": "15.1.x-scala2.12", < "start_time": 1713528456326, < "state": "RUNNING", < "state_message": "" < }, < "... (8 additional elements)" < ] < } 07:17 DEBUG [databricks.sdk:migrate_dbfs_root_non_delta_tables] GET /api/2.1/unity-catalog/external-locations < 200 OK < { < "external_locations": [ < { < "created_at": 1712317369786, < "created_by": "serge.smertin@databricks.com", < "credential_id": "462bd121-a3ff-4f51-899f-236868f3d2ab", < "credential_name": "TEST_STORAGE_CREDENTIAL", < "full_name": "TEST_A_LOCATION", < "id": "98c25265-fb0f-4b15-a727-63855b7f78a7", < "isolation_mode": "ISOLATION_MODE_OPEN", < "metastore_id": "8952c1e3-b265-4adf-98c3-6f755e2e1453", < "name": "TEST_A_LOCATION", < "owner": "labs.scope.account-admin", < "read_only": false, < "securable_kind": "EXTERNAL_LOCATION_STANDARD", < "securable_type": "EXTERNAL_LOCATION", < "updated_at": 1712566812808, < "updated_by": "serge.smertin@databricks.com", < "url": "TEST_MOUNT_CONTAINER/a" < }, < "... (3 additional elements)" < ] < } 07:17 DEBUG [databricks.labs.blueprint.installation:migrate_dbfs_root_non_delta_tables] Loading list from CLOUD_ENV_storage_account_info.csv 07:17 DEBUG [databricks.sdk:migrate_dbfs_root_non_delta_tables] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/CLOUD_ENV_storage_account_info.csv&direct_download=true < 404 Not Found < { < "error_code": "RESOURCE_DOES_NOT_EXIST", < "message": "Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/CLOUD_ENV_storage_account_info.csv) doesn't ... 
(6 more bytes)" < } 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.grants] fetching grants inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.grants 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.grants] crawling new batch for grants 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.tables] fetching tables inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.tables 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.tables] crawling new batch for tables 07:17 DEBUG [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_non_delta_tables] [hive_metastore.migrate_jnvkp] listing tables 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SHOW TABLES FROM hive_metastore.migrate_jnvkp 07:17 WARNING [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_non_delta_tables] Schema hive_metastore.migrate_jnvkp no longer existed 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.tables] found 0 new records for tables 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.tables (catalog STRING NOT NULL, database ST... 
(222 more bytes) 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.udfs] fetching udfs inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.udfs 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.udfs] crawling new batch for udfs 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][execute] USE CATALOG hive_metastore; 07:17 DEBUG [databricks.labs.ucx.hive_metastore.udfs:migrate_dbfs_root_non_delta_tables] [hive_metastore.migrate_jnvkp] listing udfs 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SHOW USER FUNCTIONS FROM hive_metastore.migrate_jnvkp; 07:17 WARNING [databricks.labs.ucx.hive_metastore.udfs:migrate_dbfs_root_non_delta_tables] Schema hive_metastore.migrate_jnvkp no longer existed 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.udfs] found 0 new records for udfs 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.udfs (catalog STRING NOT NULL, database STRI... 
(288 more bytes) 07:17 DEBUG [databricks.labs.blueprint.parallel:migrate_dbfs_root_non_delta_tables] Starting 4 tasks in 8 threads 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SHOW GRANTS ON CATALOG hive_metastore 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SHOW GRANTS ON ANY FILE 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SHOW GRANTS ON DATABASE hive_metastore.migrate_jnvkp 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SHOW GRANTS ON ANONYMOUS FUNCTION 07:17 ERROR [databricks.labs.ucx.hive_metastore.grants:migrate_dbfs_root_non_delta_tables] Couldn't fetch grants for object DATABASE hive_metastore.migrate_jnvkp: (org.apache.spark.SparkSecurityException) Database(migrate_jnvkp,Some(hive_metastore)) does not exist. JVM stacktrace: org.apache.spark.SparkSecurityException at com.databricks.sql.acl.AclCommand.$anonfun$mapIfExists$1(commands.scala:79) at scala.Option.getOrElse(Option.scala:189) at com.databricks.sql.acl.AclCommand.mapIfExists(commands.scala:79) at com.databricks.sql.acl.AclCommand.mapIfExists$(commands.scala:75) at com.databricks.sql.acl.ShowPermissionsCommand.mapIfExists(commands.scala:226) at com.databricks.sql.acl.ShowPermissionsCommand.run(commands.scala:244) at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$2(commands.scala:84) at org.apache.spark.sql.execution.SparkPlan.runCommandWithAetherOff(SparkPlan.scala:180) at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:191) at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:84) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94) at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:81) at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:80) at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:94) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$5(QueryExecution.scala:375) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$4(QueryExecution.scala:375) at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:166) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$3(QueryExecution.scala:375) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$9(SQLExecution.scala:386) at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:669) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:275) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175) at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:162) at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:606) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:371) at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1098) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:367) at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:318) at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:364) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:340) at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:477) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:83) at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:477) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:40) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:343) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:339) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40) at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:453) at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:340) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:400) at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:340) at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:277) at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:274) at org.apache.spark.sql.Dataset.(Dataset.scala:310) at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:131) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175) at 
org.apache.spark.sql.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1182) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94) at org.apache.spark.sql.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1182) at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:122) at org.apache.spark.sql.SparkSession.$anonfun$sql$4(SparkSession.scala:954) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175) at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:942) at org.apache.spark.sql.connect.planner.SparkConnectPlanner.handleSqlCommand(SparkConnectPlanner.scala:2700) at org.apache.spark.sql.connect.planner.SparkConnectPlanner.process(SparkConnectPlanner.scala:2653) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.handleCommand(ExecuteThreadRunner.scala:307) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.$anonfun$executeInternal$1(ExecuteThreadRunner.scala:233) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.$anonfun$executeInternal$1$adapted(ExecuteThreadRunner.scala:169) at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withSession$2(SessionHolder.scala:350) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175) at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withSession$1(SessionHolder.scala:350) at org.apache.spark.JobArtifactSet$.withActiveJobArtifactState(JobArtifactSet.scala:97) at org.apache.spark.sql.artifact.ArtifactManager.$anonfun$withResources$1(ArtifactManager.scala:84) at org.apache.spark.util.Utils$.withContextClassLoader(Utils.scala:234) at org.apache.spark.sql.artifact.ArtifactManager.withResources(ArtifactManager.scala:83) at org.apache.spark.sql.connect.service.SessionHolder.withSession(SessionHolder.scala:349) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.executeInternal(ExecuteThreadRunner.scala:169) at 
org.apache.spark.sql.connect.execution.ExecuteThreadRunner.org$apache$spark$sql$connect$execution$ExecuteThreadRunner$$execute(ExecuteThreadRunner.scala:119) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner$ExecutionThread.$anonfun$run$1(ExecuteThreadRunner.scala:511) at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:45) at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:103) at com.databricks.unity.HandleImpl.$anonfun$runWithAndClose$1(UCSHandle.scala:108) at scala.util.Using$.resource(Using.scala:269) at com.databricks.unity.HandleImpl.runWithAndClose(UCSHandle.scala:107) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner$ExecutionThread.run(ExecuteThreadRunner.scala:510) 07:17 INFO [databricks.labs.ucx:migrate_dbfs_root_delta_tables] UCX v0.24.1+520240529071241 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/logs/migrate-tables/run-786487235418832-0/migrate_dbfs_root_delta_tables.log 07:17 DEBUG [databricks.sdk:migrate_dbfs_root_delta_tables] GET /api/2.0/clusters/list < 200 OK < { < "clusters": [ < { < "autotermination_minutes": 60, < "CLOUD_ENV_attributes": { < "availability": "SPOT_WITH_FALLBACK_AZURE", < "first_on_demand": 2147483647, < "spot_bid_max_price": -1.0 < }, < "cluster_cores": 8.0, < "cluster_id": "TEST_EXT_HMS_CLUSTER_ID", < "cluster_memory_mb": 32768, < "cluster_name": "External Metastore", < "cluster_source": "UI", < "creator_user_name": "serge.smertin@databricks.com", < "data_security_mode": "USER_ISOLATION", < "TEST_SCHEMA_tags": { < "Budget": "opex.sales.labs", < "ClusterId": "TEST_EXT_HMS_CLUSTER_ID", < "ClusterName": "External Metastore", < "Creator": "serge.smertin@databricks.com", < "DatabricksInstanceGroupId": "-6693343645136663331", < "DatabricksInstancePoolCreatorId": "4183391249163402", < "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID", < "Owner": "labs-oss@databricks.com", < "Vendor": "Databricks" < }, < "disk_spec": 
{}, < "driver": { < "host_private_ip": "10.139.0.17", < "instance_id": "416a094689384e9b8eb4ac2d90434719", < "node_attributes": { < "is_spot": false < }, < "node_id": "081f050584b74828aa502ace88dce94f", < "private_ip": "10.139.64.17", < "public_dns": "104.209.188.220", < "start_timestamp": 1716966580624 < }, < "driver_healthy": true, < "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID", < "driver_instance_source": { < "instance_pool_id": "TEST_INSTANCE_POOL_ID" < }, < "driver_node_type_id": "Standard_D4s_v3", < "effective_spark_version": "15.1.x-scala2.12", < "enable_elastic_disk": true, < "enable_local_disk_encryption": false, < "executors": [ < { < "host_private_ip": "10.139.0.24", < "instance_id": "e8b9112441f04b02a29889d7a0b4b261", < "node_attributes": { < "is_spot": false < }, < "node_id": "9e0ed3e7b912431e84b9b9edb2651ae0", < "private_ip": "10.139.64.24", < "public_dns": "104.209.177.152", < "start_timestamp": 1716966580581 < } < ], < "init_scripts_safe_mode": false, < "instance_pool_id": "TEST_INSTANCE_POOL_ID", < "instance_source": { < "instance_pool_id": "TEST_INSTANCE_POOL_ID" < }, < "jdbc_port": 10000, < "last_activity_time": 1716966783501, < "last_restarted_time": 1716966763471, < "last_state_loss_time": 1716966763410, < "node_type_id": "Standard_D4s_v3", < "num_workers": 1, < "pinned_by_user_name": "4183391249163402", < "policy_id": "000E138775A879A0", < "spark_conf": { < "datanucleus.autoCreateSchema": "true", < "datanucleus.fixedDatastore": "true", < "spark.hadoop.javax.jdo.option.ConnectionDriverName": "com.microsoft.sqlserver.jdbc.SQLServerDriver", < "spark.hadoop.javax.jdo.option.ConnectionPassword": "{{secrets/external_metastore/external_metastore_password}}", < "spark.hadoop.javax.jdo.option.ConnectionURL": "{{secrets/external_metastore/external_metastore_url}}", < "spark.hadoop.javax.jdo.option.ConnectionUserName": "{{secrets/external_metastore/external_metastore_user}}", < "spark.hadoop.metastore.catalog.TEST_SCHEMA": "hive", < 
"spark.sql.hive.metastore.jars": "maven", < "spark.sql.hive.metastore.version": "3.1.0" < }, < "spark_context_id": 7713603744641915067, < "spark_version": "15.1.x-scala2.12", < "start_time": 1713528456326, < "state": "RUNNING", < "state_message": "" < }, < "... (8 additional elements)" < ] < } 07:17 DEBUG [databricks.sdk:migrate_dbfs_root_delta_tables] GET /api/2.1/unity-catalog/external-locations < 200 OK < { < "external_locations": [ < { < "created_at": 1712317369786, < "created_by": "serge.smertin@databricks.com", < "credential_id": "462bd121-a3ff-4f51-899f-236868f3d2ab", < "credential_name": "TEST_STORAGE_CREDENTIAL", < "full_name": "TEST_A_LOCATION", < "id": "98c25265-fb0f-4b15-a727-63855b7f78a7", < "isolation_mode": "ISOLATION_MODE_OPEN", < "metastore_id": "8952c1e3-b265-4adf-98c3-6f755e2e1453", < "name": "TEST_A_LOCATION", < "owner": "labs.scope.account-admin", < "read_only": false, < "securable_kind": "EXTERNAL_LOCATION_STANDARD", < "securable_type": "EXTERNAL_LOCATION", < "updated_at": 1712566812808, < "updated_by": "serge.smertin@databricks.com", < "url": "TEST_MOUNT_CONTAINER/a" < }, < "... (3 additional elements)" < ] < } 07:17 DEBUG [databricks.labs.blueprint.installation:migrate_dbfs_root_delta_tables] Loading list from CLOUD_ENV_storage_account_info.csv 07:17 DEBUG [databricks.sdk:migrate_dbfs_root_delta_tables] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/CLOUD_ENV_storage_account_info.csv&direct_download=true < 404 Not Found < { < "error_code": "RESOURCE_DOES_NOT_EXIST", < "message": "Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/CLOUD_ENV_storage_account_info.csv) doesn't ... 
(6 more bytes)" < } 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.grants] fetching grants inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.grants 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.grants] crawling new batch for grants 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.tables] fetching tables inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.tables 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.tables] crawling new batch for tables 07:17 DEBUG [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_delta_tables] [hive_metastore.migrate_jnvkp] listing tables 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SHOW TABLES FROM hive_metastore.migrate_jnvkp 07:17 WARNING [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_delta_tables] Schema hive_metastore.migrate_jnvkp no longer existed 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.tables] found 0 new records for tables 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.tables (catalog STRING NOT NULL, database ST... 
(222 more bytes) 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.udfs] fetching udfs inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.udfs 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.udfs] crawling new batch for udfs 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][execute] USE CATALOG hive_metastore; 07:17 DEBUG [databricks.labs.ucx.hive_metastore.udfs:migrate_dbfs_root_delta_tables] [hive_metastore.migrate_jnvkp] listing udfs 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SHOW USER FUNCTIONS FROM hive_metastore.migrate_jnvkp; 07:17 WARNING [databricks.labs.ucx.hive_metastore.udfs:migrate_dbfs_root_delta_tables] Schema hive_metastore.migrate_jnvkp no longer existed 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.udfs] found 0 new records for udfs 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.udfs (catalog STRING NOT NULL, database STRI... 
(288 more bytes) 07:17 DEBUG [databricks.labs.blueprint.parallel:migrate_dbfs_root_delta_tables] Starting 4 tasks in 8 threads 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SHOW GRANTS ON CATALOG hive_metastore 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SHOW GRANTS ON ANY FILE 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SHOW GRANTS ON DATABASE hive_metastore.migrate_jnvkp 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SHOW GRANTS ON ANONYMOUS FUNCTION 07:17 ERROR [databricks.labs.ucx.hive_metastore.grants:migrate_dbfs_root_delta_tables] Couldn't fetch grants for object DATABASE hive_metastore.migrate_jnvkp: (org.apache.spark.SparkSecurityException) Database(migrate_jnvkp,Some(hive_metastore)) does not exist. JVM stacktrace: org.apache.spark.SparkSecurityException at com.databricks.sql.acl.AclCommand.$anonfun$mapIfExists$1(commands.scala:79) at scala.Option.getOrElse(Option.scala:189) at com.databricks.sql.acl.AclCommand.mapIfExists(commands.scala:79) at com.databricks.sql.acl.AclCommand.mapIfExists$(commands.scala:75) at com.databricks.sql.acl.ShowPermissionsCommand.mapIfExists(commands.scala:226) at com.databricks.sql.acl.ShowPermissionsCommand.run(commands.scala:244) at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$2(commands.scala:84) at org.apache.spark.sql.execution.SparkPlan.runCommandWithAetherOff(SparkPlan.scala:180) at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:191) at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:84) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94) at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:81) at 
[remaining JVM stack frames identical to the SparkSecurityException trace above; elided] 07:17 INFO [databricks.labs.blueprint.parallel:migrate_external_tables_sync] listing grants for hive_metastore 4/4, rps: 0.130/sec 07:17 INFO [databricks.labs.blueprint.parallel:migrate_external_tables_sync] Finished 'listing grants for hive_metastore' tasks: 100% results available (4/4). Took 0:00:31.637583 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.grants] found 1 new records for grants 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.groups] fetching groups inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.groups 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.groups] crawling new batch for groups 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_external_tables_sync] Listing workspace groups (resource_type=WorkspaceGroup) with id,displayName,meta,externalId,members,roles,entitlements...
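Note on the `SparkSecurityException` above: it is a race between crawling and grant listing — the fixture schema `hive_metastore.migrate_jnvkp` was dropped before `SHOW GRANTS ON DATABASE` ran, so the command fails even though the crawler already logged "Schema ... no longer existed". A minimal sketch of the catch-and-log pattern for this race; `fetch_grants` and `run_sql` are hypothetical stand-ins, not ucx's actual API:

```python
# Sketch: fetch grants per object, tolerating objects dropped mid-crawl.
# `fetch_grants` and `run_sql` are hypothetical stand-ins, not ucx APIs.

def fetch_grants(run_sql, object_type, object_key, logger=print):
    """Run SHOW GRANTS, returning [] when the object no longer exists."""
    try:
        return run_sql(f"SHOW GRANTS ON {object_type} {object_key}")
    except Exception as e:  # e.g. SparkSecurityException: ... does not exist
        logger(f"Couldn't fetch grants for {object_type} {object_key}: {e}")
        return []
```

With this shape the crawl keeps going and simply records zero grants for the vanished object, which matches the "found 1 new records for grants" outcome seen later in the log.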
07:17 DEBUG [databricks.sdk:migrate_external_tables_sync] GET /api/2.0/preview/scim/v2/Groups?attributes=id,displayName,meta,externalId,roles,entitlements&startIndex=1&count=100 < 200 OK < { < "Resources": [ < { < "displayName": "role.labs.tempo.write", < "externalId": "8d2db608-4ed1-49f2-ad5c-fe942be7a4e1", < "id": "22190446071900", < "meta": { < "resourceType": "Group" < } < }, < "... (58 additional elements)" < ], < "itemsPerPage": 59, < "schemas": [ < "urn:ietf:params:scim:api:messages:2.0:ListResponse" < ], < "startIndex": 1, < "totalResults": 59 < } 07:17 DEBUG [databricks.sdk:migrate_external_tables_sync] GET /api/2.0/preview/scim/v2/Groups?attributes=id,displayName,meta,externalId,roles,entitlements&startIndex=60&count=100 < 200 OK < { < "itemsPerPage": 0, < "schemas": [ < "urn:ietf:params:scim:api:messages:2.0:ListResponse" < ], < "startIndex": 60, < "totalResults": 59 < } 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_external_tables_sync] Found 0 WorkspaceGroup 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_external_tables_sync] Listing account groups with id,displayName,externalId... 07:17 DEBUG [databricks.sdk:migrate_external_tables_sync] GET /api/2.0/account/scim/v2/Groups?attributes=id,displayName,externalId < 200 OK < { < "Resources": [ < { < "displayName": "ucx_EMQk", < "id": "747915403144" < }, < { < "displayName": "rename-LFcF-ucx_GlGZb", < "id": "839447153138" < }, < "... 
(2192 additional elements)" < ], < "itemsPerPage": 2194, < "schemas": [ < "urn:ietf:params:scim:api:messages:2.0:ListResponse" < ], < "startIndex": 1, < "totalResults": 2194 < } 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_external_tables_sync] Found 2193 account groups 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_external_tables_sync] No group listing provided, all matching groups will be migrated 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.groups] found 0 new records for groups 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.groups (id_in_workspace STRING NOT NULL, nam... (179 more bytes) 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.tables] fetching tables inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.tables 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.tables] crawling new batch for tables 07:17 DEBUG [databricks.labs.ucx.hive_metastore.tables:migrate_external_tables_sync] [hive_metastore.migrate_jnvkp] listing tables 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SHOW TABLES FROM hive_metastore.migrate_jnvkp 07:17 WARNING [databricks.labs.ucx.hive_metastore.tables:migrate_external_tables_sync] Schema hive_metastore.migrate_jnvkp no longer existed 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.tables] found 0 new records for tables 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.tables (catalog STRING NOT NULL, database ST... 
(222 more bytes) 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.mounts] fetching mounts inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SELECT * FROM ucx_soysi.mounts 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.mounts] crawling new batch for mounts 07:17 ERROR [databricks.labs.ucx:migrate_external_tables_sync] Execute `databricks workspace export //Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/logs/migrate-tables/run-786487235418832-0/migrate_external_tables_sync.log` locally to troubleshoot with more details. An error occurred while calling o431.mounts. Trace: py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473) at py4j.Gateway.invoke(Gateway.java:305) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199) at py4j.ClientServerConnection.run(ClientServerConnection.java:119) at java.lang.Thread.run(Thread.java:750) 07:17 DEBUG [databricks:migrate_external_tables_sync] Task crash details Traceback (most recent call last): File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-539676c8-28e2-41ee-b2a3-49961882350f/lib/python3.11/site-packages/databricks/labs/ucx/runtime.py", line 96, in trigger current_task(ctx) File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-539676c8-28e2-41ee-b2a3-49961882350f/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/workflows.py", line 16, in migrate_external_tables_sync ctx.tables_migrator.migrate_tables( File 
"/local_disk0/.ephemeral_nfs/envs/pythonEnv-539676c8-28e2-41ee-b2a3-49961882350f/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/table_migrate.py", line 80, in migrate_tables all_principal_grants = None if acl_strategy is None else self._principal_grants.get_interactive_cluster_grants() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-539676c8-28e2-41ee-b2a3-49961882350f/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/grants.py", line 533, in get_interactive_cluster_grants mounts = list(self._mounts_crawler.snapshot()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-539676c8-28e2-41ee-b2a3-49961882350f/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/locations.py", line 252, in snapshot return self._snapshot(self._try_fetch, self._list_mounts) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-539676c8-28e2-41ee-b2a3-49961882350f/lib/python3.11/site-packages/databricks/labs/ucx/framework/crawlers.py", line 116, in _snapshot loaded_records = list(loader()) ^^^^^^^^ File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-539676c8-28e2-41ee-b2a3-49961882350f/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/locations.py", line 247, in _list_mounts for mount_point, source, _ in self._dbutils.fs.mounts(): ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/databricks/python_shell/dbruntime/dbutils.py", line 362, in f_with_exception_handling return f(*args, **kwargs) ^^^^^^^^^^^^^^^^^^ File "/databricks/python_shell/dbruntime/dbutils.py", line 497, in mounts self.print_return(self.dbcore.mounts()), MountInfo.create_from_jschema) ^^^^^^^^^^^^^^^^^^^^ File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1355, in __call__ return_value = get_return_value( ^^^^^^^^^^^^^^^^^ File "/databricks/spark/python/pyspark/errors/exceptions/captured.py", line 224, in deco return f(*a, 
**kw) ^^^^^^^^^^^ File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 330, in get_return_value raise Py4JError( py4j.protocol.Py4JError: An error occurred while calling o431.mounts. Trace: py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473) at py4j.Gateway.invoke(Gateway.java:305) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199) at py4j.ClientServerConnection.run(ClientServerConnection.java:119) at java.lang.Thread.run(Thread.java:750) 07:17 INFO [databricks.labs.blueprint.parallel:migrate_dbfs_root_non_delta_tables] listing grants for hive_metastore 4/4, rps: 0.127/sec 07:17 INFO [databricks.labs.blueprint.parallel:migrate_dbfs_root_non_delta_tables] Finished 'listing grants for hive_metastore' tasks: 100% results available (4/4). 
Took 0:00:32.088997 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.grants] found 1 new records for grants 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.groups] fetching groups inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.groups 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.groups] crawling new batch for groups 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_non_delta_tables] Listing workspace groups (resource_type=WorkspaceGroup) with id,displayName,meta,externalId,members,roles,entitlements... 07:17 DEBUG [databricks.sdk:migrate_dbfs_root_non_delta_tables] GET /api/2.0/preview/scim/v2/Groups?attributes=id,displayName,meta,externalId,roles,entitlements&startIndex=1&count=100 < 200 OK < { < "Resources": [ < { < "displayName": "role.labs.tempo.write", < "externalId": "8d2db608-4ed1-49f2-ad5c-fe942be7a4e1", < "id": "22190446071900", < "meta": { < "resourceType": "Group" < } < }, < "... (58 additional elements)" < ], < "itemsPerPage": 59, < "schemas": [ < "urn:ietf:params:scim:api:messages:2.0:ListResponse" < ], < "startIndex": 1, < "totalResults": 59 < } 07:17 DEBUG [databricks.sdk:migrate_dbfs_root_non_delta_tables] GET /api/2.0/preview/scim/v2/Groups?attributes=id,displayName,meta,externalId,roles,entitlements&startIndex=60&count=100 < 200 OK < { < "itemsPerPage": 0, < "schemas": [ < "urn:ietf:params:scim:api:messages:2.0:ListResponse" < ], < "startIndex": 60, < "totalResults": 59 < } 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_non_delta_tables] Found 0 WorkspaceGroup 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_non_delta_tables] Listing account groups with id,displayName,externalId... 
07:17 DEBUG [databricks.sdk:migrate_dbfs_root_non_delta_tables] GET /api/2.0/account/scim/v2/Groups?attributes=id,displayName,externalId < 200 OK < { < "Resources": [ < { < "displayName": "ucx_EMQk", < "id": "747915403144" < }, < { < "displayName": "rename-LFcF-ucx_GlGZb", < "id": "839447153138" < }, < "... (2192 additional elements)" < ], < "itemsPerPage": 2194, < "schemas": [ < "urn:ietf:params:scim:api:messages:2.0:ListResponse" < ], < "startIndex": 1, < "totalResults": 2194 < } 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_non_delta_tables] Found 2193 account groups 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_non_delta_tables] No group listing provided, all matching groups will be migrated 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.groups] found 0 new records for groups 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.groups (id_in_workspace STRING NOT NULL, nam... 
(179 more bytes) 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.tables] fetching tables inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.tables 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.tables] crawling new batch for tables 07:17 DEBUG [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_non_delta_tables] [hive_metastore.migrate_jnvkp] listing tables 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SHOW TABLES FROM hive_metastore.migrate_jnvkp 07:17 WARNING [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_non_delta_tables] Schema hive_metastore.migrate_jnvkp no longer existed 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.tables] found 0 new records for tables 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.tables (catalog STRING NOT NULL, database ST... (222 more bytes) 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.mounts] fetching mounts inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SELECT * FROM ucx_soysi.mounts 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.mounts] crawling new batch for mounts 07:17 ERROR [databricks.labs.ucx:migrate_dbfs_root_non_delta_tables] Execute `databricks workspace export //Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/logs/migrate-tables/run-786487235418832-0/migrate_dbfs_root_non_delta_tables.log` locally to troubleshoot with more details. An error occurred while calling o431.mounts. 
Trace: py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473) at py4j.Gateway.invoke(Gateway.java:305) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199) at py4j.ClientServerConnection.run(ClientServerConnection.java:119) at java.lang.Thread.run(Thread.java:750) 07:17 DEBUG [databricks:migrate_dbfs_root_non_delta_tables] Task crash details Traceback (most recent call last): File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-c9a7f46a-ac90-467f-a4e5-775c32215269/lib/python3.11/site-packages/databricks/labs/ucx/runtime.py", line 96, in trigger current_task(ctx) File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-c9a7f46a-ac90-467f-a4e5-775c32215269/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/workflows.py", line 42, in migrate_dbfs_root_non_delta_tables ctx.tables_migrator.migrate_tables( File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-c9a7f46a-ac90-467f-a4e5-775c32215269/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/table_migrate.py", line 80, in migrate_tables all_principal_grants = None if acl_strategy is None else self._principal_grants.get_interactive_cluster_grants() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-c9a7f46a-ac90-467f-a4e5-775c32215269/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/grants.py", line 533, in get_interactive_cluster_grants mounts = list(self._mounts_crawler.snapshot()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File 
"/local_disk0/.ephemeral_nfs/envs/pythonEnv-c9a7f46a-ac90-467f-a4e5-775c32215269/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/locations.py", line 252, in snapshot return self._snapshot(self._try_fetch, self._list_mounts) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-c9a7f46a-ac90-467f-a4e5-775c32215269/lib/python3.11/site-packages/databricks/labs/ucx/framework/crawlers.py", line 116, in _snapshot loaded_records = list(loader()) ^^^^^^^^ File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-c9a7f46a-ac90-467f-a4e5-775c32215269/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/locations.py", line 247, in _list_mounts for mount_point, source, _ in self._dbutils.fs.mounts(): ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/databricks/python_shell/dbruntime/dbutils.py", line 362, in f_with_exception_handling return f(*args, **kwargs) ^^^^^^^^^^^^^^^^^^ File "/databricks/python_shell/dbruntime/dbutils.py", line 497, in mounts self.print_return(self.dbcore.mounts()), MountInfo.create_from_jschema) ^^^^^^^^^^^^^^^^^^^^ File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1355, in __call__ return_value = get_return_value( ^^^^^^^^^^^^^^^^^ File "/databricks/spark/python/pyspark/errors/exceptions/captured.py", line 224, in deco return f(*a, **kw) ^^^^^^^^^^^ File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 330, in get_return_value raise Py4JError( py4j.protocol.Py4JError: An error occurred while calling o431.mounts. 
Trace: py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473) at py4j.Gateway.invoke(Gateway.java:305) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199) at py4j.ClientServerConnection.run(ClientServerConnection.java:119) at java.lang.Thread.run(Thread.java:750) 07:17 INFO [databricks.labs.blueprint.parallel:migrate_dbfs_root_delta_tables] listing grants for hive_metastore 4/4, rps: 0.128/sec 07:17 INFO [databricks.labs.blueprint.parallel:migrate_dbfs_root_delta_tables] Finished 'listing grants for hive_metastore' tasks: 100% results available (4/4). Took 0:00:32.160964 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.grants] found 1 new records for grants 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.groups] fetching groups inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.groups 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.groups] crawling new batch for groups 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_delta_tables] Listing workspace groups (resource_type=WorkspaceGroup) with id,displayName,meta,externalId,members,roles,entitlements... 
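The two identical crashes above reduce to one call: `MountsCrawler._list_mounts` invokes `dbutils.fs.mounts()`, and on a cluster running in `USER_ISOLATION` (shared) access mode the Py4J whitelist blocks `DBUtilsCore.mounts()`, so each table-migration task dies inside `get_interactive_cluster_grants()` before any migration status is written — which is why the test then finds zero statuses. A minimal sketch of the failure mode and a defensive wrapper (hypothetical names, not the UCX implementation; the real fix may instead avoid the mounts call entirely on shared clusters):

```python
# Hypothetical sketch: degrade gracefully when the runtime refuses to list
# mounts, instead of letting the whole migration task crash.

class Py4JSecurityError(Exception):
    """Stand-in for the py4j.protocol.Py4JError raised for non-whitelisted methods."""


def list_mounts_safely(list_mounts):
    """Return mount entries from `list_mounts` (a callable mimicking
    dbutils.fs.mounts()), or an empty list if the runtime forbids the call."""
    try:
        return list(list_mounts())
    except Py4JSecurityError as err:
        # On USER_ISOLATION clusters DBUtilsCore.mounts() is not whitelisted;
        # log and continue so downstream steps can still record their status.
        print(f"WARNING: cannot list mounts, continuing without them: {err}")
        return []


def blocked():
    """Simulates the whitelisted-method rejection seen in the logs."""
    raise Py4JSecurityError("Method ... DBUtilsCore.mounts() is not whitelisted")


mounts = list_mounts_safely(blocked)  # -> []
```

Whether swallowing the error is acceptable depends on whether mount-based grants are required for the chosen ACL strategy; the sketch only shows the shape of containing the exception.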
07:17 DEBUG [databricks.sdk:migrate_dbfs_root_delta_tables] GET /api/2.0/preview/scim/v2/Groups?attributes=id,displayName,meta,externalId,roles,entitlements&startIndex=1&count=100 < 200 OK < { < "Resources": [ < { < "displayName": "role.labs.tempo.write", < "externalId": "8d2db608-4ed1-49f2-ad5c-fe942be7a4e1", < "id": "22190446071900", < "meta": { < "resourceType": "Group" < } < }, < "... (58 additional elements)" < ], < "itemsPerPage": 59, < "schemas": [ < "urn:ietf:params:scim:api:messages:2.0:ListResponse" < ], < "startIndex": 1, < "totalResults": 59 < } 07:17 DEBUG [databricks.sdk:migrate_dbfs_root_delta_tables] GET /api/2.0/preview/scim/v2/Groups?attributes=id,displayName,meta,externalId,roles,entitlements&startIndex=60&count=100 < 200 OK < { < "itemsPerPage": 0, < "schemas": [ < "urn:ietf:params:scim:api:messages:2.0:ListResponse" < ], < "startIndex": 60, < "totalResults": 59 < } 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_delta_tables] Found 0 WorkspaceGroup 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_delta_tables] Listing account groups with id,displayName,externalId... 07:17 DEBUG [databricks.sdk:migrate_dbfs_root_delta_tables] GET /api/2.0/account/scim/v2/Groups?attributes=id,displayName,externalId < 200 OK < { < "Resources": [ < { < "displayName": "ucx_EMQk", < "id": "747915403144" < }, < { < "displayName": "rename-LFcF-ucx_GlGZb", < "id": "839447153138" < }, < "... 
(2192 additional elements)" < ], < "itemsPerPage": 2194, < "schemas": [ < "urn:ietf:params:scim:api:messages:2.0:ListResponse" < ], < "startIndex": 1, < "totalResults": 2194 < } 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_delta_tables] Found 2193 account groups 07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_delta_tables] No group listing provided, all matching groups will be migrated 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.groups] found 0 new records for groups 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.groups (id_in_workspace STRING NOT NULL, nam... (179 more bytes) 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.tables] fetching tables inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.tables 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.tables] crawling new batch for tables 07:17 DEBUG [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_delta_tables] [hive_metastore.migrate_jnvkp] listing tables 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SHOW TABLES FROM hive_metastore.migrate_jnvkp 07:17 WARNING [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_delta_tables] Schema hive_metastore.migrate_jnvkp no longer existed 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.tables] found 0 new records for tables 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.tables (catalog STRING NOT NULL, database ST... 
(222 more bytes) 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.mounts] fetching mounts inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SELECT * FROM ucx_soysi.mounts 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.mounts] crawling new batch for mounts 07:17 ERROR [databricks.labs.ucx:migrate_dbfs_root_delta_tables] Execute `databricks workspace export //Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/logs/migrate-tables/run-786487235418832-0/migrate_dbfs_root_delta_tables.log` locally to troubleshoot with more details. An error occurred while calling o431.mounts. Trace: py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473) at py4j.Gateway.invoke(Gateway.java:305) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199) at py4j.ClientServerConnection.run(ClientServerConnection.java:119) at java.lang.Thread.run(Thread.java:750) 07:17 DEBUG [databricks:migrate_dbfs_root_delta_tables] Task crash details Traceback (most recent call last): File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-62f71e94-0f44-488a-88e0-bdc7e54d4bdd/lib/python3.11/site-packages/databricks/labs/ucx/runtime.py", line 96, in trigger current_task(ctx) File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-62f71e94-0f44-488a-88e0-bdc7e54d4bdd/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/workflows.py", line 29, in migrate_dbfs_root_delta_tables 
ctx.tables_migrator.migrate_tables( File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-62f71e94-0f44-488a-88e0-bdc7e54d4bdd/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/table_migrate.py", line 80, in migrate_tables all_principal_grants = None if acl_strategy is None else self._principal_grants.get_interactive_cluster_grants() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-62f71e94-0f44-488a-88e0-bdc7e54d4bdd/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/grants.py", line 533, in get_interactive_cluster_grants mounts = list(self._mounts_crawler.snapshot()) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-62f71e94-0f44-488a-88e0-bdc7e54d4bdd/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/locations.py", line 252, in snapshot return self._snapshot(self._try_fetch, self._list_mounts) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-62f71e94-0f44-488a-88e0-bdc7e54d4bdd/lib/python3.11/site-packages/databricks/labs/ucx/framework/crawlers.py", line 116, in _snapshot loaded_records = list(loader()) ^^^^^^^^ File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-62f71e94-0f44-488a-88e0-bdc7e54d4bdd/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/locations.py", line 247, in _list_mounts for mount_point, source, _ in self._dbutils.fs.mounts(): ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/databricks/python_shell/dbruntime/dbutils.py", line 362, in f_with_exception_handling return f(*args, **kwargs) ^^^^^^^^^^^^^^^^^^ File "/databricks/python_shell/dbruntime/dbutils.py", line 497, in mounts self.print_return(self.dbcore.mounts()), MountInfo.create_from_jschema) ^^^^^^^^^^^^^^^^^^^^ File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1355, in __call__ return_value = get_return_value( ^^^^^^^^^^^^^^^^^ File 
"/databricks/spark/python/pyspark/errors/exceptions/captured.py", line 224, in deco return f(*a, **kw) ^^^^^^^^^^^ File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 330, in get_return_value raise Py4JError( py4j.protocol.Py4JError: An error occurred while calling o431.mounts. Trace: py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473) at py4j.Gateway.invoke(Gateway.java:305) at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) at py4j.commands.CallCommand.execute(CallCommand.java:79) at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199) at py4j.ClientServerConnection.run(ClientServerConnection.java:119) at java.lang.Thread.run(Thread.java:750) 07:17 INFO [databricks.labs.ucx.installer.workflows] ---------- END REMOTE LOGS ---------- 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added make_dbfs_data_copy fixture: dbfs:/mnt/TEST_MOUNT_NAME/a/b/ceyw 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema hive_metastore.migrate_jnvkp: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.migrate_jnvkp', metastore_id=None, name='migrate_jnvkp', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_jnvkp.ucx_tpi07: 
https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_tpi07 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_tpi07', metastore_id=None, name='ucx_tpi07', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/migrate_jnvkp/ucx_tpi07', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_jnvkp.ucx_t3h9e: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_t3h9e 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_t3h9e', metastore_id=None, name='ucx_t3h9e', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location='dbfs:/tmp/ucx_test_gbYm', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:03 INFO 
[databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_jnvkp.ucx_twraa: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_twraa 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_twraa', metastore_id=None, name='ucx_twraa', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location='dbfs:/mnt/TEST_MOUNT_NAME/a/b/ceyw', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_jnvkp.ucx_t14l3: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_t14l3 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_t14l3', metastore_id=None, name='ucx_t14l3', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, 
view_definition='SELECT * FROM hive_metastore.migrate_jnvkp.ucx_tpi07', view_dependencies=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_jnvkp.ucx_t7mog: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_jnvkp/ucx_t7mog 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_t7mog', metastore_id=None, name='ucx_t7mog', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_jnvkp.ucx_t14l3', view_dependencies=None) 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added catalog fixture: CatalogInfo(browse_only=False, catalog_type=, comment='', connection_name=None, created_at=1716966219661, created_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_csp3b', isolation_mode=, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='ucx_csp3b', options=None, owner='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', properties=None, provider_name=None, provisioning_info=None, securable_kind=, securable_type='CATALOG', share_name=None, storage_location=None, storage_root=None, updated_at=1716966219661, updated_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d') 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema ucx_csp3b.migrate_jnvkp: 
https://DATABRICKS_HOST/explore/data/ucx_csp3b/migrate_jnvkp 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='ucx_csp3b', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_csp3b.migrate_jnvkp', metastore_id=None, name='migrate_jnvkp', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) 07:03 INFO [databricks.labs.ucx.mixins.fixtures] Schema hive_metastore.ucx_soysi: https://DATABRICKS_HOST/explore/data/hive_metastore/ucx_soysi 07:03 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.ucx_soysi', metastore_id=None, name='ucx_soysi', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) 07:03 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/config.yml) doesn't exist. 07:03 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration 07:03 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data. 07:03 INFO [databricks.labs.ucx.install] Fetching installations... 07:04 WARNING [databricks.labs.ucx.install] Existing installation at /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI is corrupted. Skipping... 07:04 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy. 
07:04 DEBUG [tests.integration.conftest] Waiting for clusters to start... 07:12 DEBUG [tests.integration.conftest] Waiting for clusters to start... 07:12 INFO [databricks.labs.ucx.install] Installing UCX v0.24.1+520240529071241 07:12 INFO [databricks.labs.ucx.install] Creating ucx schemas... 07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental 07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental 07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions 07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas 07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-experimental 07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups 07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables 07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups 07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental 07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing 07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment 07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=experimental-workflow-linter 07:12 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation 07:12 INFO [databricks.labs.ucx.install] Installation completed successfully! 
Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/README for the next steps. 07:12 DEBUG [databricks.labs.ucx.installer.workflows] starting migrate-tables job: https://DATABRICKS_HOST#job/353371736950474 07:17 INFO [databricks.labs.ucx.installer.workflows] ---------- REMOTE LOGS -------------- 07:17 INFO [databricks.labs.ucx:migrate_external_tables_sync] UCX v0.24.1+520240529071241 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/logs/migrate-tables/run-786487235418832-0/migrate_external_tables_sync.log 07:17 DEBUG [databricks.sdk:migrate_external_tables_sync] GET /api/2.0/clusters/list < 200 OK < { < "clusters": [ < { < "autotermination_minutes": 60, < "CLOUD_ENV_attributes": { < "availability": "SPOT_WITH_FALLBACK_AZURE", < "first_on_demand": 2147483647, < "spot_bid_max_price": -1.0 < }, < "cluster_cores": 8.0, < "cluster_id": "TEST_EXT_HMS_CLUSTER_ID", < "cluster_memory_mb": 32768, < "cluster_name": "External Metastore", < "cluster_source": "UI", < "creator_user_name": "serge.smertin@databricks.com", < "data_security_mode": "USER_ISOLATION", < "TEST_SCHEMA_tags": { < "Budget": "opex.sales.labs", < "ClusterId": "TEST_EXT_HMS_CLUSTER_ID", < "ClusterName": "External Metastore", < "Creator": "serge.smertin@databricks.com", < "DatabricksInstanceGroupId": "-6693343645136663331", < "DatabricksInstancePoolCreatorId": "4183391249163402", < "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID", < "Owner": "labs-oss@databricks.com", < "Vendor": "Databricks" < }, < "disk_spec": {}, < "driver": { < "host_private_ip": "10.139.0.17", < "instance_id": "416a094689384e9b8eb4ac2d90434719", < "node_attributes": { < "is_spot": false < }, < "node_id": "081f050584b74828aa502ace88dce94f", < "private_ip": "10.139.64.17", < "public_dns": "104.209.188.220", < "start_timestamp": 1716966580624 < }, < "driver_healthy": true, < "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID", < 
"driver_instance_source": { < "instance_pool_id": "TEST_INSTANCE_POOL_ID" < }, < "driver_node_type_id": "Standard_D4s_v3", < "effective_spark_version": "15.1.x-scala2.12", < "enable_elastic_disk": true, < "enable_local_disk_encryption": false, < "executors": [ < { < "host_private_ip": "10.139.0.24", < "instance_id": "e8b9112441f04b02a29889d7a0b4b261", < "node_attributes": { < "is_spot": false < }, < "node_id": "9e0ed3e7b912431e84b9b9edb2651ae0", < "private_ip": "10.139.64.24", < "public_dns": "104.209.177.152", < "start_timestamp": 1716966580581 < } < ], < "init_scripts_safe_mode": false, < "instance_pool_id": "TEST_INSTANCE_POOL_ID", < "instance_source": { < "instance_pool_id": "TEST_INSTANCE_POOL_ID" < }, < "jdbc_port": 10000, < "last_activity_time": 1716966783501, < "last_restarted_time": 1716966763471, < "last_state_loss_time": 1716966763410, < "node_type_id": "Standard_D4s_v3", < "num_workers": 1, < "pinned_by_user_name": "4183391249163402", < "policy_id": "000E138775A879A0", < "spark_conf": { < "datanucleus.autoCreateSchema": "true", < "datanucleus.fixedDatastore": "true", < "spark.hadoop.javax.jdo.option.ConnectionDriverName": "com.microsoft.sqlserver.jdbc.SQLServerDriver", < "spark.hadoop.javax.jdo.option.ConnectionPassword": "{{secrets/external_metastore/external_metastore_password}}", < "spark.hadoop.javax.jdo.option.ConnectionURL": "{{secrets/external_metastore/external_metastore_url}}", < "spark.hadoop.javax.jdo.option.ConnectionUserName": "{{secrets/external_metastore/external_metastore_user}}", < "spark.hadoop.metastore.catalog.TEST_SCHEMA": "hive", < "spark.sql.hive.metastore.jars": "maven", < "spark.sql.hive.metastore.version": "3.1.0" < }, < "spark_context_id": 7713603744641915067, < "spark_version": "15.1.x-scala2.12", < "start_time": 1713528456326, < "state": "RUNNING", < "state_message": "" < }, < "... 
(8 additional elements)" < ] < } 07:17 DEBUG [databricks.sdk:migrate_external_tables_sync] GET /api/2.1/unity-catalog/external-locations < 200 OK < { < "external_locations": [ < { < "created_at": 1712317369786, < "created_by": "serge.smertin@databricks.com", < "credential_id": "462bd121-a3ff-4f51-899f-236868f3d2ab", < "credential_name": "TEST_STORAGE_CREDENTIAL", < "full_name": "TEST_A_LOCATION", < "id": "98c25265-fb0f-4b15-a727-63855b7f78a7", < "isolation_mode": "ISOLATION_MODE_OPEN", < "metastore_id": "8952c1e3-b265-4adf-98c3-6f755e2e1453", < "name": "TEST_A_LOCATION", < "owner": "labs.scope.account-admin", < "read_only": false, < "securable_kind": "EXTERNAL_LOCATION_STANDARD", < "securable_type": "EXTERNAL_LOCATION", < "updated_at": 1712566812808, < "updated_by": "serge.smertin@databricks.com", < "url": "TEST_MOUNT_CONTAINER/a" < }, < "... (3 additional elements)" < ] < } 07:17 DEBUG [databricks.labs.blueprint.installation:migrate_external_tables_sync] Loading list from CLOUD_ENV_storage_account_info.csv 07:17 DEBUG [databricks.sdk:migrate_external_tables_sync] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/CLOUD_ENV_storage_account_info.csv&direct_download=true < 404 Not Found < { < "error_code": "RESOURCE_DOES_NOT_EXIST", < "message": "Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/CLOUD_ENV_storage_account_info.csv) doesn't ... 
(6 more bytes)" < } 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.grants] fetching grants inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.grants 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.grants] crawling new batch for grants 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.tables] fetching tables inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.tables 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.tables] crawling new batch for tables 07:17 DEBUG [databricks.labs.ucx.hive_metastore.tables:migrate_external_tables_sync] [hive_metastore.migrate_jnvkp] listing tables 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SHOW TABLES FROM hive_metastore.migrate_jnvkp 07:17 WARNING [databricks.labs.ucx.hive_metastore.tables:migrate_external_tables_sync] Schema hive_metastore.migrate_jnvkp no longer existed 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.tables] found 0 new records for tables 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.tables (catalog STRING NOT NULL, database ST... 
(222 more bytes) 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.udfs] fetching udfs inventory 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.udfs 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.udfs] crawling new batch for udfs 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][execute] USE CATALOG hive_metastore; 07:17 DEBUG [databricks.labs.ucx.hive_metastore.udfs:migrate_external_tables_sync] [hive_metastore.migrate_jnvkp] listing udfs 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SHOW USER FUNCTIONS FROM hive_metastore.migrate_jnvkp; 07:17 WARNING [databricks.labs.ucx.hive_metastore.udfs:migrate_external_tables_sync] Schema hive_metastore.migrate_jnvkp no longer existed 07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.udfs] found 0 new records for udfs 07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.udfs (catalog STRING NOT NULL, database STRI... 
(288 more bytes)
07:17 DEBUG [databricks.labs.blueprint.parallel:migrate_external_tables_sync] Starting 4 tasks in 8 threads
07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SHOW GRANTS ON CATALOG hive_metastore
07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SHOW GRANTS ON ANY FILE
07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SHOW GRANTS ON ANONYMOUS FUNCTION
07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SHOW GRANTS ON DATABASE hive_metastore.migrate_jnvkp
07:17 ERROR [databricks.labs.ucx.hive_metastore.grants:migrate_external_tables_sync] Couldn't fetch grants for object DATABASE hive_metastore.migrate_jnvkp: (org.apache.spark.SparkSecurityException) Database(migrate_jnvkp,Some(hive_metastore)) does not exist. JVM stacktrace: org.apache.spark.SparkSecurityException at com.databricks.sql.acl.AclCommand.$anonfun$mapIfExists$1(commands.scala:79) at scala.Option.getOrElse(Option.scala:189) at com.databricks.sql.acl.AclCommand.mapIfExists(commands.scala:79) at com.databricks.sql.acl.AclCommand.mapIfExists$(commands.scala:75) at com.databricks.sql.acl.ShowPermissionsCommand.mapIfExists(commands.scala:226) at com.databricks.sql.acl.ShowPermissionsCommand.run(commands.scala:244) at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$2(commands.scala:84) at org.apache.spark.sql.execution.SparkPlan.runCommandWithAetherOff(SparkPlan.scala:180) at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:191) at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:84) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94) at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:81) at
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:80) at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:94) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$5(QueryExecution.scala:375) at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$4(QueryExecution.scala:375) at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:166) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$3(QueryExecution.scala:375) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$9(SQLExecution.scala:386) at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:669) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:275) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175) at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:162) at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:606) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:371) at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1098) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:367) at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:318) at 
org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:364) at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:340) at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:477) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:83) at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:477) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:40) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:343) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:339) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40) at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40) at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:453) at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:340) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:400) at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:340) at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:277) at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:274) at org.apache.spark.sql.Dataset.(Dataset.scala:310) at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:131) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175) at 
org.apache.spark.sql.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1182) at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94) at org.apache.spark.sql.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1182) at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:122) at org.apache.spark.sql.SparkSession.$anonfun$sql$4(SparkSession.scala:954) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175) at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:942) at org.apache.spark.sql.connect.planner.SparkConnectPlanner.handleSqlCommand(SparkConnectPlanner.scala:2700) at org.apache.spark.sql.connect.planner.SparkConnectPlanner.process(SparkConnectPlanner.scala:2653) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.handleCommand(ExecuteThreadRunner.scala:307) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.$anonfun$executeInternal$1(ExecuteThreadRunner.scala:233) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.$anonfun$executeInternal$1$adapted(ExecuteThreadRunner.scala:169) at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withSession$2(SessionHolder.scala:350) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175) at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withSession$1(SessionHolder.scala:350) at org.apache.spark.JobArtifactSet$.withActiveJobArtifactState(JobArtifactSet.scala:97) at org.apache.spark.sql.artifact.ArtifactManager.$anonfun$withResources$1(ArtifactManager.scala:84) at org.apache.spark.util.Utils$.withContextClassLoader(Utils.scala:234) at org.apache.spark.sql.artifact.ArtifactManager.withResources(ArtifactManager.scala:83) at org.apache.spark.sql.connect.service.SessionHolder.withSession(SessionHolder.scala:349) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.executeInternal(ExecuteThreadRunner.scala:169) at 
org.apache.spark.sql.connect.execution.ExecuteThreadRunner.org$apache$spark$sql$connect$execution$ExecuteThreadRunner$$execute(ExecuteThreadRunner.scala:119) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner$ExecutionThread.$anonfun$run$1(ExecuteThreadRunner.scala:511) at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:45) at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:103) at com.databricks.unity.HandleImpl.$anonfun$runWithAndClose$1(UCSHandle.scala:108) at scala.util.Using$.resource(Using.scala:269) at com.databricks.unity.HandleImpl.runWithAndClose(UCSHandle.scala:107) at org.apache.spark.sql.connect.execution.ExecuteThreadRunner$ExecutionThread.run(ExecuteThreadRunner.scala:510) 07:17 INFO [databricks.labs.ucx:migrate_dbfs_root_non_delta_tables] UCX v0.24.1+520240529071241 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/logs/migrate-tables/run-786487235418832-0/migrate_dbfs_root_non_delta_tables.log 07:17 DEBUG [databricks.sdk:migrate_dbfs_root_non_delta_tables] GET /api/2.0/clusters/list < 200 OK < { < "clusters": [ < { < "autotermination_minutes": 60, < "CLOUD_ENV_attributes": { < "availability": "SPOT_WITH_FALLBACK_AZURE", < "first_on_demand": 2147483647, < "spot_bid_max_price": -1.0 < }, < "cluster_cores": 8.0, < "cluster_id": "TEST_EXT_HMS_CLUSTER_ID", < "cluster_memory_mb": 32768, < "cluster_name": "External Metastore", < "cluster_source": "UI", < "creator_user_name": "serge.smertin@databricks.com", < "data_security_mode": "USER_ISOLATION", < "TEST_SCHEMA_tags": { < "Budget": "opex.sales.labs", < "ClusterId": "TEST_EXT_HMS_CLUSTER_ID", < "ClusterName": "External Metastore", < "Creator": "serge.smertin@databricks.com", < "DatabricksInstanceGroupId": "-6693343645136663331", < "DatabricksInstancePoolCreatorId": "4183391249163402", < "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID", < "Owner": "labs-oss@databricks.com", < "Vendor": "Databricks" < }, < 
"disk_spec": {}, < "driver": { < "host_private_ip": "10.139.0.17", < "instance_id": "416a094689384e9b8eb4ac2d90434719", < "node_attributes": { < "is_spot": false < }, < "node_id": "081f050584b74828aa502ace88dce94f", < "private_ip": "10.139.64.17", < "public_dns": "104.209.188.220", < "start_timestamp": 1716966580624 < }, < "driver_healthy": true, < "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID", < "driver_instance_source": { < "instance_pool_id": "TEST_INSTANCE_POOL_ID" < }, < "driver_node_type_id": "Standard_D4s_v3", < "effective_spark_version": "15.1.x-scala2.12", < "enable_elastic_disk": true, < "enable_local_disk_encryption": false, < "executors": [ < { < "host_private_ip": "10.139.0.24", < "instance_id": "e8b9112441f04b02a29889d7a0b4b261", < "node_attributes": { < "is_spot": false < }, < "node_id": "9e0ed3e7b912431e84b9b9edb2651ae0", < "private_ip": "10.139.64.24", < "public_dns": "104.209.177.152", < "start_timestamp": 1716966580581 < } < ], < "init_scripts_safe_mode": false, < "instance_pool_id": "TEST_INSTANCE_POOL_ID", < "instance_source": { < "instance_pool_id": "TEST_INSTANCE_POOL_ID" < }, < "jdbc_port": 10000, < "last_activity_time": 1716966783501, < "last_restarted_time": 1716966763471, < "last_state_loss_time": 1716966763410, < "node_type_id": "Standard_D4s_v3", < "num_workers": 1, < "pinned_by_user_name": "4183391249163402", < "policy_id": "000E138775A879A0", < "spark_conf": { < "datanucleus.autoCreateSchema": "true", < "datanucleus.fixedDatastore": "true", < "spark.hadoop.javax.jdo.option.ConnectionDriverName": "com.microsoft.sqlserver.jdbc.SQLServerDriver", < "spark.hadoop.javax.jdo.option.ConnectionPassword": "{{secrets/external_metastore/external_metastore_password}}", < "spark.hadoop.javax.jdo.option.ConnectionURL": "{{secrets/external_metastore/external_metastore_url}}", < "spark.hadoop.javax.jdo.option.ConnectionUserName": "{{secrets/external_metastore/external_metastore_user}}", < "spark.hadoop.metastore.catalog.TEST_SCHEMA": "hive", < 
"spark.sql.hive.metastore.jars": "maven", < "spark.sql.hive.metastore.version": "3.1.0" < }, < "spark_context_id": 7713603744641915067, < "spark_version": "15.1.x-scala2.12", < "start_time": 1713528456326, < "state": "RUNNING", < "state_message": "" < }, < "... (8 additional elements)" < ] < } 07:17 DEBUG [databricks.sdk:migrate_dbfs_root_non_delta_tables] GET /api/2.1/unity-catalog/external-locations < 200 OK < { < "external_locations": [ < { < "created_at": 1712317369786, < "created_by": "serge.smertin@databricks.com", < "credential_id": "462bd121-a3ff-4f51-899f-236868f3d2ab", < "credential_name": "TEST_STORAGE_CREDENTIAL", < "full_name": "TEST_A_LOCATION", < "id": "98c25265-fb0f-4b15-a727-63855b7f78a7", < "isolation_mode": "ISOLATION_MODE_OPEN", < "metastore_id": "8952c1e3-b265-4adf-98c3-6f755e2e1453", < "name": "TEST_A_LOCATION", < "owner": "labs.scope.account-admin", < "read_only": false, < "securable_kind": "EXTERNAL_LOCATION_STANDARD", < "securable_type": "EXTERNAL_LOCATION", < "updated_at": 1712566812808, < "updated_by": "serge.smertin@databricks.com", < "url": "TEST_MOUNT_CONTAINER/a" < }, < "... (3 additional elements)" < ] < } 07:17 DEBUG [databricks.labs.blueprint.installation:migrate_dbfs_root_non_delta_tables] Loading list from CLOUD_ENV_storage_account_info.csv 07:17 DEBUG [databricks.sdk:migrate_dbfs_root_non_delta_tables] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/CLOUD_ENV_storage_account_info.csv&direct_download=true < 404 Not Found < { < "error_code": "RESOURCE_DOES_NOT_EXIST", < "message": "Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/CLOUD_ENV_storage_account_info.csv) doesn't ... 
(6 more bytes)" < }
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.grants] fetching grants inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.grants
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.grants] crawling new batch for grants
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.tables] fetching tables inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.tables
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.tables] crawling new batch for tables
07:17 DEBUG [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_non_delta_tables] [hive_metastore.migrate_jnvkp] listing tables
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SHOW TABLES FROM hive_metastore.migrate_jnvkp
07:17 WARNING [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_non_delta_tables] Schema hive_metastore.migrate_jnvkp no longer existed
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.tables] found 0 new records for tables
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.tables (catalog STRING NOT NULL, database ST... (222 more bytes)
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.udfs] fetching udfs inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.udfs
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.udfs] crawling new batch for udfs
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][execute] USE CATALOG hive_metastore;
07:17 DEBUG [databricks.labs.ucx.hive_metastore.udfs:migrate_dbfs_root_non_delta_tables] [hive_metastore.migrate_jnvkp] listing udfs
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SHOW USER FUNCTIONS FROM hive_metastore.migrate_jnvkp;
07:17 WARNING [databricks.labs.ucx.hive_metastore.udfs:migrate_dbfs_root_non_delta_tables] Schema hive_metastore.migrate_jnvkp no longer existed
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.udfs] found 0 new records for udfs
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.udfs (catalog STRING NOT NULL, database STRI...
(288 more bytes)
07:17 DEBUG [databricks.labs.blueprint.parallel:migrate_dbfs_root_non_delta_tables] Starting 4 tasks in 8 threads
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SHOW GRANTS ON CATALOG hive_metastore
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SHOW GRANTS ON ANY FILE
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SHOW GRANTS ON DATABASE hive_metastore.migrate_jnvkp
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SHOW GRANTS ON ANONYMOUS FUNCTION
07:17 ERROR [databricks.labs.ucx.hive_metastore.grants:migrate_dbfs_root_non_delta_tables] Couldn't fetch grants for object DATABASE hive_metastore.migrate_jnvkp: (org.apache.spark.SparkSecurityException) Database(migrate_jnvkp,Some(hive_metastore)) does not exist. JVM stacktrace: [identical to the org.apache.spark.SparkSecurityException stack trace above]
07:17 INFO [databricks.labs.ucx:migrate_dbfs_root_delta_tables] UCX v0.24.1+520240529071241 After job finishes, see debug logs at /Workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/logs/migrate-tables/run-786487235418832-0/migrate_dbfs_root_delta_tables.log
07:17 DEBUG [databricks.sdk:migrate_dbfs_root_delta_tables] GET /api/2.0/clusters/list
< 200 OK
< [cluster list response identical to the one above]
07:17 DEBUG [databricks.sdk:migrate_dbfs_root_delta_tables] GET /api/2.1/unity-catalog/external-locations
< 200 OK
< [external locations response identical to the one above]
07:17 DEBUG [databricks.labs.blueprint.installation:migrate_dbfs_root_delta_tables] Loading list from CLOUD_ENV_storage_account_info.csv
07:17 DEBUG [databricks.sdk:migrate_dbfs_root_delta_tables] GET /api/2.0/workspace/export?path=/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/CLOUD_ENV_storage_account_info.csv&direct_download=true
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/CLOUD_ENV_storage_account_info.csv) doesn't ...
(6 more bytes)" < }
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.grants] fetching grants inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.grants
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.grants] crawling new batch for grants
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.tables] fetching tables inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.tables
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.tables] crawling new batch for tables
07:17 DEBUG [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_delta_tables] [hive_metastore.migrate_jnvkp] listing tables
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SHOW TABLES FROM hive_metastore.migrate_jnvkp
07:17 WARNING [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_delta_tables] Schema hive_metastore.migrate_jnvkp no longer existed
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.tables] found 0 new records for tables
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.tables (catalog STRING NOT NULL, database ST... (222 more bytes)
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.udfs] fetching udfs inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.udfs
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.udfs] crawling new batch for udfs
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][execute] USE CATALOG hive_metastore;
07:17 DEBUG [databricks.labs.ucx.hive_metastore.udfs:migrate_dbfs_root_delta_tables] [hive_metastore.migrate_jnvkp] listing udfs
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SHOW USER FUNCTIONS FROM hive_metastore.migrate_jnvkp;
07:17 WARNING [databricks.labs.ucx.hive_metastore.udfs:migrate_dbfs_root_delta_tables] Schema hive_metastore.migrate_jnvkp no longer existed
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.udfs] found 0 new records for udfs
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.udfs (catalog STRING NOT NULL, database STRI...
(288 more bytes)
07:17 DEBUG [databricks.labs.blueprint.parallel:migrate_dbfs_root_delta_tables] Starting 4 tasks in 8 threads
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SHOW GRANTS ON CATALOG hive_metastore
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SHOW GRANTS ON ANY FILE
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SHOW GRANTS ON DATABASE hive_metastore.migrate_jnvkp
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SHOW GRANTS ON ANONYMOUS FUNCTION
07:17 ERROR [databricks.labs.ucx.hive_metastore.grants:migrate_dbfs_root_delta_tables] Couldn't fetch grants for object DATABASE hive_metastore.migrate_jnvkp: (org.apache.spark.SparkSecurityException) Database(migrate_jnvkp,Some(hive_metastore)) does not exist.
JVM stacktrace:
org.apache.spark.SparkSecurityException
    at com.databricks.sql.acl.AclCommand.$anonfun$mapIfExists$1(commands.scala:79)
    at scala.Option.getOrElse(Option.scala:189)
    at com.databricks.sql.acl.AclCommand.mapIfExists(commands.scala:79)
    at com.databricks.sql.acl.AclCommand.mapIfExists$(commands.scala:75)
    at com.databricks.sql.acl.ShowPermissionsCommand.mapIfExists(commands.scala:226)
    at com.databricks.sql.acl.ShowPermissionsCommand.run(commands.scala:244)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$2(commands.scala:84)
    at org.apache.spark.sql.execution.SparkPlan.runCommandWithAetherOff(SparkPlan.scala:180)
    at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:191)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.$anonfun$sideEffectResult$1(commands.scala:84)
    at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:81)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:80)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:94)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$5(QueryExecution.scala:375)
    at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$4(QueryExecution.scala:375)
    at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:166)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$3(QueryExecution.scala:375)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$9(SQLExecution.scala:386)
    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:669)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:275)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:162)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:606)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:371)
    at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1098)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:367)
    at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:318)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:364)
    at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:340)
    at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:477)
    at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:83)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:477)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:40)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:343)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:339)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40)
    at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40)
    at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:453)
    at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:340)
    at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:400)
    at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:340)
    at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:277)
    at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:274)
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:310)
    at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:131)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175)
    at org.apache.spark.sql.SparkSession.$anonfun$withActiveAndFrameProfiler$1(SparkSession.scala:1182)
    at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
    at org.apache.spark.sql.SparkSession.withActiveAndFrameProfiler(SparkSession.scala:1182)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:122)
    at org.apache.spark.sql.SparkSession.$anonfun$sql$4(SparkSession.scala:954)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:942)
    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.handleSqlCommand(SparkConnectPlanner.scala:2700)
    at org.apache.spark.sql.connect.planner.SparkConnectPlanner.process(SparkConnectPlanner.scala:2653)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.handleCommand(ExecuteThreadRunner.scala:307)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.$anonfun$executeInternal$1(ExecuteThreadRunner.scala:233)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.$anonfun$executeInternal$1$adapted(ExecuteThreadRunner.scala:169)
    at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withSession$2(SessionHolder.scala:350)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1175)
    at org.apache.spark.sql.connect.service.SessionHolder.$anonfun$withSession$1(SessionHolder.scala:350)
    at org.apache.spark.JobArtifactSet$.withActiveJobArtifactState(JobArtifactSet.scala:97)
    at org.apache.spark.sql.artifact.ArtifactManager.$anonfun$withResources$1(ArtifactManager.scala:84)
    at org.apache.spark.util.Utils$.withContextClassLoader(Utils.scala:234)
    at org.apache.spark.sql.artifact.ArtifactManager.withResources(ArtifactManager.scala:83)
    at org.apache.spark.sql.connect.service.SessionHolder.withSession(SessionHolder.scala:349)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.executeInternal(ExecuteThreadRunner.scala:169)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner.org$apache$spark$sql$connect$execution$ExecuteThreadRunner$$execute(ExecuteThreadRunner.scala:119)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner$ExecutionThread.$anonfun$run$1(ExecuteThreadRunner.scala:511)
    at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:45)
    at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:103)
    at com.databricks.unity.HandleImpl.$anonfun$runWithAndClose$1(UCSHandle.scala:108)
    at scala.util.Using$.resource(Using.scala:269)
    at com.databricks.unity.HandleImpl.runWithAndClose(UCSHandle.scala:107)
    at org.apache.spark.sql.connect.execution.ExecuteThreadRunner$ExecutionThread.run(ExecuteThreadRunner.scala:510)
07:17 INFO [databricks.labs.blueprint.parallel:migrate_external_tables_sync] listing grants for hive_metastore 4/4, rps: 0.130/sec
07:17 INFO [databricks.labs.blueprint.parallel:migrate_external_tables_sync] Finished 'listing grants for hive_metastore' tasks: 100% results available (4/4). Took 0:00:31.637583
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.grants] found 1 new records for grants
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.groups] fetching groups inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.groups
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.groups] crawling new batch for groups
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_external_tables_sync] Listing workspace groups (resource_type=WorkspaceGroup) with id,displayName,meta,externalId,members,roles,entitlements...
07:17 DEBUG [databricks.sdk:migrate_external_tables_sync] GET /api/2.0/preview/scim/v2/Groups?attributes=id,displayName,meta,externalId,roles,entitlements&startIndex=1&count=100
< 200 OK
< {
< "Resources": [
< {
< "displayName": "role.labs.tempo.write",
< "externalId": "8d2db608-4ed1-49f2-ad5c-fe942be7a4e1",
< "id": "22190446071900",
< "meta": {
< "resourceType": "Group"
< }
< },
< "... (58 additional elements)"
< ],
< "itemsPerPage": 59,
< "schemas": [
< "urn:ietf:params:scim:api:messages:2.0:ListResponse"
< ],
< "startIndex": 1,
< "totalResults": 59
< }
07:17 DEBUG [databricks.sdk:migrate_external_tables_sync] GET /api/2.0/preview/scim/v2/Groups?attributes=id,displayName,meta,externalId,roles,entitlements&startIndex=60&count=100
< 200 OK
< {
< "itemsPerPage": 0,
< "schemas": [
< "urn:ietf:params:scim:api:messages:2.0:ListResponse"
< ],
< "startIndex": 60,
< "totalResults": 59
< }
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_external_tables_sync] Found 0 WorkspaceGroup
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_external_tables_sync] Listing account groups with id,displayName,externalId...
07:17 DEBUG [databricks.sdk:migrate_external_tables_sync] GET /api/2.0/account/scim/v2/Groups?attributes=id,displayName,externalId
< 200 OK
< {
< "Resources": [
< {
< "displayName": "ucx_EMQk",
< "id": "747915403144"
< },
< {
< "displayName": "rename-LFcF-ucx_GlGZb",
< "id": "839447153138"
< },
< "... (2192 additional elements)"
< ],
< "itemsPerPage": 2194,
< "schemas": [
< "urn:ietf:params:scim:api:messages:2.0:ListResponse"
< ],
< "startIndex": 1,
< "totalResults": 2194
< }
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_external_tables_sync] Found 2193 account groups
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_external_tables_sync] No group listing provided, all matching groups will be migrated
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.groups] found 0 new records for groups
07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.groups (id_in_workspace STRING NOT NULL, nam... (179 more bytes)
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.tables] fetching tables inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.tables
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.tables] crawling new batch for tables
07:17 DEBUG [databricks.labs.ucx.hive_metastore.tables:migrate_external_tables_sync] [hive_metastore.migrate_jnvkp] listing tables
07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SHOW TABLES FROM hive_metastore.migrate_jnvkp
07:17 WARNING [databricks.labs.ucx.hive_metastore.tables:migrate_external_tables_sync] Schema hive_metastore.migrate_jnvkp no longer existed
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.tables] found 0 new records for tables
07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.tables (catalog STRING NOT NULL, database ST...
(222 more bytes)
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.mounts] fetching mounts inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_external_tables_sync] [spark][fetch] SELECT * FROM ucx_soysi.mounts
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_external_tables_sync] [hive_metastore.ucx_soysi.mounts] crawling new batch for mounts
07:17 ERROR [databricks.labs.ucx:migrate_external_tables_sync] Execute `databricks workspace export //Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/logs/migrate-tables/run-786487235418832-0/migrate_external_tables_sync.log` locally to troubleshoot with more details. An error occurred while calling o431.mounts.
Trace:
py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore
    at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473)
    at py4j.Gateway.invoke(Gateway.java:305)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)
    at py4j.ClientServerConnection.run(ClientServerConnection.java:119)
    at java.lang.Thread.run(Thread.java:750)
07:17 DEBUG [databricks:migrate_external_tables_sync] Task crash details
Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-539676c8-28e2-41ee-b2a3-49961882350f/lib/python3.11/site-packages/databricks/labs/ucx/runtime.py", line 96, in trigger
    current_task(ctx)
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-539676c8-28e2-41ee-b2a3-49961882350f/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/workflows.py", line 16, in migrate_external_tables_sync
    ctx.tables_migrator.migrate_tables(
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-539676c8-28e2-41ee-b2a3-49961882350f/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/table_migrate.py", line 80, in migrate_tables
    all_principal_grants = None if acl_strategy is None else self._principal_grants.get_interactive_cluster_grants()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-539676c8-28e2-41ee-b2a3-49961882350f/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/grants.py", line 533, in get_interactive_cluster_grants
    mounts = list(self._mounts_crawler.snapshot())
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-539676c8-28e2-41ee-b2a3-49961882350f/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/locations.py", line 252, in snapshot
    return self._snapshot(self._try_fetch, self._list_mounts)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-539676c8-28e2-41ee-b2a3-49961882350f/lib/python3.11/site-packages/databricks/labs/ucx/framework/crawlers.py", line 116, in _snapshot
    loaded_records = list(loader())
    ^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-539676c8-28e2-41ee-b2a3-49961882350f/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/locations.py", line 247, in _list_mounts
    for mount_point, source, _ in self._dbutils.fs.mounts():
    ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/databricks/python_shell/dbruntime/dbutils.py", line 362, in f_with_exception_handling
    return f(*args, **kwargs)
    ^^^^^^^^^^^^^^^^^^
  File "/databricks/python_shell/dbruntime/dbutils.py", line 497, in mounts
    self.print_return(self.dbcore.mounts()), MountInfo.create_from_jschema)
    ^^^^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1355, in __call__
    return_value = get_return_value(
    ^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/pyspark/errors/exceptions/captured.py", line 224, in deco
    return f(*a, **kw)
    ^^^^^^^^^^^
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 330, in get_return_value
    raise Py4JError(
py4j.protocol.Py4JError: An error occurred while calling o431.mounts.
Trace:
py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore
    at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473)
    at py4j.Gateway.invoke(Gateway.java:305)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)
    at py4j.ClientServerConnection.run(ClientServerConnection.java:119)
    at java.lang.Thread.run(Thread.java:750)
07:17 INFO [databricks.labs.blueprint.parallel:migrate_dbfs_root_non_delta_tables] listing grants for hive_metastore 4/4, rps: 0.127/sec
07:17 INFO [databricks.labs.blueprint.parallel:migrate_dbfs_root_non_delta_tables] Finished 'listing grants for hive_metastore' tasks: 100% results available (4/4).
Took 0:00:32.088997
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.grants] found 1 new records for grants
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.groups] fetching groups inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.groups
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.groups] crawling new batch for groups
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_non_delta_tables] Listing workspace groups (resource_type=WorkspaceGroup) with id,displayName,meta,externalId,members,roles,entitlements...
07:17 DEBUG [databricks.sdk:migrate_dbfs_root_non_delta_tables] GET /api/2.0/preview/scim/v2/Groups?attributes=id,displayName,meta,externalId,roles,entitlements&startIndex=1&count=100
< 200 OK
< {
< "Resources": [
< {
< "displayName": "role.labs.tempo.write",
< "externalId": "8d2db608-4ed1-49f2-ad5c-fe942be7a4e1",
< "id": "22190446071900",
< "meta": {
< "resourceType": "Group"
< }
< },
< "... (58 additional elements)"
< ],
< "itemsPerPage": 59,
< "schemas": [
< "urn:ietf:params:scim:api:messages:2.0:ListResponse"
< ],
< "startIndex": 1,
< "totalResults": 59
< }
07:17 DEBUG [databricks.sdk:migrate_dbfs_root_non_delta_tables] GET /api/2.0/preview/scim/v2/Groups?attributes=id,displayName,meta,externalId,roles,entitlements&startIndex=60&count=100
< 200 OK
< {
< "itemsPerPage": 0,
< "schemas": [
< "urn:ietf:params:scim:api:messages:2.0:ListResponse"
< ],
< "startIndex": 60,
< "totalResults": 59
< }
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_non_delta_tables] Found 0 WorkspaceGroup
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_non_delta_tables] Listing account groups with id,displayName,externalId...
07:17 DEBUG [databricks.sdk:migrate_dbfs_root_non_delta_tables] GET /api/2.0/account/scim/v2/Groups?attributes=id,displayName,externalId
< 200 OK
< {
< "Resources": [
< {
< "displayName": "ucx_EMQk",
< "id": "747915403144"
< },
< {
< "displayName": "rename-LFcF-ucx_GlGZb",
< "id": "839447153138"
< },
< "... (2192 additional elements)"
< ],
< "itemsPerPage": 2194,
< "schemas": [
< "urn:ietf:params:scim:api:messages:2.0:ListResponse"
< ],
< "startIndex": 1,
< "totalResults": 2194
< }
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_non_delta_tables] Found 2193 account groups
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_non_delta_tables] No group listing provided, all matching groups will be migrated
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.groups] found 0 new records for groups
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.groups (id_in_workspace STRING NOT NULL, nam...
(179 more bytes)
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.tables] fetching tables inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.tables
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.tables] crawling new batch for tables
07:17 DEBUG [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_non_delta_tables] [hive_metastore.migrate_jnvkp] listing tables
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SHOW TABLES FROM hive_metastore.migrate_jnvkp
07:17 WARNING [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_non_delta_tables] Schema hive_metastore.migrate_jnvkp no longer existed
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.tables] found 0 new records for tables
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.tables (catalog STRING NOT NULL, database ST... (222 more bytes)
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.mounts] fetching mounts inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_non_delta_tables] [spark][fetch] SELECT * FROM ucx_soysi.mounts
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_non_delta_tables] [hive_metastore.ucx_soysi.mounts] crawling new batch for mounts
07:17 ERROR [databricks.labs.ucx:migrate_dbfs_root_non_delta_tables] Execute `databricks workspace export //Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/logs/migrate-tables/run-786487235418832-0/migrate_dbfs_root_non_delta_tables.log` locally to troubleshoot with more details. An error occurred while calling o431.mounts.
Trace:
py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore
    at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473)
    at py4j.Gateway.invoke(Gateway.java:305)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)
    at py4j.ClientServerConnection.run(ClientServerConnection.java:119)
    at java.lang.Thread.run(Thread.java:750)
07:17 DEBUG [databricks:migrate_dbfs_root_non_delta_tables] Task crash details
Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-c9a7f46a-ac90-467f-a4e5-775c32215269/lib/python3.11/site-packages/databricks/labs/ucx/runtime.py", line 96, in trigger
    current_task(ctx)
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-c9a7f46a-ac90-467f-a4e5-775c32215269/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/workflows.py", line 42, in migrate_dbfs_root_non_delta_tables
    ctx.tables_migrator.migrate_tables(
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-c9a7f46a-ac90-467f-a4e5-775c32215269/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/table_migrate.py", line 80, in migrate_tables
    all_principal_grants = None if acl_strategy is None else self._principal_grants.get_interactive_cluster_grants()
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-c9a7f46a-ac90-467f-a4e5-775c32215269/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/grants.py", line 533, in get_interactive_cluster_grants
    mounts = list(self._mounts_crawler.snapshot())
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-c9a7f46a-ac90-467f-a4e5-775c32215269/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/locations.py", line 252, in snapshot
    return self._snapshot(self._try_fetch, self._list_mounts)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-c9a7f46a-ac90-467f-a4e5-775c32215269/lib/python3.11/site-packages/databricks/labs/ucx/framework/crawlers.py", line 116, in _snapshot
    loaded_records = list(loader())
    ^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-c9a7f46a-ac90-467f-a4e5-775c32215269/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/locations.py", line 247, in _list_mounts
    for mount_point, source, _ in self._dbutils.fs.mounts():
    ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/databricks/python_shell/dbruntime/dbutils.py", line 362, in f_with_exception_handling
    return f(*args, **kwargs)
    ^^^^^^^^^^^^^^^^^^
  File "/databricks/python_shell/dbruntime/dbutils.py", line 497, in mounts
    self.print_return(self.dbcore.mounts()), MountInfo.create_from_jschema)
    ^^^^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1355, in __call__
    return_value = get_return_value(
    ^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/pyspark/errors/exceptions/captured.py", line 224, in deco
    return f(*a, **kw)
    ^^^^^^^^^^^
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 330, in get_return_value
    raise Py4JError(
py4j.protocol.Py4JError: An error occurred while calling o431.mounts.
Trace:
py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore
    at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473)
    at py4j.Gateway.invoke(Gateway.java:305)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)
    at py4j.ClientServerConnection.run(ClientServerConnection.java:119)
    at java.lang.Thread.run(Thread.java:750)
07:17 INFO [databricks.labs.blueprint.parallel:migrate_dbfs_root_delta_tables] listing grants for hive_metastore 4/4, rps: 0.128/sec
07:17 INFO [databricks.labs.blueprint.parallel:migrate_dbfs_root_delta_tables] Finished 'listing grants for hive_metastore' tasks: 100% results available (4/4). Took 0:00:32.160964
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.grants] found 1 new records for grants
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.groups] fetching groups inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.groups
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.groups] crawling new batch for groups
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_delta_tables] Listing workspace groups (resource_type=WorkspaceGroup) with id,displayName,meta,externalId,members,roles,entitlements...
07:17 DEBUG [databricks.sdk:migrate_dbfs_root_delta_tables] GET /api/2.0/preview/scim/v2/Groups?attributes=id,displayName,meta,externalId,roles,entitlements&startIndex=1&count=100
< 200 OK
< {
< "Resources": [
< {
< "displayName": "role.labs.tempo.write",
< "externalId": "8d2db608-4ed1-49f2-ad5c-fe942be7a4e1",
< "id": "22190446071900",
< "meta": {
< "resourceType": "Group"
< }
< },
< "... (58 additional elements)"
< ],
< "itemsPerPage": 59,
< "schemas": [
< "urn:ietf:params:scim:api:messages:2.0:ListResponse"
< ],
< "startIndex": 1,
< "totalResults": 59
< }
07:17 DEBUG [databricks.sdk:migrate_dbfs_root_delta_tables] GET /api/2.0/preview/scim/v2/Groups?attributes=id,displayName,meta,externalId,roles,entitlements&startIndex=60&count=100
< 200 OK
< {
< "itemsPerPage": 0,
< "schemas": [
< "urn:ietf:params:scim:api:messages:2.0:ListResponse"
< ],
< "startIndex": 60,
< "totalResults": 59
< }
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_delta_tables] Found 0 WorkspaceGroup
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_delta_tables] Listing account groups with id,displayName,externalId...
07:17 DEBUG [databricks.sdk:migrate_dbfs_root_delta_tables] GET /api/2.0/account/scim/v2/Groups?attributes=id,displayName,externalId
< 200 OK
< {
< "Resources": [
< {
< "displayName": "ucx_EMQk",
< "id": "747915403144"
< },
< {
< "displayName": "rename-LFcF-ucx_GlGZb",
< "id": "839447153138"
< },
< "... (2192 additional elements)"
< ],
< "itemsPerPage": 2194,
< "schemas": [
< "urn:ietf:params:scim:api:messages:2.0:ListResponse"
< ],
< "startIndex": 1,
< "totalResults": 2194
< }
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_delta_tables] Found 2193 account groups
07:17 INFO [databricks.labs.ucx.workspace_access.groups:migrate_dbfs_root_delta_tables] No group listing provided, all matching groups will be migrated
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.groups] found 0 new records for groups
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.groups (id_in_workspace STRING NOT NULL, nam... (179 more bytes)
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.tables] fetching tables inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SELECT * FROM hive_metastore.ucx_soysi.tables
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.tables] crawling new batch for tables
07:17 DEBUG [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_delta_tables] [hive_metastore.migrate_jnvkp] listing tables
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SHOW TABLES FROM hive_metastore.migrate_jnvkp
07:17 WARNING [databricks.labs.ucx.hive_metastore.tables:migrate_dbfs_root_delta_tables] Schema hive_metastore.migrate_jnvkp no longer existed
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.tables] found 0 new records for tables
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][execute] CREATE TABLE IF NOT EXISTS hive_metastore.ucx_soysi.tables (catalog STRING NOT NULL, database ST...
(222 more bytes)
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.mounts] fetching mounts inventory
07:17 DEBUG [databricks.labs.lsql.backends:migrate_dbfs_root_delta_tables] [spark][fetch] SELECT * FROM ucx_soysi.mounts
07:17 DEBUG [databricks.labs.ucx.framework.crawlers:migrate_dbfs_root_delta_tables] [hive_metastore.ucx_soysi.mounts] crawling new batch for mounts
07:17 ERROR [databricks.labs.ucx:migrate_dbfs_root_delta_tables] Execute `databricks workspace export //Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.SVtI/logs/migrate-tables/run-786487235418832-0/migrate_dbfs_root_delta_tables.log` locally to troubleshoot with more details. An error occurred while calling o431.mounts.
Trace: py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore
    at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473)
    at py4j.Gateway.invoke(Gateway.java:305)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)
    at py4j.ClientServerConnection.run(ClientServerConnection.java:119)
    at java.lang.Thread.run(Thread.java:750)
07:17 DEBUG [databricks:migrate_dbfs_root_delta_tables] Task crash details
Traceback (most recent call last):
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-62f71e94-0f44-488a-88e0-bdc7e54d4bdd/lib/python3.11/site-packages/databricks/labs/ucx/runtime.py", line 96, in trigger
    current_task(ctx)
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-62f71e94-0f44-488a-88e0-bdc7e54d4bdd/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/workflows.py", line 29, in migrate_dbfs_root_delta_tables
    ctx.tables_migrator.migrate_tables(
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-62f71e94-0f44-488a-88e0-bdc7e54d4bdd/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/table_migrate.py", line 80, in migrate_tables
    all_principal_grants = None if acl_strategy is None else self._principal_grants.get_interactive_cluster_grants()
                                                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-62f71e94-0f44-488a-88e0-bdc7e54d4bdd/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/grants.py", line 533, in get_interactive_cluster_grants
    mounts = list(self._mounts_crawler.snapshot())
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-62f71e94-0f44-488a-88e0-bdc7e54d4bdd/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/locations.py", line 252, in snapshot
    return self._snapshot(self._try_fetch, self._list_mounts)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-62f71e94-0f44-488a-88e0-bdc7e54d4bdd/lib/python3.11/site-packages/databricks/labs/ucx/framework/crawlers.py", line 116, in _snapshot
    loaded_records = list(loader())
                     ^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/envs/pythonEnv-62f71e94-0f44-488a-88e0-bdc7e54d4bdd/lib/python3.11/site-packages/databricks/labs/ucx/hive_metastore/locations.py", line 247, in _list_mounts
    for mount_point, source, _ in self._dbutils.fs.mounts():
                                  ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/databricks/python_shell/dbruntime/dbutils.py", line 362, in f_with_exception_handling
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/databricks/python_shell/dbruntime/dbutils.py", line 497, in mounts
    self.print_return(self.dbcore.mounts()), MountInfo.create_from_jschema)
                      ^^^^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1355, in __call__
    return_value = get_return_value(
                   ^^^^^^^^^^^^^^^^^
  File "/databricks/spark/python/pyspark/errors/exceptions/captured.py", line 224, in deco
    return f(*a, **kw)
           ^^^^^^^^^^^
  File "/databricks/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 330, in get_return_value
    raise Py4JError(
py4j.protocol.Py4JError: An error occurred while calling o431.mounts.
Trace: py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks.backend.daemon.dbutils.DBUtilsCore.mounts() is not whitelisted on class class com.databricks.backend.daemon.dbutils.DBUtilsCore
    at py4j.security.WhitelistingPy4JSecurityManager.checkCall(WhitelistingPy4JSecurityManager.java:473)
    at py4j.Gateway.invoke(Gateway.java:305)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:199)
    at py4j.ClientServerConnection.run(ClientServerConnection.java:119)
    at java.lang.Thread.run(Thread.java:750)
07:17 INFO [databricks.labs.ucx.installer.workflows] ---------- END REMOTE LOGS ----------
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 make_storage_dir fixtures
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 1 make_dbfs_data_copy fixtures
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] removing make_dbfs_data_copy fixture: dbfs:/mnt/TEST_MOUNT_NAME/a/b/ceyw
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 cluster fixtures
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 1 catalog fixtures
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] removing catalog fixture: CatalogInfo(browse_only=False, catalog_type=, comment='', connection_name=None, created_at=1716966219661, created_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_csp3b', isolation_mode=, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='ucx_csp3b',
options=None, owner='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', properties=None, provider_name=None, provisioning_info=None, securable_kind=, securable_type='CATALOG', share_name=None, storage_location=None, storage_root=None, updated_at=1716966219661, updated_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d') 07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] ignoring error while catalog CatalogInfo(browse_only=False, catalog_type=, comment='', connection_name=None, created_at=1716966219661, created_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_csp3b', isolation_mode=, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='ucx_csp3b', options=None, owner='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', properties=None, provider_name=None, provisioning_info=None, securable_kind=, securable_type='CATALOG', share_name=None, storage_location=None, storage_root=None, updated_at=1716966219661, updated_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d') teardown: Catalog 'ucx_csp3b' does not exist. 07:17 INFO [databricks.labs.ucx.install] Deleting UCX v0.24.1+520240529071241 from https://DATABRICKS_HOST 07:17 INFO [databricks.labs.ucx.install] Deleting inventory database ucx_soysi 07:17 INFO [databricks.labs.ucx.install] Deleting jobs 07:17 INFO [databricks.labs.ucx.install] Deleting migrate-tables-in-mounts-experimental job_id=1116081221292417. 07:17 INFO [databricks.labs.ucx.install] Deleting migrate-external-hiveserde-tables-in-place-experimental job_id=1061208560613649. 07:17 INFO [databricks.labs.ucx.install] Deleting validate-groups-permissions job_id=675072411831628. 07:17 INFO [databricks.labs.ucx.install] Deleting migrate-external-tables-ctas job_id=146694479273217. 07:17 INFO [databricks.labs.ucx.install] Deleting migrate-groups-experimental job_id=743975846745174. 07:17 INFO [databricks.labs.ucx.install] Deleting migrate-groups job_id=474442253431598. 
07:17 INFO [databricks.labs.ucx.install] Deleting migrate-tables job_id=353371736950474.
07:17 INFO [databricks.labs.ucx.install] Deleting remove-workspace-local-backup-groups job_id=633297211577913.
07:17 INFO [databricks.labs.ucx.install] Deleting scan-tables-in-mounts-experimental job_id=517348894816040.
07:17 INFO [databricks.labs.ucx.install] Deleting failing job_id=627783054808625.
07:17 INFO [databricks.labs.ucx.install] Deleting assessment job_id=1095034608926736.
07:17 INFO [databricks.labs.ucx.install] Deleting experimental-workflow-linter job_id=682298746054857.
07:17 INFO [databricks.labs.ucx.install] Deleting migrate-data-reconciliation job_id=375797917960136.
07:17 INFO [databricks.labs.ucx.install] Deleting cluster policy
07:17 ERROR [databricks.labs.ucx.install] UCX Policy already deleted
07:17 INFO [databricks.labs.ucx.install] Deleting secret scope
07:17 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 workspace user fixtures
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 account group fixtures
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 workspace group fixtures
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 table fixtures
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 5 table fixtures
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_tpi07', metastore_id=None, name='ucx_tpi07', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp',
sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/migrate_jnvkp/ucx_tpi07', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_t3h9e', metastore_id=None, name='ucx_t3h9e', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location='dbfs:/tmp/ucx_test_gbYm', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_twraa', metastore_id=None, name='ucx_twraa', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location='dbfs:/mnt/TEST_MOUNT_NAME/a/b/ceyw', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None)
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_t14l3', metastore_id=None, name='ucx_t14l3', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_jnvkp.ucx_tpi07', view_dependencies=None)
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_jnvkp.ucx_t7mog', metastore_id=None, name='ucx_t7mog', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024052908'}, row_filter=None, schema_name='migrate_jnvkp', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_jnvkp.ucx_t14l3', view_dependencies=None)
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 3 schema fixtures
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.migrate_jnvkp', metastore_id=None, name='migrate_jnvkp', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='ucx_csp3b', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_csp3b.migrate_jnvkp', metastore_id=None, name='migrate_jnvkp', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
07:17 DEBUG [databricks.labs.ucx.mixins.fixtures] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.ucx_soysi', metastore_id=None, name='ucx_soysi', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
[gw4] linux -- Python 3.10.14 /home/runner/work/ucx/ucx/.venv/bin/python
```
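The root cause in the logs above is that `dbutils.fs.mounts()` is rejected by the Py4J security whitelist, so the mounts crawler raises `Py4JError` and the whole `migrate-tables` run aborts before any migration status is written. A minimal sketch of how such a call could be guarded; `DbUtilsStub` and `safe_list_mounts` are hypothetical names for illustration only, not part of UCX:

```python
# Sketch: tolerate the Py4J whitelist error seen above instead of crashing.
# DbUtilsStub stands in for the real `dbutils` object on a Databricks cluster.
import logging

logger = logging.getLogger("mounts")


class DbUtilsStub:
    """Simulates dbutils on a cluster where DBUtilsCore.mounts() is not whitelisted."""

    class fs:
        @staticmethod
        def mounts():
            # Mimics py4j.protocol.Py4JError raised through the Py4J security manager
            raise RuntimeError(
                "Method ... DBUtilsCore.mounts() is not whitelisted on class ..."
            )


def safe_list_mounts(dbutils) -> list:
    """Return the mount list, or an empty list when Py4J blocks the call."""
    try:
        return list(dbutils.fs.mounts())
    except Exception as e:
        if "not whitelisted" in str(e):
            logger.warning("dbutils.fs.mounts() blocked by Py4J whitelist: %s", e)
            return []
        raise  # unrelated failures still propagate


print(safe_list_mounts(DbUtilsStub()))  # → []
```

Whether UCX should skip mounts or fail fast here is a design decision for the fix; the sketch only shows that the failure mode is detectable from the exception message.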

Running from nightly #75

github-actions[bot] commented 5 months ago
❌ test_table_migration_job_refreshes_migration_status[regular-migrate-tables]: databricks.sdk.errors.platform.InvalidParameterValue: Run 764948871968995 does not exist. (5m13.464s) ``` databricks.sdk.errors.platform.InvalidParameterValue: Run 764948871968995 does not exist. 07:05 DEBUG [databricks.labs.ucx.mixins.fixtures] added make_dbfs_data_copy fixture: dbfs:/mnt/TEST_MOUNT_NAME/a/b/Ze7y 07:05 INFO [databricks.labs.ucx.mixins.fixtures] Schema hive_metastore.migrate_xqvxu: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_xqvxu 07:05 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.migrate_xqvxu', metastore_id=None, name='migrate_xqvxu', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) 07:05 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_xqvxu.ucx_ti9wt: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_xqvxu/ucx_ti9wt 07:05 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_xqvxu.ucx_ti9wt', metastore_id=None, name='ucx_ti9wt', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024053008'}, row_filter=None, schema_name='migrate_xqvxu', sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/migrate_xqvxu/ucx_ti9wt', table_constraints=None, table_id=None, 
table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:05 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_xqvxu.ucx_txnkz: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_xqvxu/ucx_txnkz 07:05 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_xqvxu.ucx_txnkz', metastore_id=None, name='ucx_txnkz', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024053008'}, row_filter=None, schema_name='migrate_xqvxu', sql_path=None, storage_credential_name=None, storage_location='dbfs:/tmp/ucx_test_1FIB', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:05 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_xqvxu.ucx_ttgdr: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_xqvxu/ucx_ttgdr 07:05 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_xqvxu.ucx_ttgdr', metastore_id=None, name='ucx_ttgdr', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024053008'}, row_filter=None, schema_name='migrate_xqvxu', sql_path=None, storage_credential_name=None, 
storage_location='dbfs:/mnt/TEST_MOUNT_NAME/a/b/Ze7y', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:05 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_xqvxu.ucx_t3ava: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_xqvxu/ucx_t3ava 07:05 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_xqvxu.ucx_t3ava', metastore_id=None, name='ucx_t3ava', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024053008'}, row_filter=None, schema_name='migrate_xqvxu', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_xqvxu.ucx_ti9wt', view_dependencies=None) 07:05 INFO [databricks.labs.ucx.mixins.fixtures] Table hive_metastore.migrate_xqvxu.ucx_tagcx: https://DATABRICKS_HOST/explore/data/hive_metastore/migrate_xqvxu/ucx_tagcx 07:05 DEBUG [databricks.labs.ucx.mixins.fixtures] added table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_xqvxu.ucx_tagcx', metastore_id=None, name='ucx_tagcx', owner=None, pipeline_id=None, properties={'RemoveAfter': 
'2024053008'}, row_filter=None, schema_name='migrate_xqvxu', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_xqvxu.ucx_t3ava', view_dependencies=None) 07:05 DEBUG [databricks.labs.ucx.mixins.fixtures] added catalog fixture: CatalogInfo(browse_only=False, catalog_type=, comment='', connection_name=None, created_at=1717052720344, created_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cu0oo', isolation_mode=, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='ucx_cu0oo', options=None, owner='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', properties=None, provider_name=None, provisioning_info=None, securable_kind=, securable_type='CATALOG', share_name=None, storage_location=None, storage_root=None, updated_at=1717052720344, updated_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d') 07:05 INFO [databricks.labs.ucx.mixins.fixtures] Schema ucx_cu0oo.migrate_xqvxu: https://DATABRICKS_HOST/explore/data/ucx_cu0oo/migrate_xqvxu 07:05 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='ucx_cu0oo', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cu0oo.migrate_xqvxu', metastore_id=None, name='migrate_xqvxu', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) 07:05 INFO [databricks.labs.ucx.mixins.fixtures] Schema hive_metastore.ucx_sm2og: https://DATABRICKS_HOST/explore/data/hive_metastore/ucx_sm2og 07:05 DEBUG [databricks.labs.ucx.mixins.fixtures] added schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, 
effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.ucx_sm2og', metastore_id=None, name='ucx_sm2og', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None)
[gw3] linux -- Python 3.10.14 /home/runner/work/ucx/ucx/.venv/bin/python
07:05 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Uj89/config.yml) doesn't exist.
07:05 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
07:05 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
07:05 INFO [databricks.labs.ucx.install] Fetching installations...
07:05 WARNING [databricks.labs.ucx.install] Existing installation at /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Uj89 is corrupted. Skipping...
07:05 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
07:05 DEBUG [tests.integration.conftest] Waiting for clusters to start...
07:05 DEBUG [tests.integration.conftest] Waiting for clusters to start...
07:05 INFO [databricks.labs.ucx.install] Installing UCX v0.24.1+1020240530070548
07:05 INFO [databricks.labs.ucx.install] Creating ucx schemas...
07:06 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-experimental 07:06 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups 07:06 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment 07:06 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental 07:06 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas 07:06 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing 07:06 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=experimental-workflow-linter 07:06 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables 07:06 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation 07:06 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups 07:06 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental 07:06 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions 07:06 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental 07:06 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Uj89/README for the next steps. 
07:06 DEBUG [databricks.labs.ucx.installer.workflows] starting migrate-tables job: https://DATABRICKS_HOST#job/15254411028641
07:09 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 make_storage_dir fixtures 07:09 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 1 make_dbfs_data_copy fixtures 07:09 DEBUG [databricks.labs.ucx.mixins.fixtures] removing make_dbfs_data_copy fixture: dbfs:/mnt/TEST_MOUNT_NAME/a/b/Ze7y 07:09 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 cluster fixtures 07:09 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 1 catalog fixtures 07:09 DEBUG [databricks.labs.ucx.mixins.fixtures] removing catalog fixture: CatalogInfo(browse_only=False, catalog_type=, comment='', connection_name=None, created_at=1717052720344, created_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cu0oo', isolation_mode=, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='ucx_cu0oo', options=None, owner='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', properties=None, provider_name=None, provisioning_info=None, securable_kind=, securable_type='CATALOG', share_name=None, storage_location=None, storage_root=None, updated_at=1717052720344, updated_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d') 07:09 DEBUG [databricks.labs.ucx.mixins.fixtures] ignoring error while catalog CatalogInfo(browse_only=False, catalog_type=, comment='', connection_name=None, created_at=1717052720344, created_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cu0oo', isolation_mode=, metastore_id='8952c1e3-b265-4adf-98c3-6f755e2e1453', name='ucx_cu0oo', options=None, owner='0a330eb5-dd51-4d97-b6e4-c474356b1d5d', properties=None, provider_name=None, provisioning_info=None, securable_kind=, securable_type='CATALOG', share_name=None, storage_location=None, storage_root=None, updated_at=1717052720344, 
updated_by='0a330eb5-dd51-4d97-b6e4-c474356b1d5d') teardown: Catalog 'ucx_cu0oo' does not exist. 07:09 INFO [databricks.labs.ucx.install] Deleting UCX v0.24.1+1020240530070548 from https://DATABRICKS_HOST 07:09 ERROR [databricks.labs.ucx.install] Check if /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.Uj89 is present 07:09 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 workspace user fixtures 07:09 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 account group fixtures 07:09 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 workspace group fixtures 07:09 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 0 table fixtures 07:09 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 5 table fixtures 07:09 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_xqvxu.ucx_ti9wt', metastore_id=None, name='ucx_ti9wt', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024053008'}, row_filter=None, schema_name='migrate_xqvxu', sql_path=None, storage_credential_name=None, storage_location='dbfs:/user/hive/warehouse/migrate_xqvxu/ucx_ti9wt', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:09 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, 
enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_xqvxu.ucx_txnkz', metastore_id=None, name='ucx_txnkz', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024053008'}, row_filter=None, schema_name='migrate_xqvxu', sql_path=None, storage_credential_name=None, storage_location='dbfs:/tmp/ucx_test_1FIB', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:10 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_xqvxu.ucx_ttgdr', metastore_id=None, name='ucx_ttgdr', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024053008'}, row_filter=None, schema_name='migrate_xqvxu', sql_path=None, storage_credential_name=None, storage_location='dbfs:/mnt/TEST_MOUNT_NAME/a/b/Ze7y', table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition=None, view_dependencies=None) 07:10 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_xqvxu.ucx_t3ava', metastore_id=None, name='ucx_t3ava', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024053008'}, row_filter=None, 
schema_name='migrate_xqvxu', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_xqvxu.ucx_ti9wt', view_dependencies=None) 07:10 DEBUG [databricks.labs.ucx.mixins.fixtures] ignoring error while table TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_xqvxu.ucx_t3ava', metastore_id=None, name='ucx_t3ava', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024053008'}, row_filter=None, schema_name='migrate_xqvxu', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_xqvxu.ucx_ti9wt', view_dependencies=None) teardown: [WRONG_COMMAND_FOR_OBJECT_TYPE] The operation DROP TABLE requires a EXTERNAL or MANAGED. But hive_metastore.migrate_xqvxu.ucx_t3ava is a VIEW. Use DROP VIEW instead. 
SQLSTATE: 42809 07:10 DEBUG [databricks.labs.ucx.mixins.fixtures] removing table fixture: TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_xqvxu.ucx_tagcx', metastore_id=None, name='ucx_tagcx', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024053008'}, row_filter=None, schema_name='migrate_xqvxu', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_xqvxu.ucx_t3ava', view_dependencies=None) 07:10 DEBUG [databricks.labs.ucx.mixins.fixtures] ignoring error while table TableInfo(access_point=None, browse_only=None, catalog_name='hive_metastore', columns=None, comment=None, created_at=None, created_by=None, data_access_configuration_id=None, data_source_format=None, deleted_at=None, delta_runtime_properties_kvpairs=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, encryption_details=None, full_name='hive_metastore.migrate_xqvxu.ucx_tagcx', metastore_id=None, name='ucx_tagcx', owner=None, pipeline_id=None, properties={'RemoveAfter': '2024053008'}, row_filter=None, schema_name='migrate_xqvxu', sql_path=None, storage_credential_name=None, storage_location=None, table_constraints=None, table_id=None, table_type=, updated_at=None, updated_by=None, view_definition='SELECT * FROM hive_metastore.migrate_xqvxu.ucx_t3ava', view_dependencies=None) teardown: [WRONG_COMMAND_FOR_OBJECT_TYPE] The operation DROP TABLE requires a EXTERNAL or MANAGED. But hive_metastore.migrate_xqvxu.ucx_tagcx is a VIEW. Use DROP VIEW instead. 
SQLSTATE: 42809 07:10 DEBUG [databricks.labs.ucx.mixins.fixtures] clearing 3 schema fixtures 07:10 DEBUG [databricks.labs.ucx.mixins.fixtures] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.migrate_xqvxu', metastore_id=None, name='migrate_xqvxu', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) 07:10 DEBUG [databricks.labs.ucx.mixins.fixtures] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='ucx_cu0oo', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='ucx_cu0oo.migrate_xqvxu', metastore_id=None, name='migrate_xqvxu', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) 07:10 DEBUG [databricks.labs.ucx.mixins.fixtures] removing schema fixture: SchemaInfo(browse_only=None, catalog_name='hive_metastore', catalog_type=None, comment=None, created_at=None, created_by=None, effective_predictive_optimization_flag=None, enable_predictive_optimization=None, full_name='hive_metastore.ucx_sm2og', metastore_id=None, name='ucx_sm2og', owner=None, properties=None, schema_id=None, storage_location=None, storage_root=None, updated_at=None, updated_by=None) [gw3] linux -- Python 3.10.14 /home/runner/work/ucx/ucx/.venv/bin/python ```

Running from nightly #76

ericvergnaud commented 5 months ago

Closing since https://github.com/databrickslabs/watchdog/pull/39 was released yesterday. Let's see whether the failure gets recreated.