I have installed a jar library on a Databricks cluster, and since then I can no longer read data with the CDM connector.
Using this line of code
entity_df = (spark.read.format("com.microsoft.cdm")
    .option("storage", cdsStorageAccountName)
    .option("manifestPath", cdsContainer + manifest_path)
    .option("entity", table_name)
    .load())
display(entity_df)
throws this error:

java.lang.NoClassDefFoundError: org/apache/spark/sql/connector/catalog/SupportsCatalogOptions
Py4JJavaError Traceback (most recent call last)
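A NoClassDefFoundError on org/apache/spark/sql/connector/catalog/SupportsCatalogOptions typically indicates a version mismatch: SupportsCatalogOptions is a Spark 3.x DataSource V2 interface, so a CDM connector build compiled against a different Spark major version than the cluster runtime (or shadowed by the newly installed jar) can fail this way. One way to narrow it down is to check which installed jar actually bundles a given class. The helper below is an illustrative sketch (the jar path in the commented-out call is a hypothetical example, not taken from the question):

```python
import zipfile

def jar_contains_class(jar_path: str, class_name: str) -> bool:
    """Return True if the jar bundles the given class.

    Accepts a dot-separated class name (e.g. "org.example.Foo") and looks
    for the corresponding .class entry inside the jar, which is a zip file.
    """
    entry = class_name.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:
        return entry in jar.namelist()

# Hypothetical usage on a Databricks driver (path is an assumption):
# jar_contains_class(
#     "/dbfs/FileStore/jars/spark-cdm-connector.jar",
#     "org.apache.spark.sql.connector.catalog.SupportsCatalogOptions",
# )
```

Running this over the cluster's installed libraries shows whether the connector jar expects the Spark 3 interface that the runtime (or a conflicting jar) does not provide.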