Vosec closed this issue 6 months ago
I'm pointing to this source code as an example: `original_dataflow_df = self.spark.read.format("delta").load(dict_obj["silver_dataflowspec_path"])`. If the tables in UC are stored in an external location, then users need the READ FILES permission on that external location.
Could this be replaced with just `spark.sql("select ...")` against that table? Then there wouldn't be any permission issues or path configuration.
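For illustration, the swap being suggested could look like this. This is a minimal sketch: `dataflowspec_query` is a hypothetical helper (not part of dlt-meta), and the Spark calls appear only in comments since they need a live session.

```python
# Hypothetical sketch: read the dataflowspec by table name instead of by storage
# path, so no READ FILES grant on the external location is needed.
# `dict_obj` mirrors the onboarding config keys shown in this thread.

def dataflowspec_query(dict_obj, layer="silver"):
    """Return a SELECT over the dataflowspec *table* rather than its path."""
    table = dict_obj[f"{layer}_dataflowspec_table"]
    return f"SELECT * FROM {table}"

# With a SparkSession, the two approaches would be:
#   path-based (needs READ FILES on the external location):
#     df = spark.read.format("delta").load(dict_obj["silver_dataflowspec_path"])
#   table-based (only needs SELECT on the table):
#     df = spark.sql(dataflowspec_query(dict_obj))
```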
@Vosec dlt-meta does not support UC yet; we will be making a new release for UC support.
@ravi-databricks Thank you very much for the response. Do you have a rough estimate for the release?
What's the timeline on this getting added with support for UC?
As of now there is a working branch for [UC support](https://github.com/databrickslabs/dlt-meta/tree/feature/dlt-meta-uc). We plan to release the UC branch in the coming quarter.
Just an update on the UC feature: PR #28 will be merged to main soon.
UC support has been released to the main branch.
Hello,
if the dataflowspec tables and the bronze/silver tables are stored in a schema managed by UC, what is the reason to use/set paths for these tables? I'm unable to create dataflowspec tables if I want them stored in a UC-managed schema; if I store them in hive_metastore, it runs fine.
The main issue is that users don't know the paths of tables managed by UC; they should only need to reference the table names in a specific schema. Or am I missing something?
"silver_dataflowspec_table": "silver_dataflowspec_table", "silver_dataflowspec_path": "dbfs:/onboarding_tables_cdc/silver", "bronze_dataflowspec_table": "bronze_dataflowspec_table", "bronze_dataflowspec_path": "dbfs:/onboarding_tables_cdc/bronze",
Thanks.
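To illustrate the name-based addressing the question is asking about: in UC, a table is referenced by a three-level `catalog.schema.table` identifier rather than a storage path. A minimal sketch (the helper and all names below are hypothetical, for illustration only):

```python
# Hypothetical sketch: address UC tables by catalog.schema.table instead of a
# dbfs:/ path, so the *_path keys (and READ FILES grants) would not be needed.

def uc_table_name(catalog: str, schema: str, table: str) -> str:
    """Build the three-level Unity Catalog table identifier."""
    return f"{catalog}.{schema}.{table}"

# With a SparkSession (catalog/schema names made up for illustration):
#   df = spark.read.table(uc_table_name("main", "dlt_meta", "silver_dataflowspec_table"))
```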