Hi,
When using mounts (dbutils.fs.mount(...)), the plugin works perfectly, for example with spark.read.format("com.github.saurfang.sas.spark").load("/mnt/mysasfile.sas7bdat")
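For context, the working mount is set up roughly like this (a minimal sketch; the storage account, container, mount point, application id, tenant id, and secret scope names are placeholders, and dbutils/spark are the Databricks notebook globals):

```python
# Sketch of the mount-based setup that works (placeholder names throughout).
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="my-scope", key="sp-secret"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# Mount the ADLS Gen2 container with the service principal credentials.
dbutils.fs.mount(
    source="abfss://mycontainer@mystorageAcct.dfs.core.windows.net/",
    mount_point="/mnt/sasdata",
    extra_configs=configs,
)

# Reading the SAS file through the mount point works fine.
df = spark.read.format("com.github.saurfang.sas.spark").load("/mnt/sasdata/mysasfile.sas7bdat")
```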
But when using cluster-scoped or session-scoped service principal credentials without mounts, the plugin does not seem to pick them up, and I get the same error as if no SP credentials had been passed at all (pattern 4 here -> https://github.com/hurtn/datalake-ADLS-access-patterns-with-Databricks/blob/master/readme.md).
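The session-scoped configuration is set roughly like this (a sketch of pattern 4; storage account, application id, tenant id, and secret scope names are placeholders):

```python
# Sketch of the session-scoped service principal setup (pattern 4 in the linked repo).
# Run in a Databricks notebook; spark/dbutils are notebook globals, names are placeholders.
storage_account = "mystorageAcct"

spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
spark.conf.set(
    f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
)
spark.conf.set(
    f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net",
    "<application-id>",
)
spark.conf.set(
    f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net",
    dbutils.secrets.get(scope="my-scope", key="sp-secret"),
)
spark.conf.set(
    f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net",
    "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
)

# With this in place, spark.read on abfss:// paths works for other formats,
# but the sas7bdat read below fails.
```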
spark.read.format("com.github.saurfang.sas.spark").load("abfss://mycontainer@mystorageAcct.dfs.core.windows.net/mysasfile.sas7bdat")
KeyProviderException: Failure to initialize configuration
Caused by: InvalidConfigurationValueException: Invalid configuration value detected for fs.azure.account.key
Any possibility of having the plugin work with cluster-scoped or session-scoped service principal credentials (i.e. without mounts)?