Closed dynarch closed 2 years ago
You can't use credential passthrough in non-interactive mode (e.g. in a scheduled task), and that takes precedence over SP credentials provided in Spark config.
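As a workaround, a scheduled job can authenticate with a service principal through the standard Hadoop ABFS OAuth settings in Spark config instead of relying on passthrough. A minimal sketch, assuming a helper that builds the conf entries (the account, client, and tenant values are placeholders, not from this thread; in Databricks you would apply each pair with `spark.conf.set` and pull the secret from a secret scope via `dbutils.secrets.get`):

```python
def adls_oauth_conf(account: str, client_id: str,
                    client_secret: str, tenant_id: str) -> dict:
    """Build Spark conf entries for ADLS Gen2 service-principal (OAuth) auth.

    These are the standard Hadoop ABFS OAuth keys; apply each pair with
    spark.conf.set(key, value). All argument values here are placeholders.
    """
    suffix = f"{account}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        # In a real job, fetch the secret from a secret scope, e.g.
        # dbutils.secrets.get(scope="my-scope", key="sp-client-secret")
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }
```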
I have received a reply from the Databricks team; they confirmed that credential passthrough cannot be used in scheduled tasks, so the problem is in Databricks, not in the library.
Hello,
I have a problem with the CDM connector running in batch mode (as a workflow). When run manually it works with no errors, but when run as a scheduled task this part of the code
throws an error: com.databricks.backend.daemon.data.client.adl.AzureCredentialNotFoundException: Could not find ADLS Gen2 Token
Py4JJavaError Traceback (most recent call last)