I'm trying to read from and write to an Azure Data Lake Storage account using Databricks, but I'm getting the error below.
I'm a Contributor on the storage account, I'm using a SAS token, and credential passthrough is enabled on the cluster. Do I need to set up a service principal as well?
Thanks
Diego
StatusCode=403
StatusDescription=This request is not authorized to perform this operation using this permission.
ErrorCode=
ErrorMessage=
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:137)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.services.AbfsClient.getPathProperties(AbfsClient.java:396)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.getFileStatus(AzureBlobFileSystemStore.java:584)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.getFileStatus(AzureBlobFileSystem.java:437)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1426)
at com.microsoft.cdm.utils.CDMModelCommon.getManifest(CDMModelCommon.scala:266)
at com.microsoft.cdm.utils.CDMModelCommon.entityExists(CDMModelCommon.scala:274)
at com.microsoft.cdm.write.CDMDataSourceWriter.<init>(CDMDataSourceWriter.scala:73)
at com.microsoft.cdm.DefaultSource.createWriter(DefaultSource.scala:295)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:277)
at line3c4a85a5376c45d1b60a69529adb87bc44.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-551037868290482:16)
at line3c4a85a5376c45d1b60a69529adb87bc44.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-551037868290482:66)
at line3c4a85a5376c45d1b60a69529adb87bc44.$read$$iw$$iw$$iw$$iw.<init>(command-551037868290482:68)
at line3c4a85a5376c45d1b60a69529adb87bc44.$read$$iw$$iw$$iw.<init>(command-551037868290482:70)
at line3c4a85a5376c45d1b60a69529adb87bc44.$read$$iw$$iw.<init>(command-551037868290482:72)
at line3c4a85a5376c45d1b60a69529adb87bc44.$read$$iw.<init>(command-551037868290482:74)
at line3c4a85a5376c45d1b60a69529adb87bc44.$read.<init>(command-551037868290482:76)
at line3c4a85a5376c45d1b60a69529adb87bc44.$read$.<init>(command-551037868290482:80)
at line3c4a85a5376c45d1b60a69529adb87bc44.$read$.<clinit>(command-551037868290482)
at line3c4a85a5376c45d1b60a69529adb87bc44.$eval$.$print$lzycompute(<notebook>:7)
at line3c4a85a5376c45d1b60a69529adb87bc44.$eval$.$print(<notebook>:6)
at line3c4a85a5376c45d1b60a69529adb87bc44.$eval.$print(<notebook>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply$mcV$sp(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:714)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:667)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:396)
at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:373)
at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:238)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:233)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:49)
at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:275)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:49)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:373)
at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
at scala.util.Try$.apply(Try.scala:192)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:639)
at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:485)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:597)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:390)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
at java.lang.Thread.run(Thread.java:748)
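For reference, this is roughly how I'm setting the SAS token on the cluster before reading/writing (a sketch only — the storage account name and the secret scope/key are placeholders, not my real values):

```scala
// Sketch of SAS-based ABFS configuration in a Databricks Scala notebook.
// "mystorageaccount" and the secret scope/key names are placeholders.
val storageAccount = "mystorageaccount"
val sasToken = dbutils.secrets.get(scope = "my-scope", key = "my-sas-token")

// Tell the ABFS driver to authenticate with a fixed SAS token for this account.
spark.conf.set(s"fs.azure.account.auth.type.$storageAccount.dfs.core.windows.net", "SAS")
spark.conf.set(
  s"fs.azure.sas.token.provider.type.$storageAccount.dfs.core.windows.net",
  "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set(s"fs.azure.sas.fixed.token.$storageAccount.dfs.core.windows.net", sasToken)
```

The error happens afterwards on a path like `abfss://<container>@<account>.dfs.core.windows.net/...`.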