solliancenet / microsoft-learning-paths-databricks-notebooks

Contains notebooks used in the Microsoft Azure Databricks Learning Paths modules.

MatchError: [Ljava.lang.String;@171b8379 (of class [Ljava.lang.String;) #12

Open bonesclarke opened 1 year ago

bonesclarke commented 1 year ago

Hello, I am trying to go through these notebooks for a course but keep getting a MatchError when running `%run "./Includes/Classroom-Setup"`.

Below is the full error:

```
MatchError: [Ljava.lang.String;@171b8379 (of class [Ljava.lang.String;)
	at $line670e943a5f5a4a6d83075abe7bb6491425.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-583985868174312:51)
	at $line670e943a5f5a4a6d83075abe7bb6491425.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-583985868174312:566)
	at $line670e943a5f5a4a6d83075abe7bb6491425.$read$$iw$$iw$$iw$$iw.<init>(command-583985868174312:568)
	at $line670e943a5f5a4a6d83075abe7bb6491425.$read$$iw$$iw$$iw.<init>(command-583985868174312:570)
	at $line670e943a5f5a4a6d83075abe7bb6491425.$read$$iw$$iw.<init>(command-583985868174312:572)
	at $line670e943a5f5a4a6d83075abe7bb6491425.$read$$iw.<init>(command-583985868174312:574)
	at $line670e943a5f5a4a6d83075abe7bb6491425.$read.<init>(command-583985868174312:576)
	at $line670e943a5f5a4a6d83075abe7bb6491425.$read$.<init>(command-583985868174312:580)
	at $line670e943a5f5a4a6d83075abe7bb6491425.$read$.<clinit>(command-583985868174312)
	at $line670e943a5f5a4a6d83075abe7bb6491425.$eval$.$print$lzycompute(<notebook>:7)
	at $line670e943a5f5a4a6d83075abe7bb6491425.$eval$.$print(<notebook>:6)
	at $line670e943a5f5a4a6d83075abe7bb6491425.$eval.$print(<notebook>)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
	at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
	at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:568)
	at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
	at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
	at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
	at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:567)
	at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
	at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
	at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:223)
	at com.databricks.backend.daemon.driver.ScalaDriverLocal.$anonfun$repl$1(ScalaDriverLocal.scala:236)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:1393)
	at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:1346)
	at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:236)
	at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$33(DriverLocal.scala:997)
	at com.databricks.unity.EmptyHandle$.runWith(UCSHandle.scala:124)
	at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$22(DriverLocal.scala:980)
	at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:426)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:196)
	at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:424)
	at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:418)
	at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:69)
	at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:470)
	at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:455)
	at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:69)
	at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:935)
	at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:798)
	at scala.util.Try$.apply(Try.scala:213)
	at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:790)
	at com.databricks.backend.daemon.driver.DriverWrapper.executeCommandAndGetError(DriverWrapper.scala:643)
	at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:744)
	at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:520)
	at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:436)
	at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:279)
	at java.lang.Thread.run(Thread.java:750)
Command skipped
Command skipped
Command skipped
Command skipped
Command skipped
Command skipped
```

frankl1 commented 9 months ago

Same here

alexeherron commented 7 months ago

Same

Cam-B04 commented 5 months ago

Same

aadard commented 5 months ago

I had the same problem. It is caused by this line in Includes/Dataset-Mounts (the underscores and backslash get mangled by markdown when pasted unquoted): `val Array(dbrMajorVersion, dbrMinorVersion, _*) = dbrVersion.split("""\.""")`

It tries to get the major and minor version but fails in some cases. I kept the code as is, created a new cluster with a different runtime (9.1 LTS, Scala 2.12, Spark 3.1.2), and attached that cluster to the notebook. It now works. Not the best solution, but it at least lets you proceed. Hope that helps.
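For anyone curious why the destructuring throws: a `val Array(a, b, _*) = ...` binding is a pattern match, and it raises `scala.MatchError` whenever the split produces fewer than two elements. Below is a minimal sketch, assuming the setup code binds the runtime version string roughly as in the line from Includes/Dataset-Mounts quoted above; `DbrVersionParse` and the `lenient` fallback are my own illustrative names, not code from this repo.

```scala
object DbrVersionParse {
  // Mirrors the failing pattern: requires at least two dot-separated
  // components, otherwise throws scala.MatchError on the raw Array.
  def strict(dbrVersion: String): (String, String) = {
    val Array(dbrMajorVersion, dbrMinorVersion, _*) = dbrVersion.split("""\.""")
    (dbrMajorVersion, dbrMinorVersion)
  }

  // Hypothetical defensive variant: fall back to "0" for a missing
  // component instead of throwing, so setup can continue.
  def lenient(dbrVersion: String): (String, String) = {
    val parts = dbrVersion.split("""\.""")
    (parts.lift(0).getOrElse("0"), parts.lift(1).getOrElse("0"))
  }
}
```

`strict("9.1.x-scala2.12")` yields `("9", "1")`, but any version string without a dot splits into a single-element array and the pattern match fails, which matches the `MatchError: [Ljava.lang.String;...` message in this issue (the unmatched value is the array itself).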

bodasai1787 commented 3 months ago

Same problem here

Stivan93 commented 3 months ago

> ... I kept the code as is and create a new cluster with a different runtime (9.1 LTS, Scala 2.12, Spark 3.1.2) and attached this cluster to the notebook. It now works.

Thank you for this. I was also having difficulty creating the cluster resource because every SKU I tried was either unavailable in my region (East US) or not permitted for my subscription; I eventually queried Azure via the CLI and found that Standard_DC4as_v5 works.

The query: `az vm list-skus --location eastus --size Standard_D --all --output table`