Closed: cpdatabricks closed this issue 2 years ago
I had the latest version of Java installed instead of Java 8.
Happens to the best of us! Thanks for updating.
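For anyone who lands here with the same trace: Spark releases before 3.3 only support Java 8 and 11, so a quick sanity check is to parse the major version out of the `java -version` banner before digging further. A minimal sketch, with the helper name and example banners being illustrative rather than anything from this thread:

```python
import re

def java_major_version(banner: str) -> int:
    """Extract the major Java version from a `java -version` banner line.

    Pre-Java-9 releases report versions like "1.8.0_312" (major = 8);
    Java 9 and later report "17.0.2"-style versions (major = 17).
    """
    match = re.search(r'version "(\d+)\.(\d+)', banner)
    if match is None:
        raise ValueError(f"unrecognised version banner: {banner!r}")
    first, second = int(match.group(1)), int(match.group(2))
    # "1.x" means the legacy scheme, where the second number is the major.
    return second if first == 1 else first

# Example banners as printed by `java -version`:
#   java_major_version('openjdk version "1.8.0_312"')            -> 8
#   java_major_version('openjdk version "17.0.2" 2022-01-18')    -> 17
```

If the result is above 11 and you are on a pre-3.3 Spark, that mismatch is the first thing to fix.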
How was this issue resolved?
The problem was in user code; if you hit similar trouble, please open a new issue. @android2600
I'm getting this error when using the kedro run command for the pyspark-iris starter. I'm using the latest MacBook Pro with the M1 chip.
(base) user.name@HY923X2J6Y iris-databricks % kedro run
2022-04-27 13:53:18,734 - kedro.framework.cli.hooks.manager - INFO - Registered CLI hooks from 1 installed plugin(s): kedro-telemetry-0.2.0
Kedro-Telemetry is installed, but you have opted out of sharing usage analytics so none will be collected.
2022-04-27 13:53:18,767 - kedro.framework.session.store - INFO - read() not implemented for BaseSessionStore. Assuming empty store.
2022-04-27 13:53:18,863 - kedro.framework.session.session - INFO - Kedro project iris-databricks
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/04/27 13:53:20 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2022-04-27 13:53:21,136 - kedro.framework.session.store - INFO - save() not implemented for BaseSessionStore. Skipping the step.
Traceback (most recent call last):
  File "/opt/anaconda3/bin/kedro", line 8, in <module>
    sys.exit(main())
  File "/opt/anaconda3/lib/python3.9/site-packages/kedro/framework/cli/cli.py", line 206, in main
    cli_collection()
  File "/opt/anaconda3/lib/python3.9/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/opt/anaconda3/lib/python3.9/site-packages/kedro/framework/cli/cli.py", line 141, in main
    super().main(
  File "/opt/anaconda3/lib/python3.9/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/opt/anaconda3/lib/python3.9/site-packages/click/core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/opt/anaconda3/lib/python3.9/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/opt/anaconda3/lib/python3.9/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/opt/anaconda3/lib/python3.9/site-packages/kedro/framework/cli/project.py", line 352, in run
    session.run(
  File "/opt/anaconda3/lib/python3.9/site-packages/kedro/framework/session/session.py", line 354, in run
    context = self.load_context()
  File "/opt/anaconda3/lib/python3.9/site-packages/kedro/framework/session/session.py", line 251, in load_context
    context = context_class(
  File "/Users/cara.phillips/iris-databricks/src/iris_databricks/context.py", line 27, in __init__
    self.init_spark_session()
  File "/Users/cara.phillips/iris-databricks/src/iris_databricks/context.py", line 44, in init_spark_session
    _spark_session = spark_session_conf.getOrCreate()
  File "/opt/anaconda3/lib/python3.9/site-packages/pyspark/sql/session.py", line 228, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/opt/anaconda3/lib/python3.9/site-packages/pyspark/context.py", line 392, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/opt/anaconda3/lib/python3.9/site-packages/pyspark/context.py", line 146, in __init__
    self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
  File "/opt/anaconda3/lib/python3.9/site-packages/pyspark/context.py", line 209, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/opt/anaconda3/lib/python3.9/site-packages/pyspark/context.py", line 329, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/opt/anaconda3/lib/python3.9/site-packages/py4j/java_gateway.py", line 1585, in __call__
    return_value = get_return_value(
  File "/opt/anaconda3/lib/python3.9/site-packages/py4j/protocol.py", line 326, in get_return_value
    raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x15fef0a0) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x15fef0a0
    at org.apache.spark.storage.StorageUtils$.<init>(StorageUtils.scala:213)
    at org.apache.spark.storage.StorageUtils$.<clinit>(StorageUtils.scala)
    at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:110)
    at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:348)
    at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:287)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:336)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:191)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at java.base/jdk.internal.reflect.DirectConstructorHandleAccessor.newInstance(DirectConstructorHandleAccessor.java:67)
    at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:483)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:238)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
    at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
    at java.base/java.lang.Thread.run(Thread.java:833)
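The IllegalAccessError above is the classic symptom of running Spark on Java 17: since Java 9, the module system no longer exports sun.nio.ch to unnamed modules. Downgrading to Java 8 (as the reporter did) resolves it. If downgrading is not an option, Spark builds that support Java 17 pass --add-opens flags for these internal packages themselves, and the same flag can be supplied through Spark's extra Java options. A sketch of the relevant settings, assuming the standard Spark configuration keys (whether your particular Spark version then runs cleanly on Java 17 is not guaranteed):

```python
# JVM flag that re-opens the package named in the error message
# ("module java.base does not export sun.nio.ch to unnamed module ...").
ADD_OPENS = "--add-opens=java.base/sun.nio.ch=ALL-UNNAMED"

# Spark reads these properties as extra JVM options for the driver and
# executors; they can be passed via SparkConf().set(...), spark-submit
# --conf, or spark-defaults.conf.
spark_java_options = {
    "spark.driver.extraJavaOptions": ADD_OPENS,
    "spark.executor.extraJavaOptions": ADD_OPENS,
}
```

For this particular starter, though, matching the Java version Spark expects (Java 8) is the simpler and supported fix.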