awslabs / aws-glue-libs

AWS Glue Libraries are additions and enhancements to Spark for ETL operations.

Error when running container amazon/aws-glue-libs:glue_libs_1.0.0_image_01 #126

Closed · siqueirarenan closed this issue 2 years ago

siqueirarenan commented 2 years ago

Dear all,

We are facing problems when trying to run any Spark command using the official AWS container for Glue v1. We are following this documentation: https://aws.amazon.com/blogs/big-data/developing-aws-glue-etl-jobs-locally-using-a-container/.
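
For context, we start the container roughly as described in that post. A minimal sketch (the image tag is from the post; the container name and exact flags here are illustrative and may differ from our setup):

 # Pull the Glue 1.0 image and open an interactive shell inside it
 docker pull amazon/aws-glue-libs:glue_libs_1.0.0_image_01
 docker run -it --name glue_local amazon/aws-glue-libs:glue_libs_1.0.0_image_01 /bin/bash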

When we simply run spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/bin/pyspark to start the PySpark shell, we receive the following error.

 Python 3.6.15 (default, Dec  3 2021, 03:07:35)
 [GCC 10.2.1 20210110] on linux
 Type "help", "copyright", "credits" or "license" for more information.
 22/01/25 21:03:02 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
 Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
 Setting default log level to "WARN".
 To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
 /home/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/python/pyspark/sql/session.py:580: UserWarning: Fall back to non-hive support because failing to access HiveConf, please make sure you build spark with hive
   warnings.warn("Fall back to non-hive support because failing to access HiveConf, "
 22/01/25 21:03:03 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
 org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
 sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
 sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
 sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
 java.lang.reflect.Constructor.newInstance(Constructor.java:423)
 py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
 py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
 py4j.Gateway.invoke(Gateway.java:238)
 py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
 py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
 py4j.GatewayConnection.run(GatewayConnection.java:238)
 java.lang.Thread.run(Thread.java:748)
 /home/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/python/pyspark/shell.py:45: UserWarning: Failed to initialize Spark session.
   warnings.warn("Failed to initialize Spark session.")
 Traceback (most recent call last):
   File "/home/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/python/pyspark/shell.py", line 41, in <module>
     spark = SparkSession._create_shell_session()
   File "/home/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/python/pyspark/sql/session.py", line 583, in _create_shell_session
     return SparkSession.builder.getOrCreate()
   File "/home/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/python/pyspark/sql/session.py", line 173, in getOrCreate
     sc = SparkContext.getOrCreate(sparkConf)
   File "/home/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/python/pyspark/context.py", line 367, in getOrCreate
     SparkContext(conf=conf or SparkConf())
   File "/home/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/python/pyspark/context.py", line 136, in __init__
     conf, jsc, profiler_cls)
   File "/home/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/python/pyspark/context.py", line 198, in _do_init
     self._jsc = jsc or self._initialize_context(self._conf._jconf)
   File "/home/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/python/pyspark/context.py", line 306, in _initialize_context
     return self._jvm.JavaSparkContext(jconf)
   File "/home/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1525, in __call__
     answer, self._gateway_client, None, self._fqn)
   File "/home/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
     format(target_id, ".", name), value)
 py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
 : java.lang.NoSuchMethodError: io.netty.util.concurrent.SingleThreadEventExecutor.<init>(Lio/netty/util/concurrent/EventExecutorGroup;Ljava/util/concurrent/Executor;ZLjava/util/Queue;Lio/netty/util/concurrent/RejectedExecutionHandler;)V
    at io.netty.channel.SingleThreadEventLoop.<init>(SingleThreadEventLoop.java:65)
    at io.netty.channel.nio.NioEventLoop.<init>(NioEventLoop.java:138)
    at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:146)
    at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:37)
    at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:84)
    at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:58)
    at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:47)
    at io.netty.channel.MultithreadEventLoopGroup.<init>(MultithreadEventLoopGroup.java:59)
    at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:86)
    at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:81)
    at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:68)
    at org.apache.spark.network.util.NettyUtils.createEventLoop(NettyUtils.java:50)
    at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:102)
    at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
    at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:238)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:238)
    at java.lang.Thread.run(Thread.java:748)
semihselcuk commented 2 years ago

I am having exactly the same issue. Is there any workaround for this problem as a temporary solution? I've tried deleting some of the netty-* jars, but nothing has worked so far.

vstoyanoff commented 2 years ago

Same here. I wonder whether the image got updated or something else changed (like the Docker version), because it was working a month ago and now it throws an error:

py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NoSuchMethodError: io.netty.util.internal.ReflectionUtil.trySetAccessible(Ljava/lang/reflect/AccessibleObject;)Ljava/lang/Throwable;

Any help would be greatly appreciated.

vstoyanoff commented 2 years ago

Update: I got it working by removing all netty jars, except the netty-all-<> jar, from both the aws-glue-libs and Spark jars folders.

semihselcuk commented 2 years ago

@vstoyanoff Thanks for the comment. I can confirm that it works once you run the script below in your notebook.

 %%bash
 find /home/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/jars/ -name "netty-*" ! -name 'netty-all*' -delete
 find /home/aws-glue-libs/jarsv1/ -name "netty-*" ! -name 'netty-all*' -delete
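
With the duplicate netty jars removed, re-running the PySpark shell from the original report (same path as in the traceback above) starts a SparkContext again. A quick smoke test, for example:

 /home/spark-2.4.3-bin-spark-2.4.3-bin-hadoop2.8/bin/pyspark
 >>> spark.range(10).count()   # expect 10 instead of a Py4JJavaError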

moomindani commented 2 years ago

Thank you for reporting this issue, and apologies for the delay in responding. Yes, it seems that the root cause is a conflict between netty jar files.

We resolved this issue in the Glue v2/v3 Docker images. Here's the blog post covering them: https://aws.amazon.com/blogs/big-data/develop-and-test-aws-glue-version-3-0-jobs-locally-using-a-docker-container/
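
For reference, the Glue 3.0 image described in that post can be pulled like this (see the post for the full run commands):

 docker pull amazon/aws-glue-libs:glue_libs_3.0.0_image_01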

If you still see the same issue with those newer images, please let us know. Thank you.