googleapis / google-cloud-java

Google Cloud Client Library for Java
https://cloud.google.com/java/docs/reference
Apache License 2.0

java.lang.NoClassDefFoundError: org/spark-project/jetty/alpn/ALPN$Provider when instantiating MetricServiceClient #2414

Closed: Berg3 closed this issue 7 years ago

Berg3 commented 7 years ago

When trying to instantiate a MetricServiceClient in order to perform Google Stackdriver Monitoring on a Dataproc cluster (image version 1.0) using the following code:

MetricServiceSettings settings = MetricServiceSettings.defaultBuilder()
                .setCredentialsProvider(credentialsProvider)
                .build();
MetricServiceClient metricServiceClient =  MetricServiceClient.create(settings);

I get the following stack trace:

Exception in thread "main" java.lang.NoClassDefFoundError: org/spark-project/jetty/alpn/ALPN$Provider
    at io.netty.handler.ssl.JdkAlpnApplicationProtocolNegotiator$1.<init>(JdkAlpnApplicationProtocolNegotiator.java:26)
    at io.netty.handler.ssl.JdkAlpnApplicationProtocolNegotiator.<clinit>(JdkAlpnApplicationProtocolNegotiator.java:24)
    at io.netty.handler.ssl.JdkSslContext.toNegotiator(JdkSslContext.java:237)
    at io.netty.handler.ssl.JdkSslClientContext.<init>(JdkSslClientContext.java:189)
    at io.netty.handler.ssl.SslContext.newClientContextInternal(SslContext.java:729)
    at io.netty.handler.ssl.SslContextBuilder.build(SslContextBuilder.java:223)
    at io.grpc.netty.NettyChannelBuilder$NettyTransportFactory.<init>(NettyChannelBuilder.java:470)
    at io.grpc.netty.NettyChannelBuilder.buildTransportFactory(NettyChannelBuilder.java:338)
    at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:305)
    at com.google.api.gax.grpc.InstantiatingChannelProvider.createChannel(InstantiatingChannelProvider.java:125)
    at com.google.api.gax.grpc.InstantiatingChannelProvider.getChannel(InstantiatingChannelProvider.java:110)
    at com.google.api.gax.grpc.GrpcTransportProvider.getTransport(GrpcTransportProvider.java:98)
    at com.google.api.gax.grpc.GrpcTransportProvider.getTransport(GrpcTransportProvider.java:59)
    at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:97)
    at com.google.cloud.monitoring.v3.stub.GrpcMetricServiceStub.create(GrpcMetricServiceStub.java:161)
    at com.google.cloud.monitoring.v3.MetricServiceSettings.createStub(MetricServiceSettings.java:195)
    at com.google.cloud.monitoring.v3.MetricServiceClient.<init>(MetricServiceClient.java:140)
    at com.google.cloud.monitoring.v3.MetricServiceClient.create(MetricServiceClient.java:122)

These are the dependencies in the pom.xml file (Spark version 1.6.3):

  <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>google-cloud-monitoring</artifactId>
      <version>0.22.0-alpha</version>
      <exclusions>
        <exclusion>
          <groupId>com.google.guava</groupId>
          <artifactId>guava</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
  <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.10</artifactId>
      <version>${spark.version}</version>
      <exclusions>
        <exclusion>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-client</artifactId>
        </exclusion>
        <exclusion>
          <groupId>org.apache.commons</groupId>
          <artifactId>commons-lang3</artifactId>
        </exclusion>
        <exclusion>
          <groupId>net.java.dev.jets3t</groupId>
          <artifactId>jets3t</artifactId>
        </exclusion>
        <exclusion>
          <groupId>com.google.guava</groupId>
          <artifactId>guava</artifactId>
        </exclusion>
      </exclusions>
      <!--<scope>provided</scope>-->
    </dependency>

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-catalyst_2.10</artifactId>
      <version>${spark.version}</version>
      <exclusions>
        <exclusion>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-client</artifactId>
        </exclusion>
        <exclusion>
          <groupId>org.apache.commons</groupId>
          <artifactId>commons-lang3</artifactId>
        </exclusion>
        <exclusion>
          <groupId>net.java.dev.jets3t</groupId>
          <artifactId>jets3t</artifactId>
        </exclusion>
        <exclusion>
          <groupId>com.google.guava</groupId>
          <artifactId>guava</artifactId>
        </exclusion>
      </exclusions>
      <!--<scope>provided</scope>-->
    </dependency>

    <dependency>
      <groupId>com.databricks</groupId>
      <artifactId>spark-csv_2.11</artifactId>
      <version>${spark.csv.version}</version>
      <exclusions>
        <exclusion>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-client</artifactId>
        </exclusion>
        <exclusion>
          <groupId>org.apache.commons</groupId>
          <artifactId>commons-lang3</artifactId>
        </exclusion>
        <exclusion>
          <groupId>net.java.dev.jets3t</groupId>
          <artifactId>jets3t</artifactId>
        </exclusion>
        <exclusion>
          <groupId>com.google.guava</groupId>
          <artifactId>guava</artifactId>
        </exclusion>
      </exclusions>
    </dependency>

Any advice as to how to resolve this issue would be appreciated. Thanks.

Edit: I have no issue when running my program locally; the issue only appears when running on the Dataproc cluster.

michaelbausor commented 7 years ago

Hi @Berg3, a couple of things you could try: this looks similar to https://github.com/GoogleCloudPlatform/google-cloud-java/issues/2266, which reports the same kind of ALPN failure; the discussion there may apply to your setup as well.

michaelbausor commented 7 years ago

Something else you could try is to check and compare which version of netty is installed locally and on the Dataproc cluster. You can use the command mvn dependency:tree to list your project's dependencies, and mvn dependency:tree | grep io.netty to see only the netty ones.

If you find a difference in the netty versions, you may want to try explicitly specifying the relevant io.netty artifacts in your pom to make sure the correct version is being used.
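For example, pinning netty could look like the following dependencyManagement sketch (the artifacts and the 4.1.14.Final version are illustrative; match them to whatever your own dependency tree shows grpc-netty pulling in):

```xml
<!-- Sketch: force a single netty version across the build.
     4.1.14.Final is only an example; use the version your grpc/gax
     dependencies actually require. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>io.netty</groupId>
      <artifactId>netty-handler</artifactId>
      <version>4.1.14.Final</version>
    </dependency>
    <dependency>
      <groupId>io.netty</groupId>
      <artifactId>netty-codec-http2</artifactId>
      <version>4.1.14.Final</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```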

Berg3 commented 7 years ago

As I did some more digging, I simplified what I was trying to do in order to isolate the issue, since my program was running inside a much larger established codebase. It turns out that I am hitting what I think is essentially the same issue as the post @michaelbausor mentioned: https://github.com/GoogleCloudPlatform/google-cloud-java/issues/2266. Following that post, I ran the steps from https://github.com/GoogleCloudPlatform/google-cloud-java/tree/master/google-cloud-util/google-cloud-compat-checker on the Dataproc cluster and got the following output:

OS details:
  os.detected.name: linux
  os.detected.arch: x86_64
  os.detected.classifier: linux-x86_64
  os.detected.release: debian
  os.detected.release.version: 8
JVM details:
  Java version: 1.8.0_131
  Java specification version: 1.8
  JVM bit mode: 64
OpenSSL details:
  open ssl is available: true
  ALPN is supported: true
Checking compatibility...
  [PASS] This OS + architecture is supported.
  [PASS] 64-bit JVM is supported.
  [PASS] Open SSL is available
  [PASS] Open SSL ALPN is supported
Result: UNKNOWN (checker implementation not complete)
  Based on what was checked, nothing was identified that would
  prevent you from using grpc-based APIs.

but when running the simplified program I still get the error:

Exception in thread "main" java.lang.IllegalArgumentException: Jetty ALPN/NPN has not been properly configured.
    at io.grpc.netty.GrpcSslContexts.selectApplicationProtocolConfig(GrpcSslContexts.java:159)
    at io.grpc.netty.GrpcSslContexts.configure(GrpcSslContexts.java:136)
    at io.grpc.netty.GrpcSslContexts.configure(GrpcSslContexts.java:124)
    at io.grpc.netty.GrpcSslContexts.forClient(GrpcSslContexts.java:94)
    at io.grpc.netty.NettyChannelBuilder$NettyTransportFactory$DefaultNettyTransportCreationParamsFilterFactory.<init>(NettyChannelBuilder.java:525)
    at io.grpc.netty.NettyChannelBuilder$NettyTransportFactory$DefaultNettyTransportCreationParamsFilterFactory.<init>(NettyChannelBuilder.java:518)
    at io.grpc.netty.NettyChannelBuilder$NettyTransportFactory.<init>(NettyChannelBuilder.java:457)
    at io.grpc.netty.NettyChannelBuilder.buildTransportFactory(NettyChannelBuilder.java:326)
    at io.grpc.internal.AbstractManagedChannelImplBuilder.build(AbstractManagedChannelImplBuilder.java:315)
    at com.google.api.gax.grpc.InstantiatingChannelProvider.createChannel(InstantiatingChannelProvider.java:131)
    at com.google.api.gax.grpc.InstantiatingChannelProvider.getChannel(InstantiatingChannelProvider.java:116)
    at com.google.api.gax.grpc.GrpcTransportProvider.getTransport(GrpcTransportProvider.java:98)
    at com.google.api.gax.grpc.GrpcTransportProvider.getTransport(GrpcTransportProvider.java:59)
    at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:97)
    at com.google.cloud.monitoring.v3.stub.GrpcMetricServiceStub.create(GrpcMetricServiceStub.java:161)
    at com.google.cloud.monitoring.v3.MetricServiceSettings.createStub(MetricServiceSettings.java:195)
    at com.google.cloud.monitoring.v3.MetricServiceClient.<init>(MetricServiceClient.java:159)
    at com.google.cloud.monitoring.v3.MetricServiceClient.create(MetricServiceClient.java:141)

from running this code:

MetricServiceSettings settings = MetricServiceSettings.newBuilder()
                .setCredentialsProvider(credentialsProvider)
                .build();

MetricServiceClient metricServiceClient =  MetricServiceClient.create(settings);

The call to MetricServiceClient.create(settings) is what throws the exception.

This is the output I get when running the command:

mvn dependency:tree | grep io.netty
[INFO] |  +- io.netty:netty-tcnative-boringssl-static:jar:2.0.3.Final:compile
[INFO] |  |  +- io.netty:netty-codec-http2:jar:4.1.14.Final:compile (version selected from constraint [4.1.14.Final,4.1.14.Final])
[INFO] |  |  |  +- io.netty:netty-codec-http:jar:4.1.14.Final:compile
[INFO] |  |  |  |  \- io.netty:netty-codec:jar:4.1.14.Final:compile
[INFO] |  |  |  \- io.netty:netty-handler:jar:4.1.14.Final:compile
[INFO] |  |  |     \- io.netty:netty-buffer:jar:4.1.14.Final:compile
[INFO] |  |  |        \- io.netty:netty-common:jar:4.1.14.Final:compile
[INFO] |  |  \- io.netty:netty-handler-proxy:jar:4.1.14.Final:compile
[INFO] |  |     +- io.netty:netty-transport:jar:4.1.14.Final:compile
[INFO] |  |     |  \- io.netty:netty-resolver:jar:4.1.14.Final:compile
[INFO] |  |     \- io.netty:netty-codec-socks:jar:4.1.14.Final:compile
[INFO] |  +- io.netty:netty-all:jar:4.0.43.Final:compile
[INFO] |  +- io.netty:netty:jar:3.9.9.Final:compile
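The tree above shows three different netty lines on the classpath at once (4.1.14.Final, netty-all 4.0.43.Final, and netty 3.9.9.Final). One way to see which jar actually wins at runtime on the cluster is a small diagnostic like the following (a sketch I put together for this thread, not something from the library; the class names passed to it are the netty/tcnative classes from the stack traces above):

```java
import java.security.CodeSource;

public class JarLocator {
    /**
     * Returns the jar URL a class would be loaded from, or null for
     * bootstrap classes. Uses Class.forName with initialize=false so
     * static initializers (like OpenSsl's) are not triggered.
     */
    public static String locate(String className) {
        try {
            CodeSource src = Class
                    .forName(className, false, JarLocator.class.getClassLoader())
                    .getProtectionDomain().getCodeSource();
            return (src == null || src.getLocation() == null)
                    ? null : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "NOT FOUND: " + className;
        }
    }

    public static void main(String[] args) {
        // On the cluster, check which jars supply the SSL-related classes.
        System.out.println(locate("io.netty.handler.ssl.OpenSsl"));
        System.out.println(locate("org.apache.tomcat.jni.SSL"));
    }
}
```

Running this inside the Spark job (rather than locally) would show whether the cluster's own netty jars are shadowing the ones bundled with the application.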
Berg3 commented 7 years ago

After doing a little more digging, I decided to make sure that OpenSSL is available directly in my project, so I added the following check:

import io.netty.handler.ssl.OpenSsl;

System.out.println("OpenSsl.isAvailable: " + OpenSsl.isAvailable());
if (!OpenSsl.isAvailable()) {
    OpenSsl.unavailabilityCause().printStackTrace();
}
System.out.println("OpenSsl.isAlpnSupported: " + OpenSsl.isAlpnSupported());

directly before I instantiate the metric service client. Locally, I get the following print statements printed to console:

OpenSsl.isAvailable: true
OpenSsl.isAlpnSupported: true

However, despite passing the google-cloud-compat-checker as I mentioned in the previous comment, when I run my program on the Dataproc cluster I get the following output:

OpenSsl.isAvailable: false
java.lang.ClassNotFoundException: org.apache.tomcat.jni.SSL
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at io.netty.handler.ssl.OpenSsl.<clinit>(OpenSsl.java:87)
    at com.apigee.analytics.platform.StackdriverController.sendUapStatsToStackdriver(StackdriverController.java:121)
    at com.apigee.analytics.platform.StackdriverController.main(StackdriverController.java:90)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
OpenSsl.isAlpnSupported: false

I should probably mention that locally I am running via IntelliJ, whereas on the Dataproc cluster I run the class through a spark-submit job.

Berg3 commented 7 years ago

The issue was that I was running the job via spark-submit without specifying:

--conf spark.driver.userClassPathFirst=true \
--conf spark.executor.userClassPathFirst=true \

so the wrong io.netty jar files (the ones on the cluster's classpath) were being used instead of those bundled with my application.
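For reference, a full invocation with those flags might look like this (the class name and jar name are placeholders, not from the original job):

```sh
spark-submit \
  --class com.example.MyStackdriverJob \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  my-job-jar-with-dependencies.jar
```

With userClassPathFirst=true, Spark resolves classes from the application jar before the cluster's own classpath, so the netty version the application was built against takes precedence.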