googleapis / google-cloud-java

Google Cloud Client Library for Java
https://cloud.google.com/java/docs/reference
Apache License 2.0

Authentication error after upgrading to 0.23.1 #2453

Closed lbergelson closed 5 years ago

lbergelson commented 6 years ago

We've started seeing an authentication error in our project after upgrading to 0.23.1; the issue also seems to be present in 0.24.0. Reverting to 0.22.0 resolves it.

We start seeing the following 404 error when running a Spark application that uses NIO to access GCS files:

code:      0
message:   Error code 404 trying to get security access token from Compute Engine metadata for the default service account. This may be because the virtual machine instance does not have permission scopes specified.
reason:    null
location:  null
retryable: false
com.google.cloud.storage.StorageException: Error code 404 trying to get security access token from Compute Engine metadata for the default service account. This may be because the virtual machine instance does not have permission scopes specified.
    at com.google.cloud.storage.spi.v1.HttpStorageRpc.translate(HttpStorageRpc.java:189)
    at com.google.cloud.storage.spi.v1.HttpStorageRpc.get(HttpStorageRpc.java:339)
    at com.google.cloud.storage.StorageImpl$5.call(StorageImpl.java:197)
    at com.google.cloud.storage.StorageImpl$5.call(StorageImpl.java:194)
    at shaded.cloud_nio.com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:91)
    at com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:54)
    at com.google.cloud.storage.StorageImpl.get(StorageImpl.java:194)
    at com.google.cloud.storage.contrib.nio.CloudStorageFileSystemProvider.checkAccess(CloudStorageFileSystemProvider.java:614)
    at java.nio.file.Files.exists(Files.java:2385)
    at htsjdk.samtools.util.IOUtil.assertFileIsReadable(IOUtil.java:346)
    at org.broadinstitute.hellbender.engine.ReadsDataSource.<init>(ReadsDataSource.java:206)
    at org.broadinstitute.hellbender.engine.ReadsDataSource.<init>(ReadsDataSource.java:162)
    at org.broadinstitute.hellbender.engine.ReadsDataSource.<init>(ReadsDataSource.java:118)
    at org.broadinstitute.hellbender.engine.ReadsDataSource.<init>(ReadsDataSource.java:87)
    at org.broadinstitute.hellbender.engine.spark.datasources.ReadsSparkSource.getHeader(ReadsSparkSource.java:182)
    at org.broadinstitute.hellbender.engine.spark.GATKSparkTool.initializeReads(GATKSparkTool.java:390)
    at org.broadinstitute.hellbender.engine.spark.GATKSparkTool.initializeToolInputs(GATKSparkTool.java:370)
    at org.broadinstitute.hellbender.engine.spark.GATKSparkTool.runPipeline(GATKSparkTool.java:360)
    at org.broadinstitute.hellbender.engine.spark.SparkCommandLineProgram.doWork(SparkCommandLineProgram.java:38)
    at org.broadinstitute.hellbender.cmdline.CommandLineProgram.runTool(CommandLineProgram.java:119)
    at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMainPostParseArgs(CommandLineProgram.java:176)
    at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:195)
    at org.broadinstitute.hellbender.Main.runCommandLineProgram(Main.java:131)
    at org.broadinstitute.hellbender.Main.mainEntry(Main.java:152)
    at org.broadinstitute.hellbender.Main.main(Main.java:233)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.IOException: Error code 404 trying to get security access token from Compute Engine metadata for the default service account. This may be because the virtual machine instance does not have permission scopes specified.
    at shaded.cloud_nio.com.google.auth.oauth2.ComputeEngineCredentials.refreshAccessToken(ComputeEngineCredentials.java:137)
    at shaded.cloud_nio.com.google.auth.oauth2.OAuth2Credentials.refresh(OAuth2Credentials.java:160)
    at shaded.cloud_nio.com.google.auth.oauth2.OAuth2Credentials.getRequestMetadata(OAuth2Credentials.java:146)
    at shaded.cloud_nio.com.google.auth.http.HttpCredentialsAdapter.initialize(HttpCredentialsAdapter.java:96)
    at com.google.cloud.http.HttpTransportOptions$1.initialize(HttpTransportOptions.java:157)
    at shaded.cloud_nio.com.google.api.client.http.HttpRequestFactory.buildRequest(HttpRequestFactory.java:93)
    at shaded.cloud_nio.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.buildHttpRequest(AbstractGoogleClientRequest.java:300)
    at shaded.cloud_nio.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:419)
    at shaded.cloud_nio.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
    at shaded.cloud_nio.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
    at com.google.cloud.storage.spi.v1.HttpStorageRpc.get(HttpStorageRpc.java:337)
    ... 32 more
ERROR: (gcloud.dataproc.jobs.submit.spark) Job [cb87810a-0133-42b3-a954-363b62adce39] entered state [ERROR] while waiting for [DONE].

Looking at the dependency updates in this project, it seems one of the auth libraries was updated to version 0.8.0. Could that be causing the issue?

Is there some new configuration setting we should be using in our gcloud project? Any help would be appreciated.

neozwu commented 6 years ago

@lbergelson From a quick scan of google-auth-library-java, it seems there was some code change regarding the Compute Engine credential in version 0.7.1. Can you check whether you can repro this issue with auth 0.7.1? (I saw you use Gradle, so I assume simply declaring auth 0.7.1 as an explicit compile dependency should override auth 0.7.0.)

lbergelson commented 6 years ago

@neozwu I tried running with 0.22.0 but forcing the version of auth to 0.7.1, and was unable to reproduce the error that way. I also tried forcing 0.8.0 and wasn't able to reproduce the problem that way either. So either I'm building it wrong or it's not the auth library. From running gradle dependencies it looks like the force is working though.

force 'com.google.auth:google-auth-library-oauth2-http:0.7.1'
force 'com.google.auth:google-auth-library-credentials:0.7.1'
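For context, in a Gradle build those force declarations sit inside a resolutionStrategy block; a minimal sketch of the surrounding build.gradle fragment (standard Gradle conventions, not taken from the project's actual build file):

```groovy
// Hypothetical build.gradle fragment: pin both auth artifacts so every
// configuration resolves them to 0.7.1 regardless of transitive versions.
configurations.all {
  resolutionStrategy {
    force 'com.google.auth:google-auth-library-oauth2-http:0.7.1'
    force 'com.google.auth:google-auth-library-credentials:0.7.1'
  }
}
```

Running `gradle dependencies` afterwards should show the forced versions in the resolved tree.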

droazen commented 6 years ago

@neozwu I was able to resolve the 404 error that @lbergelson reported by building a custom version of the latest master of google-cloud-java with the following patch applied:

diff --git a/pom.xml b/pom.xml
index 0a77a625b0..e0884bbf2d 100644
--- a/pom.xml
+++ b/pom.xml
@@ -131,10 +131,10 @@
     <api-client.version>1.22.0</api-client.version>

     <api-common.version>1.1.0</api-common.version>
-    <gax.version>1.8.1</gax.version>
-    <gax-grpc.version>0.25.1</gax-grpc.version>
+    <gax.version>1.8.0</gax.version>
+    <gax-grpc.version>0.25.0</gax-grpc.version>
     <generatedProto.version>0.1.19</generatedProto.version>
-    <google.auth.version>0.8.0</google.auth.version>
+    <google.auth.version>0.7.0</google.auth.version>
     <grpc.version>1.6.1</grpc.version>
     <guava.version>20.0</guava.version>
     <http-client.version>1.22.0</http-client.version>

So, it's very likely that the culprit is the google-auth-library-java change in 0.7.1 that @neozwu pointed out (https://github.com/google/google-auth-library-java/releases/tag/v0.7.1)

It looks like they moved from using a domain name for the METADATA_SERVER_URL to a hardcoded IP address (see https://github.com/google/google-auth-library-java/pull/110/files) -- I wonder if that is causing this somehow.

lbergelson commented 6 years ago

It seems the reason I wasn't able to reproduce the issue by forcing 0.7.1 is that we're using the shaded jar, so the force statement wasn't actually replacing the dependencies.
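One way to confirm this is to list the shaded jar's entries: the relocated `shaded.cloud_nio.*` prefixes in the stack trace are the telltale sign that classes were rewritten into the jar, where a plain Gradle force can't reach them. A minimal sketch (the jar path below is hypothetical; substitute your shaded artifact):

```shell
# Hypothetical path to the shaded artifact.
JAR="build/libs/myapp-shaded.jar"
if [ -f "$JAR" ]; then
  # Relocated classes appear under shaded/cloud_nio/...;
  # dependency forcing cannot replace these bundled copies.
  unzip -l "$JAR" | grep -i ComputeEngineCredentials
else
  echo "jar not found: $JAR"
fi
```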

droazen commented 6 years ago

Any suggestions on how we could go about dealing with this issue? Does anyone think that the hardcoded IP address added in https://github.com/google/google-auth-library-java/pull/110/files could be the cause?

lbergelson commented 6 years ago

Does anyone have any input on this one? It's important to us that we be able to upgrade.

neozwu commented 6 years ago

@lukecwik Could you comment on whether there could be any potential issues with a hardcoded IP address for the metadata server? I saw you suggested an explicit IP vs. a domain name here.

lukecwik commented 6 years ago

The GCE metadata server team specifically recommends using the IP address as a best practice to avoid DNS issues when contacting the metadata server.

Note that all of the Apiary client libraries have been using the fixed IP address for quite some time and to my knowledge have not experienced this issue.

The downside is that if the GCE metadata server team were to ever change the IP address of the metadata server, it would break everyone who hardcoded the address.
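The DNS-vs-literal-IP distinction can be seen directly in the JDK: per the java.net.InetAddress documentation, a literal IP address is only format-checked and no resolver is consulted, while a hostname requires a working DNS lookup. A self-contained sketch (169.254.169.254 and metadata.google.internal are the well-known GCE metadata endpoints):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class MetadataAddress {
    // Well-known GCE metadata server endpoints.
    static final String METADATA_IP = "169.254.169.254";
    static final String METADATA_HOST = "metadata.google.internal";

    public static void main(String[] args) throws Exception {
        // A literal IP is only format-checked; no DNS lookup happens,
        // so this succeeds even where the resolver is broken or absent.
        InetAddress byIp = InetAddress.getByName(METADATA_IP);
        System.out.println("literal IP resolved to " + byIp.getHostAddress());

        // The hostname form consults the resolver; outside GCE (or with
        // misconfigured DNS) this lookup can fail.
        try {
            InetAddress byName = InetAddress.getByName(METADATA_HOST);
            System.out.println("hostname resolved to " + byName.getHostAddress());
        } catch (UnknownHostException e) {
            System.out.println("hostname lookup failed: " + e.getMessage());
        }
    }
}
```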

lbergelson commented 6 years ago

@neozwu @lukecwik Does anyone have any suggestions about how we can proceed? Is there some information we can provide that would be helpful for debugging this?

lukecwik commented 6 years ago

Since you have a complicated setup, it would be best to try executing a simple program that invokes only google-auth-library-java 0.7.1 from within the environment, without any additional dependencies; that would exclude your network/host setup from the failure scenario. Alternatively, shell out from your application and use curl to contact the metadata server directly by IP address, logging what you get back. If a much simpler application (or shelling out) works, that means you have a dependency or Java configuration issue.

If you can exclude the host/network setup from the failure condition, then comparing the detailed Maven dependency trees of the working and broken versions could help. For every library that changed, try a strict override on that one changed dependency (keeping all other transitive dependencies the same) to narrow down which one is causing the failure.

jean-philippe-martin commented 6 years ago

I was able to reproduce the exact same error running gatk PrintReadsSpark on a fresh Dataproc cluster with no special configuration applied. This suggests cluster/firewall misconfiguration may not be the problem.

My repro's very short:

package repro_package;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import com.google.cloud.storage.contrib.nio.CloudStorageFileSystem;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;

public class Main {

  public static void exists() {
    CloudStorageFileSystem fs = CloudStorageFileSystem.forBucket("jpmartin-testing-project");
    Path p = fs.getPath("/hellbender-test-inputs/CEUTrio.HiSeq.WGS.b37.ch20.1m-2m.NA12878.bam");
    boolean x = Files.exists(p);
    System.out.println("" + p + " exists: " + x);
  }

  public static Integer runRemotely() {
    exists();
    System.out.println("Hello.");
    return 1;
  }

  public static void main(final String[] args) {
    exists();
    ArrayList<Integer> l = new ArrayList<>();
    l.add(1);
    l.add(2);
    final SparkConf sparkConf = new SparkConf().setAppName("repro_package").setMaster("yarn-client");
    final JavaSparkContext ctx = new JavaSparkContext(sparkConf);
    // collect() forces evaluation; map() alone is lazy and would never run the remote task
    ctx.parallelize(l, l.size()).map(i -> runRemotely()).collect();
    System.out.println("Done.");
  }
}

droazen commented 6 years ago

That's great @jean-philippe-martin -- now we're getting somewhere! Can you test whether the error goes away if you build a custom google-cloud-java shaded jar with google.auth.version downgraded to 0.7.0, as described in https://github.com/GoogleCloudPlatform/google-cloud-java/issues/2453#issuecomment-331276331?

lukecwik commented 6 years ago

Your repro package still seems to bring in a bunch of dependencies via Spark/CloudStorageFileSystem.

From your simple application, try shelling out and sending a curl request using both the raw IP address and the hostname version:
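A sketch of what such probes might look like (hypothetical commands; the token path and the required Metadata-Flavor header follow the documented GCE metadata-server conventions):

```shell
# Probe the metadata server by hostname (exercises DNS) and by raw IP
# (bypasses DNS). On a GCE/Dataproc VM both should return a token; a
# difference between the two points at a DNS issue.
TOKEN_PATH="computeMetadata/v1/instance/service-accounts/default/token"
for HOST in metadata.google.internal 169.254.169.254; do
  echo "--- probing $HOST ---"
  curl -s --max-time 5 -H "Metadata-Flavor: Google" \
    "http://$HOST/$TOKEN_PATH" || echo "request to $HOST failed"
done
echo "metadata probes attempted"
```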

jean-philippe-martin commented 6 years ago

CloudStorageFileSystem is part of google-cloud-java, though, so it counts. I can try directly shelling out, but surely that path is already covered by your tests and is going to work.

jean-philippe-martin commented 6 years ago

I can confirm that shelling out from the app works as expected (and so does directly ssh'ing to the Dataproc cluster nodes).

jean-philippe-martin commented 6 years ago

One difference (not sure if relevant) is that Files.exists (via CloudStorageFileSystem) fails with the new library when running on the local machine, but passes with the older version.

lukecwik commented 6 years ago

Can you dump the verbose version of your dependency tree?

jean-philippe-martin commented 6 years ago

Sure, @lukecwik, here is what I have. Apologies, it's long.

$ gradle dependencies --configuration compile

:dependencies

------------------------------------------------------------
Root project
------------------------------------------------------------

compile - Dependencies for source set 'main'.
+--- com.google.cloud:google-cloud-nio:0.23.1-alpha
|    +--- com.google.cloud:google-cloud-storage:1.5.1
|    |    +--- com.google.cloud:google-cloud-core:1.5.1
|    |    |    +--- com.google.guava:guava:20.0
|    |    |    +--- joda-time:joda-time:2.9.2
|    |    |    +--- org.json:json:20160810
|    |    |    +--- com.google.http-client:google-http-client:1.22.0
|    |    |    |    +--- com.google.code.findbugs:jsr305:1.3.9 -> 3.0.0
|    |    |    |    \--- org.apache.httpcomponents:httpclient:4.0.1
|    |    |    |         +--- org.apache.httpcomponents:httpcore:4.0.1
|    |    |    |         +--- commons-logging:commons-logging:1.1.1
|    |    |    |         \--- commons-codec:commons-codec:1.3 -> 1.10
|    |    |    +--- com.google.code.findbugs:jsr305:3.0.0
|    |    |    +--- com.google.api:api-common:1.1.0
|    |    |    |    +--- com.google.code.findbugs:jsr305:3.0.0
|    |    |    |    \--- com.google.guava:guava:19.0 -> 20.0
|    |    |    +--- com.google.api:gax:1.8.1
|    |    |    |    +--- com.google.auto.value:auto-value:1.2
|    |    |    |    +--- com.google.code.findbugs:jsr305:3.0.0
|    |    |    |    +--- com.google.guava:guava:20.0
|    |    |    |    +--- org.threeten:threetenbp:1.3.3
|    |    |    |    +--- com.google.auth:google-auth-library-oauth2-http:0.8.0
|    |    |    |    |    +--- com.google.auth:google-auth-library-credentials:0.8.0
|    |    |    |    |    +--- com.google.http-client:google-http-client:1.19.0 -> 1.22.0 (*)
|    |    |    |    |    +--- com.google.http-client:google-http-client-jackson2:1.19.0 -> 1.22.0
|    |    |    |    |    |    +--- com.google.http-client:google-http-client:1.22.0 (*)
|    |    |    |    |    |    \--- com.fasterxml.jackson.core:jackson-core:2.1.3 -> 2.6.5
|    |    |    |    |    \--- com.google.guava:guava:19.0 -> 20.0
|    |    |    |    \--- com.google.api:api-common:1.1.0 (*)
|    |    |    +--- com.google.protobuf:protobuf-java-util:3.3.1
|    |    |    |    +--- com.google.protobuf:protobuf-java:3.3.1
|    |    |    |    +--- com.google.guava:guava:19.0 -> 20.0
|    |    |    |    \--- com.google.code.gson:gson:2.7
|    |    |    +--- com.google.api.grpc:proto-google-common-protos:0.1.18
|    |    |    |    +--- com.google.protobuf:protobuf-java:3.3.0 -> 3.3.1
|    |    |    |    \--- com.google.api:api-common:1.1.0 (*)
|    |    |    \--- com.google.api.grpc:proto-google-iam-v1:0.1.18
|    |    |         +--- com.google.api.grpc:proto-google-common-protos:0.1.18 (*)
|    |    |         +--- com.google.protobuf:protobuf-java:3.3.0 -> 3.3.1
|    |    |         \--- com.google.api:api-common:1.1.0 (*)
|    |    +--- com.google.cloud:google-cloud-core-http:1.5.1
|    |    |    +--- com.google.cloud:google-cloud-core:1.5.1 (*)
|    |    |    +--- com.google.auth:google-auth-library-credentials:0.8.0
|    |    |    +--- com.google.auth:google-auth-library-oauth2-http:0.8.0 (*)
|    |    |    +--- com.google.http-client:google-http-client:1.22.0 (*)
|    |    |    +--- com.google.oauth-client:google-oauth-client:1.22.0
|    |    |    |    +--- com.google.http-client:google-http-client:1.22.0 (*)
|    |    |    |    \--- com.google.code.findbugs:jsr305:1.3.9 -> 3.0.0
|    |    |    +--- com.google.guava:guava:20.0
|    |    |    +--- com.google.api-client:google-api-client:1.22.0
|    |    |    |    +--- com.google.oauth-client:google-oauth-client:1.22.0 (*)
|    |    |    |    \--- com.google.http-client:google-http-client-jackson2:1.22.0 (*)
|    |    |    +--- com.google.http-client:google-http-client-appengine:1.22.0
|    |    |    |    \--- com.google.http-client:google-http-client:1.22.0 (*)
|    |    |    +--- com.google.http-client:google-http-client-jackson:1.22.0
|    |    |    |    +--- com.google.http-client:google-http-client:1.22.0 (*)
|    |    |    |    \--- org.codehaus.jackson:jackson-core-asl:1.9.11 -> 1.9.13
|    |    |    \--- com.google.http-client:google-http-client-jackson2:1.22.0 (*)
|    |    \--- com.google.apis:google-api-services-storage:v1-rev108-1.22.0
|    +--- com.google.guava:guava:20.0
|    +--- com.google.code.findbugs:jsr305:3.0.0
|    \--- javax.inject:javax.inject:1
+--- org.ojalgo:ojalgo:39.0
\--- org.apache.spark:spark-mllib_2.11:2.0.2
     +--- org.apache.spark:spark-core_2.11:2.0.2
     |    +--- org.apache.avro:avro-mapred:1.7.7
     |    |    +--- org.apache.avro:avro-ipc:1.7.7
     |    |    |    +--- org.apache.avro:avro:1.7.7
     |    |    |    |    +--- org.codehaus.jackson:jackson-core-asl:1.9.13
     |    |    |    |    +--- org.codehaus.jackson:jackson-mapper-asl:1.9.13
     |    |    |    |    |    \--- org.codehaus.jackson:jackson-core-asl:1.9.13
     |    |    |    |    +--- com.thoughtworks.paranamer:paranamer:2.3 -> 2.6
     |    |    |    |    +--- org.xerial.snappy:snappy-java:1.0.5 -> 1.1.2.6
     |    |    |    |    +--- org.apache.commons:commons-compress:1.4.1
     |    |    |    |    |    \--- org.tukaani:xz:1.0
     |    |    |    |    \--- org.slf4j:slf4j-api:1.6.4 -> 1.7.16
     |    |    |    +--- org.codehaus.jackson:jackson-core-asl:1.9.13
     |    |    |    +--- org.codehaus.jackson:jackson-mapper-asl:1.9.13 (*)
     |    |    |    \--- org.slf4j:slf4j-api:1.6.4 -> 1.7.16
     |    |    +--- org.codehaus.jackson:jackson-core-asl:1.9.13
     |    |    +--- org.codehaus.jackson:jackson-mapper-asl:1.9.13 (*)
     |    |    \--- org.slf4j:slf4j-api:1.6.4 -> 1.7.16
     |    +--- com.twitter:chill_2.11:0.8.0
     |    |    +--- org.scala-lang:scala-library:2.11.7 -> 2.11.8
     |    |    +--- com.twitter:chill-java:0.8.0
     |    |    |    \--- com.esotericsoftware:kryo-shaded:3.0.3
     |    |    |         +--- com.esotericsoftware:minlog:1.3.0
     |    |    |         \--- org.objenesis:objenesis:2.1
     |    |    \--- com.esotericsoftware:kryo-shaded:3.0.3 (*)
     |    +--- com.twitter:chill-java:0.8.0 (*)
     |    +--- org.apache.xbean:xbean-asm5-shaded:4.4
     |    +--- org.apache.hadoop:hadoop-client:2.2.0
     |    |    +--- org.apache.hadoop:hadoop-common:2.2.0
     |    |    |    +--- org.apache.hadoop:hadoop-annotations:2.2.0
     |    |    |    +--- com.google.guava:guava:11.0.2 -> 20.0
     |    |    |    +--- commons-cli:commons-cli:1.2
     |    |    |    +--- org.apache.commons:commons-math:2.1
     |    |    |    +--- xmlenc:xmlenc:0.52
     |    |    |    +--- commons-httpclient:commons-httpclient:3.1
     |    |    |    |    \--- commons-codec:commons-codec:1.2 -> 1.10
     |    |    |    +--- commons-codec:commons-codec:1.4 -> 1.10
     |    |    |    +--- commons-io:commons-io:2.1
     |    |    |    +--- commons-net:commons-net:3.1
     |    |    |    +--- log4j:log4j:1.2.17
     |    |    |    +--- commons-lang:commons-lang:2.5
     |    |    |    +--- commons-configuration:commons-configuration:1.6
     |    |    |    |    +--- commons-collections:commons-collections:3.2.1
     |    |    |    |    +--- commons-lang:commons-lang:2.4 -> 2.5
     |    |    |    |    +--- commons-digester:commons-digester:1.8
     |    |    |    |    |    \--- commons-beanutils:commons-beanutils:1.7.0
     |    |    |    |    \--- commons-beanutils:commons-beanutils-core:1.8.0
     |    |    |    +--- org.slf4j:slf4j-api:1.7.5 -> 1.7.16
     |    |    |    +--- org.codehaus.jackson:jackson-core-asl:1.8.8 -> 1.9.13
     |    |    |    +--- org.apache.avro:avro:1.7.4 -> 1.7.7 (*)
     |    |    |    +--- com.google.protobuf:protobuf-java:2.5.0 -> 3.3.1
     |    |    |    +--- org.apache.hadoop:hadoop-auth:2.2.0
     |    |    |    |    +--- org.slf4j:slf4j-api:1.7.5 -> 1.7.16
     |    |    |    |    +--- commons-codec:commons-codec:1.4 -> 1.10
     |    |    |    |    +--- log4j:log4j:1.2.17
     |    |    |    |    \--- org.slf4j:slf4j-log4j12:1.7.5 -> 1.7.16
     |    |    |    |         +--- org.slf4j:slf4j-api:1.7.16
     |    |    |    |         \--- log4j:log4j:1.2.17
     |    |    |    +--- org.apache.zookeeper:zookeeper:3.4.5
     |    |    |    |    +--- org.slf4j:slf4j-api:1.6.1 -> 1.7.16
     |    |    |    |    +--- org.slf4j:slf4j-log4j12:1.6.1 -> 1.7.16 (*)
     |    |    |    |    +--- log4j:log4j:1.2.15 -> 1.2.17
     |    |    |    |    \--- jline:jline:0.9.94
     |    |    |    +--- org.apache.commons:commons-compress:1.4.1 (*)
     |    |    |    \--- org.slf4j:slf4j-log4j12:1.7.5 -> 1.7.16 (*)
     |    |    +--- org.apache.hadoop:hadoop-hdfs:2.2.0
     |    |    |    +--- com.google.guava:guava:11.0.2 -> 20.0
     |    |    |    +--- org.mortbay.jetty:jetty-util:6.1.26
     |    |    |    +--- commons-cli:commons-cli:1.2
     |    |    |    +--- commons-codec:commons-codec:1.4 -> 1.10
     |    |    |    +--- commons-io:commons-io:2.1
     |    |    |    +--- commons-lang:commons-lang:2.5
     |    |    |    +--- log4j:log4j:1.2.17
     |    |    |    +--- com.google.protobuf:protobuf-java:2.5.0 -> 3.3.1
     |    |    |    +--- org.codehaus.jackson:jackson-core-asl:1.8.8 -> 1.9.13
     |    |    |    \--- xmlenc:xmlenc:0.52
     |    |    +--- org.apache.hadoop:hadoop-mapreduce-client-app:2.2.0
     |    |    |    +--- org.apache.hadoop:hadoop-mapreduce-client-common:2.2.0
     |    |    |    |    +--- org.apache.hadoop:hadoop-yarn-common:2.2.0
     |    |    |    |    |    +--- log4j:log4j:1.2.17
     |    |    |    |    |    +--- org.apache.hadoop:hadoop-yarn-api:2.2.0
     |    |    |    |    |    |    +--- org.slf4j:slf4j-api:1.7.5 -> 1.7.16
     |    |    |    |    |    |    +--- org.slf4j:slf4j-log4j12:1.7.5 -> 1.7.16 (*)
     |    |    |    |    |    |    +--- com.google.protobuf:protobuf-java:2.5.0 -> 3.3.1
     |    |    |    |    |    |    +--- commons-io:commons-io:2.1
     |    |    |    |    |    |    \--- com.google.inject:guice:3.0
     |    |    |    |    |    |         +--- javax.inject:javax.inject:1
     |    |    |    |    |    |         +--- aopalliance:aopalliance:1.0
     |    |    |    |    |    |         \--- org.sonatype.sisu.inject:cglib:2.2.1-v20090111
     |    |    |    |    |    +--- org.slf4j:slf4j-api:1.7.5 -> 1.7.16
     |    |    |    |    |    +--- org.slf4j:slf4j-log4j12:1.7.5 -> 1.7.16 (*)
     |    |    |    |    |    +--- com.google.protobuf:protobuf-java:2.5.0 -> 3.3.1
     |    |    |    |    |    +--- commons-io:commons-io:2.1
     |    |    |    |    |    \--- com.google.inject:guice:3.0 (*)
     |    |    |    |    +--- org.apache.hadoop:hadoop-yarn-client:2.2.0
     |    |    |    |    |    +--- org.apache.hadoop:hadoop-yarn-api:2.2.0 (*)
     |    |    |    |    |    +--- org.apache.hadoop:hadoop-yarn-common:2.2.0 (*)
     |    |    |    |    |    +--- org.slf4j:slf4j-api:1.7.5 -> 1.7.16
     |    |    |    |    |    +--- org.slf4j:slf4j-log4j12:1.7.5 -> 1.7.16 (*)
     |    |    |    |    |    +--- com.google.protobuf:protobuf-java:2.5.0 -> 3.3.1
     |    |    |    |    |    +--- commons-io:commons-io:2.1
     |    |    |    |    |    \--- com.google.inject:guice:3.0 (*)
     |    |    |    |    +--- org.apache.hadoop:hadoop-mapreduce-client-core:2.2.0
     |    |    |    |    |    +--- org.apache.hadoop:hadoop-yarn-common:2.2.0 (*)
     |    |    |    |    |    +--- com.google.protobuf:protobuf-java:2.5.0 -> 3.3.1
     |    |    |    |    |    +--- org.slf4j:slf4j-api:1.7.5 -> 1.7.16
     |    |    |    |    |    \--- org.slf4j:slf4j-log4j12:1.7.5 -> 1.7.16 (*)
     |    |    |    |    +--- org.apache.hadoop:hadoop-yarn-server-common:2.2.0
     |    |    |    |    |    +--- org.apache.hadoop:hadoop-yarn-common:2.2.0 (*)
     |    |    |    |    |    +--- org.apache.zookeeper:zookeeper:3.4.5 (*)
     |    |    |    |    |    +--- org.slf4j:slf4j-api:1.7.5 -> 1.7.16
     |    |    |    |    |    +--- org.slf4j:slf4j-log4j12:1.7.5 -> 1.7.16 (*)
     |    |    |    |    |    +--- com.google.protobuf:protobuf-java:2.5.0 -> 3.3.1
     |    |    |    |    |    +--- commons-io:commons-io:2.1
     |    |    |    |    |    \--- com.google.inject:guice:3.0 (*)
     |    |    |    |    +--- com.google.protobuf:protobuf-java:2.5.0 -> 3.3.1
     |    |    |    |    +--- org.slf4j:slf4j-api:1.7.5 -> 1.7.16
     |    |    |    |    \--- org.slf4j:slf4j-log4j12:1.7.5 -> 1.7.16 (*)
     |    |    |    +--- org.apache.hadoop:hadoop-mapreduce-client-shuffle:2.2.0
     |    |    |    |    +--- org.apache.hadoop:hadoop-yarn-server-nodemanager:2.2.0
     |    |    |    |    |    +--- org.apache.hadoop:hadoop-yarn-server-common:2.2.0 (*)
     |    |    |    |    |    +--- org.slf4j:slf4j-api:1.7.5 -> 1.7.16
     |    |    |    |    |    +--- org.slf4j:slf4j-log4j12:1.7.5 -> 1.7.16 (*)
     |    |    |    |    |    +--- com.google.protobuf:protobuf-java:2.5.0 -> 3.3.1
     |    |    |    |    |    +--- commons-io:commons-io:2.1
     |    |    |    |    |    \--- com.google.inject:guice:3.0 (*)
     |    |    |    |    +--- org.apache.hadoop:hadoop-mapreduce-client-core:2.2.0 (*)
     |    |    |    |    +--- com.google.protobuf:protobuf-java:2.5.0 -> 3.3.1
     |    |    |    |    +--- org.slf4j:slf4j-api:1.7.5 -> 1.7.16
     |    |    |    |    \--- org.slf4j:slf4j-log4j12:1.7.5 -> 1.7.16 (*)
     |    |    |    +--- com.google.protobuf:protobuf-java:2.5.0 -> 3.3.1
     |    |    |    +--- org.slf4j:slf4j-api:1.7.5 -> 1.7.16
     |    |    |    \--- org.slf4j:slf4j-log4j12:1.7.5 -> 1.7.16 (*)
     |    |    +--- org.apache.hadoop:hadoop-yarn-api:2.2.0 (*)
     |    |    +--- org.apache.hadoop:hadoop-mapreduce-client-core:2.2.0 (*)
     |    |    +--- org.apache.hadoop:hadoop-mapreduce-client-jobclient:2.2.0
     |    |    |    +--- org.apache.hadoop:hadoop-mapreduce-client-common:2.2.0 (*)
     |    |    |    +--- org.apache.hadoop:hadoop-mapreduce-client-shuffle:2.2.0 (*)
     |    |    |    +--- com.google.protobuf:protobuf-java:2.5.0 -> 3.3.1
     |    |    |    +--- org.slf4j:slf4j-api:1.7.5 -> 1.7.16
     |    |    |    \--- org.slf4j:slf4j-log4j12:1.7.5 -> 1.7.16 (*)
     |    |    \--- org.apache.hadoop:hadoop-annotations:2.2.0
     |    +--- org.apache.spark:spark-launcher_2.11:2.0.2
     |    |    +--- org.apache.spark:spark-tags_2.11:2.0.2
     |    |    |    +--- org.scalatest:scalatest_2.11:2.2.6
     |    |    |    |    +--- org.scala-lang:scala-library:2.11.7 -> 2.11.8
     |    |    |    |    +--- org.scala-lang:scala-reflect:2.11.7 -> 2.11.8
     |    |    |    |    |    \--- org.scala-lang:scala-library:2.11.8
     |    |    |    |    \--- org.scala-lang.modules:scala-xml_2.11:1.0.2
     |    |    |    |         \--- org.scala-lang:scala-library:2.11.1 -> 2.11.8
     |    |    |    \--- org.spark-project.spark:unused:1.0.0
     |    |    \--- org.spark-project.spark:unused:1.0.0
     |    +--- org.apache.spark:spark-network-common_2.11:2.0.2
     |    |    +--- io.netty:netty-all:4.0.29.Final
     |    |    +--- org.fusesource.leveldbjni:leveldbjni-all:1.8
     |    |    +--- com.fasterxml.jackson.core:jackson-databind:2.6.5
     |    |    |    +--- com.fasterxml.jackson.core:jackson-annotations:2.6.0 -> 2.6.5
     |    |    |    \--- com.fasterxml.jackson.core:jackson-core:2.6.5
     |    |    +--- com.fasterxml.jackson.core:jackson-annotations:2.6.5
     |    |    +--- com.google.code.findbugs:jsr305:1.3.9 -> 3.0.0
     |    |    +--- org.apache.spark:spark-tags_2.11:2.0.2 (*)
     |    |    \--- org.spark-project.spark:unused:1.0.0
     |    +--- org.apache.spark:spark-network-shuffle_2.11:2.0.2
     |    |    +--- org.apache.spark:spark-network-common_2.11:2.0.2 (*)
     |    |    +--- org.apache.spark:spark-tags_2.11:2.0.2 (*)
     |    |    \--- org.spark-project.spark:unused:1.0.0
     |    +--- org.apache.spark:spark-unsafe_2.11:2.0.2
     |    |    +--- org.apache.spark:spark-tags_2.11:2.0.2 (*)
     |    |    +--- com.twitter:chill_2.11:0.8.0 (*)
     |    |    +--- com.google.code.findbugs:jsr305:1.3.9 -> 3.0.0
     |    |    \--- org.spark-project.spark:unused:1.0.0
     |    +--- net.java.dev.jets3t:jets3t:0.7.1
     |    |    +--- commons-codec:commons-codec:1.3 -> 1.10
     |    |    \--- commons-httpclient:commons-httpclient:3.1 (*)
     |    +--- org.apache.curator:curator-recipes:2.4.0
     |    |    +--- org.apache.curator:curator-framework:2.4.0
     |    |    |    +--- org.apache.curator:curator-client:2.4.0
     |    |    |    |    +--- org.slf4j:slf4j-api:1.6.4 -> 1.7.16
     |    |    |    |    +--- org.apache.zookeeper:zookeeper:3.4.5 (*)
     |    |    |    |    \--- com.google.guava:guava:14.0.1 -> 20.0
     |    |    |    +--- org.apache.zookeeper:zookeeper:3.4.5 (*)
     |    |    |    \--- com.google.guava:guava:14.0.1 -> 20.0
     |    |    +--- org.apache.zookeeper:zookeeper:3.4.5 (*)
     |    |    \--- com.google.guava:guava:14.0.1 -> 20.0
     |    +--- javax.servlet:javax.servlet-api:3.1.0
     |    +--- org.apache.commons:commons-lang3:3.3.2
     |    +--- org.apache.commons:commons-math3:3.4.1
     |    +--- com.google.code.findbugs:jsr305:1.3.9 -> 3.0.0
     |    +--- org.slf4j:slf4j-api:1.7.16
     |    +--- org.slf4j:jcl-over-slf4j:1.7.16
     |    |    \--- org.slf4j:slf4j-api:1.7.16
     |    +--- log4j:log4j:1.2.17
     |    +--- org.slf4j:slf4j-log4j12:1.7.16 (*)
     |    +--- com.ning:compress-lzf:1.0.3
     |    +--- org.xerial.snappy:snappy-java:1.1.2.6
     |    +--- net.jpountz.lz4:lz4:1.3.0
     |    +--- org.roaringbitmap:RoaringBitmap:0.5.11
     |    +--- commons-net:commons-net:2.2 -> 3.1
     |    +--- org.scala-lang:scala-library:2.11.8
     |    +--- org.json4s:json4s-jackson_2.11:3.2.11
     |    |    +--- org.scala-lang:scala-library:2.11.0 -> 2.11.8
     |    |    +--- org.json4s:json4s-core_2.11:3.2.11
     |    |    |    +--- org.scala-lang:scala-library:2.11.0 -> 2.11.8
     |    |    |    +--- org.json4s:json4s-ast_2.11:3.2.11
     |    |    |    |    \--- org.scala-lang:scala-library:2.11.0 -> 2.11.8
     |    |    |    +--- com.thoughtworks.paranamer:paranamer:2.6
     |    |    |    \--- org.scala-lang:scalap:2.11.0
     |    |    |         \--- org.scala-lang:scala-compiler:2.11.0
     |    |    |              +--- org.scala-lang:scala-library:2.11.0 -> 2.11.8
     |    |    |              +--- org.scala-lang:scala-reflect:2.11.0 -> 2.11.8 (*)
     |    |    |              +--- org.scala-lang.modules:scala-xml_2.11:1.0.1 -> 1.0.2 (*)
     |    |    |              \--- org.scala-lang.modules:scala-parser-combinators_2.11:1.0.1
     |    |    |                   \--- org.scala-lang:scala-library:2.11.0 -> 2.11.8
     |    |    \--- com.fasterxml.jackson.core:jackson-databind:2.3.1 -> 2.6.5 (*)
     |    +--- org.glassfish.jersey.core:jersey-client:2.22.2
     |    |    +--- javax.ws.rs:javax.ws.rs-api:2.0.1
     |    |    +--- org.glassfish.jersey.core:jersey-common:2.22.2
     |    |    |    +--- javax.ws.rs:javax.ws.rs-api:2.0.1
     |    |    |    +--- javax.annotation:javax.annotation-api:1.2
     |    |    |    +--- org.glassfish.jersey.bundles.repackaged:jersey-guava:2.22.2
     |    |    |    +--- org.glassfish.hk2:hk2-api:2.4.0-b34
     |    |    |    |    +--- javax.inject:javax.inject:1
     |    |    |    |    +--- org.glassfish.hk2:hk2-utils:2.4.0-b34
     |    |    |    |    |    \--- javax.inject:javax.inject:1
     |    |    |    |    \--- org.glassfish.hk2.external:aopalliance-repackaged:2.4.0-b34
     |    |    |    +--- org.glassfish.hk2.external:javax.inject:2.4.0-b34
     |    |    |    +--- org.glassfish.hk2:hk2-locator:2.4.0-b34
     |    |    |    |    +--- org.glassfish.hk2.external:javax.inject:2.4.0-b34
     |    |    |    |    +--- org.glassfish.hk2.external:aopalliance-repackaged:2.4.0-b34
     |    |    |    |    +--- org.glassfish.hk2:hk2-api:2.4.0-b34 (*)
     |    |    |    |    +--- org.glassfish.hk2:hk2-utils:2.4.0-b34 (*)
     |    |    |    |    \--- org.javassist:javassist:3.18.1-GA
     |    |    |    \--- org.glassfish.hk2:osgi-resource-locator:1.0.1
     |    |    +--- org.glassfish.hk2:hk2-api:2.4.0-b34 (*)
     |    |    +--- org.glassfish.hk2.external:javax.inject:2.4.0-b34
     |    |    \--- org.glassfish.hk2:hk2-locator:2.4.0-b34 (*)
     |    +--- org.glassfish.jersey.core:jersey-common:2.22.2 (*)
     |    +--- org.glassfish.jersey.core:jersey-server:2.22.2
     |    |    +--- org.glassfish.jersey.core:jersey-common:2.22.2 (*)
     |    |    +--- org.glassfish.jersey.core:jersey-client:2.22.2 (*)
     |    |    +--- javax.ws.rs:javax.ws.rs-api:2.0.1
     |    |    +--- org.glassfish.jersey.media:jersey-media-jaxb:2.22.2
     |    |    |    +--- org.glassfish.jersey.core:jersey-common:2.22.2 (*)
     |    |    |    +--- org.glassfish.hk2:hk2-api:2.4.0-b34 (*)
     |    |    |    +--- org.glassfish.hk2.external:javax.inject:2.4.0-b34
     |    |    |    +--- org.glassfish.hk2:hk2-locator:2.4.0-b34 (*)
     |    |    |    \--- org.glassfish.hk2:osgi-resource-locator:1.0.1
     |    |    +--- javax.annotation:javax.annotation-api:1.2
     |    |    +--- org.glassfish.hk2:hk2-api:2.4.0-b34 (*)
     |    |    +--- org.glassfish.hk2.external:javax.inject:2.4.0-b34
     |    |    +--- org.glassfish.hk2:hk2-locator:2.4.0-b34 (*)
     |    |    \--- javax.validation:validation-api:1.1.0.Final
     |    +--- org.glassfish.jersey.containers:jersey-container-servlet:2.22.2
     |    |    +--- org.glassfish.jersey.containers:jersey-container-servlet-core:2.22.2
     |    |    |    +--- org.glassfish.hk2.external:javax.inject:2.4.0-b34
     |    |    |    +--- org.glassfish.jersey.core:jersey-common:2.22.2 (*)
     |    |    |    +--- org.glassfish.jersey.core:jersey-server:2.22.2 (*)
     |    |    |    \--- javax.ws.rs:javax.ws.rs-api:2.0.1
     |    |    +--- org.glassfish.jersey.core:jersey-common:2.22.2 (*)
     |    |    +--- org.glassfish.jersey.core:jersey-server:2.22.2 (*)
     |    |    \--- javax.ws.rs:javax.ws.rs-api:2.0.1
     |    +--- org.glassfish.jersey.containers:jersey-container-servlet-core:2.22.2 (*)
     |    +--- org.apache.mesos:mesos:0.21.1
     |    +--- io.netty:netty-all:4.0.29.Final
     |    +--- io.netty:netty:3.8.0.Final
     |    +--- com.clearspring.analytics:stream:2.7.0
     |    +--- io.dropwizard.metrics:metrics-core:3.1.2
     |    |    \--- org.slf4j:slf4j-api:1.7.7 -> 1.7.16
     |    +--- io.dropwizard.metrics:metrics-jvm:3.1.2
     |    |    +--- io.dropwizard.metrics:metrics-core:3.1.2 (*)
     |    |    \--- org.slf4j:slf4j-api:1.7.7 -> 1.7.16
     |    +--- io.dropwizard.metrics:metrics-json:3.1.2
     |    |    +--- io.dropwizard.metrics:metrics-core:3.1.2 (*)
     |    |    +--- com.fasterxml.jackson.core:jackson-databind:2.4.2 -> 2.6.5 (*)
     |    |    \--- org.slf4j:slf4j-api:1.7.7 -> 1.7.16
     |    +--- io.dropwizard.metrics:metrics-graphite:3.1.2
     |    |    +--- io.dropwizard.metrics:metrics-core:3.1.2 (*)
     |    |    \--- org.slf4j:slf4j-api:1.7.7 -> 1.7.16
     |    +--- com.fasterxml.jackson.core:jackson-databind:2.6.5 (*)
     |    +--- com.fasterxml.jackson.module:jackson-module-scala_2.11:2.6.5
     |    |    +--- org.scala-lang:scala-library:2.11.7 -> 2.11.8
     |    |    +--- org.scala-lang:scala-reflect:2.11.7 -> 2.11.8 (*)
     |    |    +--- com.fasterxml.jackson.core:jackson-core:2.6.5
     |    |    +--- com.fasterxml.jackson.core:jackson-annotations:2.6.5
     |    |    +--- com.fasterxml.jackson.core:jackson-databind:2.6.5 (*)
     |    |    \--- com.fasterxml.jackson.module:jackson-module-paranamer:2.6.5
     |    |         +--- com.fasterxml.jackson.core:jackson-databind:2.6.5 (*)
     |    |         \--- com.thoughtworks.paranamer:paranamer:2.6
     |    +--- org.apache.ivy:ivy:2.4.0
     |    +--- oro:oro:2.0.8
     |    +--- net.razorvine:pyrolite:4.13
     |    +--- net.sf.py4j:py4j:0.10.3
     |    +--- org.apache.spark:spark-tags_2.11:2.0.2 (*)
     |    \--- org.spark-project.spark:unused:1.0.0
     +--- org.apache.spark:spark-streaming_2.11:2.0.2
     |    +--- org.apache.spark:spark-core_2.11:2.0.2 (*)
     |    +--- org.apache.spark:spark-tags_2.11:2.0.2 (*)
     |    +--- org.scala-lang:scala-library:2.11.8
     |    \--- org.spark-project.spark:unused:1.0.0
     +--- org.apache.spark:spark-sql_2.11:2.0.2
     |    +--- com.univocity:univocity-parsers:2.1.1
     |    +--- org.apache.spark:spark-sketch_2.11:2.0.2
     |    |    +--- org.apache.spark:spark-tags_2.11:2.0.2 (*)
     |    |    \--- org.spark-project.spark:unused:1.0.0
     |    +--- org.apache.spark:spark-core_2.11:2.0.2 (*)
     |    +--- org.apache.spark:spark-catalyst_2.11:2.0.2
     |    |    +--- org.scala-lang:scala-reflect:2.11.8 (*)
     |    |    +--- org.apache.spark:spark-core_2.11:2.0.2 (*)
     |    |    +--- org.apache.spark:spark-tags_2.11:2.0.2 (*)
     |    |    +--- org.apache.spark:spark-unsafe_2.11:2.0.2 (*)
     |    |    +--- org.codehaus.janino:janino:2.7.8
     |    |    |    \--- org.codehaus.janino:commons-compiler:2.7.8
     |    |    +--- org.antlr:antlr4-runtime:4.5.3
     |    |    +--- commons-codec:commons-codec:1.10
     |    |    \--- org.spark-project.spark:unused:1.0.0
     |    +--- org.apache.spark:spark-tags_2.11:2.0.2 (*)
     |    +--- org.apache.parquet:parquet-column:1.7.0
     |    |    +--- org.apache.parquet:parquet-common:1.7.0
     |    |    +--- org.apache.parquet:parquet-encoding:1.7.0
     |    |    |    +--- org.apache.parquet:parquet-common:1.7.0
     |    |    |    +--- org.apache.parquet:parquet-generator:1.7.0
     |    |    |    |    \--- org.apache.parquet:parquet-common:1.7.0
     |    |    |    \--- commons-codec:commons-codec:1.5 -> 1.10
     |    |    \--- commons-codec:commons-codec:1.5 -> 1.10
     |    +--- org.apache.parquet:parquet-hadoop:1.7.0
     |    |    +--- org.apache.parquet:parquet-column:1.7.0 (*)
     |    |    +--- org.apache.parquet:parquet-format:2.3.0-incubating
     |    |    +--- org.apache.parquet:parquet-jackson:1.7.0
     |    |    +--- org.codehaus.jackson:jackson-mapper-asl:1.9.11 -> 1.9.13 (*)
     |    |    +--- org.codehaus.jackson:jackson-core-asl:1.9.11 -> 1.9.13
     |    |    \--- org.xerial.snappy:snappy-java:1.1.1.6 -> 1.1.2.6
     |    +--- com.fasterxml.jackson.core:jackson-databind:2.6.5 (*)
     |    \--- org.spark-project.spark:unused:1.0.0
     +--- org.apache.spark:spark-graphx_2.11:2.0.2
     |    +--- org.apache.spark:spark-core_2.11:2.0.2 (*)
     |    +--- org.apache.xbean:xbean-asm5-shaded:4.4
     |    +--- com.github.fommil.netlib:core:1.1.2
     |    |    \--- net.sourceforge.f2j:arpack_combined_all:0.1
     |    +--- net.sourceforge.f2j:arpack_combined_all:0.1
     |    +--- org.apache.spark:spark-tags_2.11:2.0.2 (*)
     |    \--- org.spark-project.spark:unused:1.0.0
     +--- org.apache.spark:spark-mllib-local_2.11:2.0.2
     |    +--- org.scalanlp:breeze_2.11:0.11.2
     |    |    +--- org.scala-lang:scala-library:2.11.4 -> 2.11.8
     |    |    +--- org.scalanlp:breeze-macros_2.11:0.11.2
     |    |    |    +--- org.scala-lang:scala-library:2.11.4 -> 2.11.8
     |    |    |    \--- org.scala-lang:scala-reflect:2.11.4 -> 2.11.8 (*)
     |    |    +--- com.github.fommil.netlib:core:1.1.2 (*)
     |    |    +--- net.sourceforge.f2j:arpack_combined_all:0.1
     |    |    +--- net.sf.opencsv:opencsv:2.3
     |    |    +--- com.github.rwl:jtransforms:2.4.0
     |    |    +--- org.spire-math:spire_2.11:0.7.4
     |    |    |    +--- org.scala-lang:scala-library:2.11.0 -> 2.11.8
     |    |    |    +--- org.spire-math:spire-macros_2.11:0.7.4
     |    |    |    |    +--- org.scala-lang:scala-library:2.11.0 -> 2.11.8
     |    |    |    |    \--- org.scala-lang:scala-reflect:2.11.0 -> 2.11.8 (*)
     |    |    |    \--- org.scala-lang:scala-reflect:2.11.0 -> 2.11.8 (*)
     |    |    \--- org.slf4j:slf4j-api:1.7.5 -> 1.7.16
     |    +--- org.apache.commons:commons-math3:3.4.1
     |    +--- org.apache.spark:spark-tags_2.11:2.0.2 (*)
     |    \--- org.spark-project.spark:unused:1.0.0
     +--- org.scalanlp:breeze_2.11:0.11.2 (*)
     +--- org.apache.commons:commons-math3:3.4.1
     +--- org.jpmml:pmml-model:1.2.15
     |    \--- org.jpmml:pmml-schema:1.2.15
     +--- org.apache.spark:spark-tags_2.11:2.0.2 (*)
     \--- org.spark-project.spark:unused:1.0.0

(*) - dependencies omitted (listed previously)

BUILD SUCCESSFUL

Total time: 5.179 secs
btilford commented 6 years ago

I think I ran into this as well after updating and adding some dependencies; there were no problems before that.

DIFF

diff --git a/pom.xml b/pom.xml
index c2fc611..43cc0ee 100644
--- a/pom.xml
+++ b/pom.xml
@@ -19,7 +19,7 @@
         <maven.compiler.source>1.8</maven.compiler.source>
         <maven.compiler.target>1.8</maven.compiler.target>

-        <google-api-client.version>1.22.0</google-api-client.version>
+        <google-api-client.version>1.23.0</google-api-client.version>
     </properties>

     <dependencyManagement>
@@ -103,6 +103,11 @@
         </dependencies>
     </dependencyManagement>
     <dependencies>
+        <!--<dependency>
+            <groupId>org.mongodb.scala</groupId>
+            <artifactId>mongo-scala-driver</artifactId>
+            <version>2.1.0</version>
+        </dependency>-->
         <dependency>
             <groupId>com.google.guava</groupId>
             <artifactId>guava</artifactId>
@@ -133,11 +138,11 @@
             <artifactId>scala-library</artifactId>
             <version>2.11.7</version>
         </dependency>
-<!--        <dependency>
+        <dependency>
             <groupId>org.apache.spark</groupId>
-            <artifactId>spark-mlib_2.11</artifactId>
+            <artifactId>spark-mllib_2.11</artifactId>
             <version>${spark.version}</version>
-        </dependency>-->
+        </dependency>
         <dependency>
             <groupId>org.apache.spark</groupId>
             <artifactId>spark-yarn_2.11</artifactId>
@@ -196,11 +201,11 @@
                 </exclusion>
             </exclusions>-->
         </dependency>
-        <!--<dependency>
+        <dependency>
             <groupId>com.google.apis</groupId>
             <artifactId>google-api-services-analyticsreporting</artifactId>
             <version>v4-rev116-1.23.0</version>
-        </dependency>-->
+        </dependency>
         <dependency>
             <groupId>com.google.api-client</groupId>
             <artifactId>google-api-client</artifactId>

Dependency Tree

[INFO] com.xyz.my-project:jar:1.0.0-SNAPSHOT
[INFO] +- com.google.guava:guava:jar:18.0:compile
[INFO] +- com.spotify:spark-bigquery_2.11:jar:0.2.0:compile
[INFO] |  +- com.databricks:spark-avro_2.11:jar:3.0.0:compile
[INFO] |  +- org.slf4j:slf4j-simple:jar:1.7.21:compile
[INFO] |  \- joda-time:joda-time:jar:2.9.3:compile
[INFO] +- co.cask.hydrator:hbase-plugins:jar:1.8.1:compile
[INFO] |  +- co.cask.cdap:cdap-formats:jar:4.3.1:compile
[INFO] |  |  +- co.cask.cdap:cdap-spi:jar:4.3.1:compile
[INFO] |  |  |  \- co.cask.cdap:cdap-api-common:jar:4.3.1:compile
[INFO] |  |  \- io.thekraken:grok:jar:0.1.0:compile
[INFO] |  |     \- com.github.tony19:named-regexp:jar:0.2.3:compile
[INFO] |  +- co.cask.hydrator:hydrator-common:jar:1.8.1:compile
[INFO] |  +- org.apache.zookeeper:zookeeper:jar:3.4.5:compile
[INFO] |  \- org.apache.hbase:hbase-server:jar:0.98.6.1-hadoop2:compile
[INFO] |     +- org.apache.hbase:hbase-common:jar:0.98.6.1-hadoop2:compile
[INFO] |     +- org.apache.hbase:hbase-protocol:jar:0.98.6.1-hadoop2:compile
[INFO] |     +- org.apache.hbase:hbase-client:jar:0.98.6.1-hadoop2:compile
[INFO] |     +- org.apache.hbase:hbase-prefix-tree:jar:0.98.6.1-hadoop2:runtime
[INFO] |     |  \- org.apache.hbase:hbase-common:jar:tests:0.98.6.1-hadoop2:runtime
[INFO] |     +- commons-httpclient:commons-httpclient:jar:3.1:compile
[INFO] |     +- commons-collections:commons-collections:jar:3.2.1:compile
[INFO] |     +- com.yammer.metrics:metrics-core:jar:2.2.0:compile
[INFO] |     +- commons-cli:commons-cli:jar:1.2:compile
[INFO] |     +- com.github.stephenc.high-scale-lib:high-scale-lib:jar:1.1.1:compile
[INFO] |     +- commons-io:commons-io:jar:2.4:compile
[INFO] |     +- commons-lang:commons-lang:jar:2.6:compile
[INFO] |     +- commons-logging:commons-logging:jar:1.1.1:compile
[INFO] |     +- org.apache.commons:commons-math:jar:2.1:compile
[INFO] |     +- log4j:log4j:jar:1.2.17:compile
[INFO] |     +- org.mortbay.jetty:jetty:jar:6.1.26:compile
[INFO] |     +- org.mortbay.jetty:jetty-util:jar:6.1.26:compile
[INFO] |     +- org.mortbay.jetty:jetty-sslengine:jar:6.1.26:compile
[INFO] |     +- org.mortbay.jetty:jsp-2.1:jar:6.1.14:compile
[INFO] |     +- org.mortbay.jetty:jsp-api-2.1:jar:6.1.14:compile
[INFO] |     +- org.mortbay.jetty:servlet-api-2.5:jar:6.1.14:compile
[INFO] |     +- org.codehaus.jackson:jackson-core-asl:jar:1.8.8:compile
[INFO] |     +- org.codehaus.jackson:jackson-mapper-asl:jar:1.8.8:compile
[INFO] |     +- org.codehaus.jackson:jackson-jaxrs:jar:1.8.8:compile
[INFO] |     +- tomcat:jasper-compiler:jar:5.5.23:compile
[INFO] |     +- tomcat:jasper-runtime:jar:5.5.23:runtime
[INFO] |     |  \- commons-el:commons-el:jar:1.0:runtime
[INFO] |     +- org.jamon:jamon-runtime:jar:2.3.1:compile
[INFO] |     +- com.google.protobuf:protobuf-java:jar:2.5.0:compile
[INFO] |     +- com.sun.jersey:jersey-core:jar:1.8:compile
[INFO] |     +- com.sun.jersey:jersey-json:jar:1.8:compile
[INFO] |     |  +- org.codehaus.jettison:jettison:jar:1.1:compile
[INFO] |     |  \- com.sun.xml.bind:jaxb-impl:jar:2.2.3-1:compile
[INFO] |     +- com.sun.jersey:jersey-server:jar:1.8:compile
[INFO] |     |  \- asm:asm:jar:3.1:compile
[INFO] |     +- javax.xml.bind:jaxb-api:jar:2.2.2:compile
[INFO] |     |  \- javax.activation:activation:jar:1.1:compile
[INFO] |     +- org.cloudera.htrace:htrace-core:jar:2.04:compile
[INFO] |     +- com.github.stephenc.findbugs:findbugs-annotations:jar:1.3.9-1:compile
[INFO] |     \- junit:junit:jar:4.11:compile
[INFO] |        \- org.hamcrest:hamcrest-core:jar:1.3:compile
[INFO] +- org.apache.spark:spark-sql_2.11:jar:2.2.0:compile
[INFO] |  +- com.univocity:univocity-parsers:jar:2.2.1:compile
[INFO] |  +- org.apache.spark:spark-sketch_2.11:jar:2.2.0:compile
[INFO] |  +- org.apache.spark:spark-core_2.11:jar:2.2.0:compile
[INFO] |  |  +- com.twitter:chill_2.11:jar:0.8.0:compile
[INFO] |  |  |  \- com.esotericsoftware:kryo-shaded:jar:3.0.3:compile
[INFO] |  |  |     +- com.esotericsoftware:minlog:jar:1.3.0:compile
[INFO] |  |  |     \- org.objenesis:objenesis:jar:2.1:compile
[INFO] |  |  +- com.twitter:chill-java:jar:0.8.0:compile
[INFO] |  |  +- org.apache.spark:spark-launcher_2.11:jar:2.2.0:compile
[INFO] |  |  +- org.apache.spark:spark-network-common_2.11:jar:2.2.0:compile
[INFO] |  |  |  \- org.fusesource.leveldbjni:leveldbjni-all:jar:1.8:compile
[INFO] |  |  +- org.apache.spark:spark-network-shuffle_2.11:jar:2.2.0:compile
[INFO] |  |  +- org.apache.spark:spark-unsafe_2.11:jar:2.2.0:compile
[INFO] |  |  +- net.java.dev.jets3t:jets3t:jar:0.9.3:compile
[INFO] |  |  |  +- org.apache.httpcomponents:httpcore:jar:4.3.3:compile
[INFO] |  |  |  +- org.apache.httpcomponents:httpclient:jar:4.3.6:compile
[INFO] |  |  |  +- mx4j:mx4j:jar:3.0.2:compile
[INFO] |  |  |  +- javax.mail:mail:jar:1.4.7:compile
[INFO] |  |  |  +- org.bouncycastle:bcprov-jdk15on:jar:1.51:compile
[INFO] |  |  |  \- com.jamesmurty.utils:java-xmlbuilder:jar:1.0:compile
[INFO] |  |  |     \- net.iharder:base64:jar:2.3.8:compile
[INFO] |  |  +- org.apache.curator:curator-recipes:jar:2.6.0:compile
[INFO] |  |  |  \- org.apache.curator:curator-framework:jar:2.6.0:compile
[INFO] |  |  +- javax.servlet:javax.servlet-api:jar:3.1.0:compile
[INFO] |  |  +- org.apache.commons:commons-lang3:jar:3.5:compile
[INFO] |  |  +- org.slf4j:jul-to-slf4j:jar:1.7.16:compile
[INFO] |  |  +- org.slf4j:jcl-over-slf4j:jar:1.7.16:compile
[INFO] |  |  +- org.slf4j:slf4j-log4j12:jar:1.7.16:compile
[INFO] |  |  +- com.ning:compress-lzf:jar:1.0.3:compile
[INFO] |  |  +- org.xerial.snappy:snappy-java:jar:1.1.2.6:compile
[INFO] |  |  +- net.jpountz.lz4:lz4:jar:1.3.0:compile
[INFO] |  |  +- org.roaringbitmap:RoaringBitmap:jar:0.5.11:compile
[INFO] |  |  +- commons-net:commons-net:jar:2.2:compile
[INFO] |  |  +- org.json4s:json4s-jackson_2.11:jar:3.2.11:compile
[INFO] |  |  |  \- org.json4s:json4s-core_2.11:jar:3.2.11:compile
[INFO] |  |  |     +- org.json4s:json4s-ast_2.11:jar:3.2.11:compile
[INFO] |  |  |     \- org.scala-lang:scalap:jar:2.11.0:compile
[INFO] |  |  |        \- org.scala-lang:scala-compiler:jar:2.11.0:compile
[INFO] |  |  |           +- org.scala-lang.modules:scala-xml_2.11:jar:1.0.1:compile
[INFO] |  |  |           \- org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.1:compile
[INFO] |  |  +- org.glassfish.jersey.core:jersey-client:jar:2.22.2:compile
[INFO] |  |  |  +- javax.ws.rs:javax.ws.rs-api:jar:2.0.1:compile
[INFO] |  |  |  +- org.glassfish.hk2:hk2-api:jar:2.4.0-b34:compile
[INFO] |  |  |  |  +- org.glassfish.hk2:hk2-utils:jar:2.4.0-b34:compile
[INFO] |  |  |  |  \- org.glassfish.hk2.external:aopalliance-repackaged:jar:2.4.0-b34:compile
[INFO] |  |  |  +- org.glassfish.hk2.external:javax.inject:jar:2.4.0-b34:compile
[INFO] |  |  |  \- org.glassfish.hk2:hk2-locator:jar:2.4.0-b34:compile
[INFO] |  |  |     \- org.javassist:javassist:jar:3.18.1-GA:compile
[INFO] |  |  +- org.glassfish.jersey.core:jersey-common:jar:2.22.2:compile
[INFO] |  |  |  +- javax.annotation:javax.annotation-api:jar:1.2:compile
[INFO] |  |  |  +- org.glassfish.jersey.bundles.repackaged:jersey-guava:jar:2.22.2:compile
[INFO] |  |  |  \- org.glassfish.hk2:osgi-resource-locator:jar:1.0.1:compile
[INFO] |  |  +- org.glassfish.jersey.core:jersey-server:jar:2.22.2:compile
[INFO] |  |  |  +- org.glassfish.jersey.media:jersey-media-jaxb:jar:2.22.2:compile
[INFO] |  |  |  \- javax.validation:validation-api:jar:1.1.0.Final:compile
[INFO] |  |  +- org.glassfish.jersey.containers:jersey-container-servlet:jar:2.22.2:compile
[INFO] |  |  +- org.glassfish.jersey.containers:jersey-container-servlet-core:jar:2.22.2:compile
[INFO] |  |  +- io.netty:netty-all:jar:4.0.43.Final:compile
[INFO] |  |  +- io.netty:netty:jar:3.9.9.Final:compile
[INFO] |  |  +- com.clearspring.analytics:stream:jar:2.7.0:compile
[INFO] |  |  +- io.dropwizard.metrics:metrics-core:jar:3.1.2:compile
[INFO] |  |  +- io.dropwizard.metrics:metrics-jvm:jar:3.1.2:compile
[INFO] |  |  +- io.dropwizard.metrics:metrics-json:jar:3.1.2:compile
[INFO] |  |  +- io.dropwizard.metrics:metrics-graphite:jar:3.1.2:compile
[INFO] |  |  +- com.fasterxml.jackson.module:jackson-module-scala_2.11:jar:2.6.5:compile
[INFO] |  |  |  \- com.fasterxml.jackson.module:jackson-module-paranamer:jar:2.6.5:compile
[INFO] |  |  +- org.apache.ivy:ivy:jar:2.4.0:compile
[INFO] |  |  +- oro:oro:jar:2.0.8:compile
[INFO] |  |  +- net.razorvine:pyrolite:jar:4.13:compile
[INFO] |  |  +- net.sf.py4j:py4j:jar:0.10.4:compile
[INFO] |  |  \- org.apache.commons:commons-crypto:jar:1.0.0:compile
[INFO] |  +- org.apache.spark:spark-catalyst_2.11:jar:2.2.0:compile
[INFO] |  |  +- org.scala-lang:scala-reflect:jar:2.11.8:compile
[INFO] |  |  +- org.codehaus.janino:janino:jar:3.0.0:compile
[INFO] |  |  +- org.codehaus.janino:commons-compiler:jar:3.0.0:compile
[INFO] |  |  +- org.antlr:antlr4-runtime:jar:4.5.3:compile
[INFO] |  |  \- commons-codec:commons-codec:jar:1.10:compile
[INFO] |  +- org.apache.spark:spark-tags_2.11:jar:2.2.0:compile
[INFO] |  +- org.apache.parquet:parquet-column:jar:1.8.2:compile
[INFO] |  |  +- org.apache.parquet:parquet-common:jar:1.8.2:compile
[INFO] |  |  \- org.apache.parquet:parquet-encoding:jar:1.8.2:compile
[INFO] |  +- org.apache.parquet:parquet-hadoop:jar:1.8.2:compile
[INFO] |  |  +- org.apache.parquet:parquet-format:jar:2.3.1:compile
[INFO] |  |  \- org.apache.parquet:parquet-jackson:jar:1.8.2:compile
[INFO] |  +- com.fasterxml.jackson.core:jackson-databind:jar:2.6.5:compile
[INFO] |  |  +- com.fasterxml.jackson.core:jackson-annotations:jar:2.6.0:compile
[INFO] |  |  \- com.fasterxml.jackson.core:jackson-core:jar:2.6.5:compile
[INFO] |  +- org.apache.xbean:xbean-asm5-shaded:jar:4.4:compile
[INFO] |  \- org.spark-project.spark:unused:jar:1.0.0:compile
[INFO] +- org.scala-lang:scala-library:jar:2.11.7:compile
[INFO] +- org.apache.spark:spark-mllib_2.11:jar:2.2.0:compile
[INFO] |  +- org.apache.spark:spark-graphx_2.11:jar:2.2.0:compile
[INFO] |  |  +- com.github.fommil.netlib:core:jar:1.1.2:compile
[INFO] |  |  \- net.sourceforge.f2j:arpack_combined_all:jar:0.1:compile
[INFO] |  +- org.apache.spark:spark-mllib-local_2.11:jar:2.2.0:compile
[INFO] |  +- org.scalanlp:breeze_2.11:jar:0.13.1:compile
[INFO] |  |  +- org.scalanlp:breeze-macros_2.11:jar:0.13.1:compile
[INFO] |  |  +- net.sf.opencsv:opencsv:jar:2.3:compile
[INFO] |  |  +- com.github.rwl:jtransforms:jar:2.4.0:compile
[INFO] |  |  +- org.spire-math:spire_2.11:jar:0.13.0:compile
[INFO] |  |  |  +- org.spire-math:spire-macros_2.11:jar:0.13.0:compile
[INFO] |  |  |  \- org.typelevel:machinist_2.11:jar:0.6.1:compile
[INFO] |  |  \- com.chuusai:shapeless_2.11:jar:2.3.2:compile
[INFO] |  |     \- org.typelevel:macro-compat_2.11:jar:1.1.1:compile
[INFO] |  +- org.apache.commons:commons-math3:jar:3.4.1:compile
[INFO] |  \- org.jpmml:pmml-model:jar:1.2.15:compile
[INFO] |     \- org.jpmml:pmml-schema:jar:1.2.15:compile
[INFO] +- org.apache.spark:spark-yarn_2.11:jar:2.2.0:compile
[INFO] |  +- org.apache.hadoop:hadoop-yarn-api:jar:2.8.1:compile
[INFO] |  |  \- org.apache.hadoop:hadoop-annotations:jar:2.8.1:compile
[INFO] |  |     \- jdk.tools:jdk.tools:jar:1.8:system
[INFO] |  +- org.apache.hadoop:hadoop-yarn-common:jar:2.8.1:compile
[INFO] |  |  +- org.apache.commons:commons-compress:jar:1.4.1:compile
[INFO] |  |  |  \- org.tukaani:xz:jar:1.0:compile
[INFO] |  |  +- org.codehaus.jackson:jackson-xc:jar:1.9.13:compile
[INFO] |  |  +- com.google.inject.extensions:guice-servlet:jar:3.0:compile
[INFO] |  |  \- com.google.inject:guice:jar:3.0:compile
[INFO] |  |     +- javax.inject:javax.inject:jar:1:compile
[INFO] |  |     \- aopalliance:aopalliance:jar:1.0:compile
[INFO] |  +- org.apache.hadoop:hadoop-yarn-server-web-proxy:jar:2.8.1:compile
[INFO] |  |  \- org.apache.hadoop:hadoop-yarn-server-common:jar:2.8.1:compile
[INFO] |  +- org.apache.hadoop:hadoop-yarn-client:jar:2.8.1:compile
[INFO] |  \- org.apache.hadoop:hadoop-client:jar:2.8.1:compile
[INFO] |     +- org.apache.hadoop:hadoop-common:jar:2.8.1:compile
[INFO] |     |  +- xmlenc:xmlenc:jar:0.52:compile
[INFO] |     |  +- javax.servlet.jsp:jsp-api:jar:2.1:runtime
[INFO] |     |  +- commons-configuration:commons-configuration:jar:1.6:compile
[INFO] |     |  |  +- commons-digester:commons-digester:jar:1.8:compile
[INFO] |     |  |  |  \- commons-beanutils:commons-beanutils:jar:1.7.0:compile
[INFO] |     |  |  \- commons-beanutils:commons-beanutils-core:jar:1.8.0:compile
[INFO] |     |  +- org.apache.hadoop:hadoop-auth:jar:2.8.1:compile
[INFO] |     |  |  +- com.nimbusds:nimbus-jose-jwt:jar:3.9:compile
[INFO] |     |  |  |  +- net.jcip:jcip-annotations:jar:1.0:compile
[INFO] |     |  |  |  \- net.minidev:json-smart:jar:1.1.1:compile
[INFO] |     |  |  \- org.apache.directory.server:apacheds-kerberos-codec:jar:2.0.0-M15:compile
[INFO] |     |  |     +- org.apache.directory.server:apacheds-i18n:jar:2.0.0-M15:compile
[INFO] |     |  |     +- org.apache.directory.api:api-asn1-api:jar:1.0.0-M20:compile
[INFO] |     |  |     \- org.apache.directory.api:api-util:jar:1.0.0-M20:compile
[INFO] |     |  +- org.apache.curator:curator-client:jar:2.7.1:compile
[INFO] |     |  \- org.apache.htrace:htrace-core4:jar:4.0.1-incubating:compile
[INFO] |     +- org.apache.hadoop:hadoop-hdfs:jar:2.8.1:compile
[INFO] |     |  \- org.apache.hadoop:hadoop-hdfs-client:jar:2.8.1:compile
[INFO] |     |     \- com.squareup.okhttp:okhttp:jar:2.4.0:compile
[INFO] |     |        \- com.squareup.okio:okio:jar:1.4.0:compile
[INFO] |     +- org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.8.1:compile
[INFO] |     |  +- org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.8.1:compile
[INFO] |     |  \- org.apache.hadoop:hadoop-mapreduce-client-shuffle:jar:2.8.1:compile
[INFO] |     +- org.apache.hadoop:hadoop-mapreduce-client-core:jar:2.8.1:compile
[INFO] |     \- org.apache.hadoop:hadoop-mapreduce-client-jobclient:jar:2.8.1:compile
[INFO] +- com.google.cloud:google-cloud-bigquery:jar:0.25.0-beta:compile
[INFO] |  +- com.google.cloud:google-cloud-core:jar:1.7.0:compile
[INFO] |  |  +- org.json:json:jar:20160810:compile
[INFO] |  |  +- com.google.http-client:google-http-client:jar:1.22.0:compile
[INFO] |  |  +- com.google.api:api-common:jar:1.1.0:compile
[INFO] |  |  +- com.google.api:gax:jar:1.8.1:compile
[INFO] |  |  |  +- com.google.auto.value:auto-value:jar:1.2:compile
[INFO] |  |  |  \- org.threeten:threetenbp:jar:1.3.3:compile
[INFO] |  |  +- com.google.protobuf:protobuf-java-util:jar:3.3.1:compile
[INFO] |  |  +- com.google.api.grpc:proto-google-common-protos:jar:0.1.20:compile
[INFO] |  |  \- com.google.api.grpc:proto-google-iam-v1:jar:0.1.20:compile
[INFO] |  +- com.google.cloud:google-cloud-core-http:jar:1.7.0:compile
[INFO] |  |  +- com.google.auth:google-auth-library-credentials:jar:0.8.0:compile
[INFO] |  |  +- com.google.auth:google-auth-library-oauth2-http:jar:0.8.0:compile
[INFO] |  |  +- com.google.http-client:google-http-client-appengine:jar:1.22.0:compile
[INFO] |  |  \- com.google.http-client:google-http-client-jackson:jar:1.22.0:compile
[INFO] |  \- com.google.apis:google-api-services-bigquery:jar:v2-rev347-1.22.0:compile
[INFO] +- com.google.cloud.bigdataoss:bigquery-connector:jar:0.10.2-hadoop2:compile
[INFO] |  +- org.apache.avro:avro-mapred:jar:hadoop2:1.7.7:compile
[INFO] |  |  +- org.apache.avro:avro-ipc:jar:1.7.7:compile
[INFO] |  |  |  +- org.apache.velocity:velocity:jar:1.7:compile
[INFO] |  |  |  \- org.mortbay.jetty:servlet-api:jar:2.5-20081211:compile
[INFO] |  |  \- org.apache.avro:avro-ipc:jar:tests:1.7.7:compile
[INFO] |  +- com.google.cloud.bigdataoss:util-hadoop:jar:1.6.1-hadoop2:compile
[INFO] |  +- com.google.cloud.bigdataoss:gcs-connector:jar:1.6.1-hadoop2:compile
[INFO] |  |  \- com.google.cloud.bigdataoss:gcsio:jar:1.6.1:compile
[INFO] |  +- com.google.apis:google-api-services-storage:jar:v1-rev35-1.20.0:compile
[INFO] |  +- com.google.code.gson:gson:jar:2.3:compile
[INFO] |  +- com.google.code.findbugs:jsr305:jar:2.0.3:compile
[INFO] |  +- com.google.oauth-client:google-oauth-client:jar:1.20.0:compile
[INFO] |  +- com.google.oauth-client:google-oauth-client-java6:jar:1.20.0:compile
[INFO] |  +- org.apache.avro:avro:jar:1.7.7:compile
[INFO] |  |  \- com.thoughtworks.paranamer:paranamer:jar:2.3:compile
[INFO] |  \- com.google.cloud.bigdataoss:util:jar:1.6.1:compile
[INFO] +- com.sparkjava:spark-core:jar:2.6.0:compile
[INFO] |  +- org.slf4j:slf4j-api:jar:1.7.13:compile
[INFO] |  +- org.eclipse.jetty:jetty-server:jar:9.4.4.v20170414:compile
[INFO] |  |  +- org.eclipse.jetty:jetty-http:jar:9.4.4.v20170414:compile
[INFO] |  |  |  \- org.eclipse.jetty:jetty-util:jar:9.4.4.v20170414:compile
[INFO] |  |  \- org.eclipse.jetty:jetty-io:jar:9.4.4.v20170414:compile
[INFO] |  +- org.eclipse.jetty:jetty-webapp:jar:9.4.4.v20170414:compile
[INFO] |  |  +- org.eclipse.jetty:jetty-xml:jar:9.4.4.v20170414:compile
[INFO] |  |  \- org.eclipse.jetty:jetty-servlet:jar:9.4.4.v20170414:compile
[INFO] |  |     \- org.eclipse.jetty:jetty-security:jar:9.4.4.v20170414:compile
[INFO] |  +- org.eclipse.jetty.websocket:websocket-server:jar:9.4.4.v20170414:compile
[INFO] |  |  +- org.eclipse.jetty.websocket:websocket-common:jar:9.4.4.v20170414:compile
[INFO] |  |  \- org.eclipse.jetty.websocket:websocket-client:jar:9.4.4.v20170414:compile
[INFO] |  |     \- org.eclipse.jetty:jetty-client:jar:9.4.4.v20170414:compile
[INFO] |  \- org.eclipse.jetty.websocket:websocket-servlet:jar:9.4.4.v20170414:compile
[INFO] |     \- org.eclipse.jetty.websocket:websocket-api:jar:9.4.4.v20170414:compile
[INFO] +- org.apache.spark:spark-streaming_2.11:jar:2.2.0:compile
[INFO] +- com.google.apis:google-api-services-analytics:jar:v3-rev142-1.23.0:compile
[INFO] +- com.google.apis:google-api-services-analyticsreporting:jar:v4-rev116-1.23.0:compile
[INFO] +- com.google.api-client:google-api-client:jar:1.23.0:compile
[INFO] |  \- com.google.http-client:google-http-client-jackson2:jar:1.23.0:compile
[INFO] +- org.tensorflow:tensorflow:jar:1.3.0:compile
[INFO] |  +- org.tensorflow:libtensorflow:jar:1.3.0:compile
[INFO] |  \- org.tensorflow:libtensorflow_jni:jar:1.3.0:compile
[INFO] \- com.google.api-client:google-api-client-jackson2:jar:1.23.0:compile
btilford commented 6 years ago

I should also note that I only get the 404 when running locally; running the job from Dataproc works fine. :frowning_face:

Rolling back from <google-api-client.version>1.23.0</google-api-client.version> to <google-api-client.version>1.22.0</google-api-client.version> "fixes" it.
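For anyone else hitting this, here is a minimal sketch of that rollback as a Maven pin (untested; it assumes the same property wiring as in the diff above):

```xml
<!-- Workaround sketch: hold google-api-client at 1.22.0 until the auth fix ships -->
<properties>
    <google-api-client.version>1.22.0</google-api-client.version>
</properties>

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.google.api-client</groupId>
            <artifactId>google-api-client</artifactId>
            <version>${google-api-client.version}</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```

After changing it, `mvn dependency:tree -Dincludes=com.google.api-client` should confirm that 1.22.0 is actually the resolved version.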

nicktrav commented 6 years ago

We also ran into authentication issues. The problem surfaced when we updated google-api-client.version to 1.23.0:

Caused by: com.google.api.client.http.HttpResponseException: 404 Not Found
Not Found
    at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1070)
    at com.google.api.client.googleapis.batch.BatchRequest.execute(BatchRequest.java:241)
    at org.apache.beam.sdk.util.GcsUtil$3.call(GcsUtil.java:588)
    at org.apache.beam.sdk.util.GcsUtil$3.call(GcsUtil.java:586)
    at org.apache.beam.sdks.java.extensions.google.cloud.platform.core.repackaged.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:111)
    at org.apache.beam.sdks.java.extensions.google.cloud.platform.core.repackaged.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:58)
    at org.apache.beam.sdks.java.extensions.google.cloud.platform.core.repackaged.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:75)
    ... 3 more

I was able to isolate the commit that broke it, if it's any help: https://github.com/google/google-api-java-client/commit/22e76833e8e02e5594fc949c749fbfbbd780fb2f

Maybe it's better for me to post that as an issue on that repo?

andreamlin commented 6 years ago

@nicktrav Would you mind copying over your comment to a new issue on the google-api-java-client repo?

nicktrav commented 6 years ago

@andreamlin - sure thing! I made https://github.com/google/google-api-java-client/issues/1073.

lbergelson commented 6 years ago

@lukecwik Anything we can do to move this forward?

garrettjonesgoogle commented 6 years ago

Everyone - we are going to do a release today which pulls in google-auth-library-java version 0.9.0 which fixes the bug introduced in prior versions.

lukecwik commented 6 years ago

@garrettjonesgoogle Does the issue also apply to the Apiary vended libraries?

Can you explain what the fix is (it is unclear from the GitHub commit history for google-auth-library-java)?

garrettjonesgoogle commented 6 years ago

This is the fix: https://github.com/google/google-auth-library-java/pull/132

It doesn't apply to the Apiary vended libraries.

lbergelson commented 6 years ago

@garrettjonesgoogle That's excellent news. Thanks for the update!

droazen commented 6 years ago

@jean-philippe-martin Would you be able to try out this fix using your repro above?

anthmgoogle commented 6 years ago

Looks like this is fixed in the auth library. Please reactivate if not.

jean-philippe-martin commented 6 years ago

I changed my repro program's build.gradle to include the new version:

compile 'com.google.cloud:google-cloud-nio:0.27.0-alpha:shaded'

Ran the program, and it failed again:

[Stage 0:>                                                          (0 + 1) / 2]17/11/08 23:10:06 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, jps-test-cluster-2-w-0.c.jps-testing-project.internal, executor 1): com.google.cloud.storage.StorageException: Error code 404 trying to get security access token from Compute Engine metadata for the default service account. This may be because the virtual machine instance does not have permission scopes specified.
    at com.google.cloud.storage.spi.v1.HttpStorageRpc.translate(HttpStorageRpc.java:189)
    at com.google.cloud.storage.spi.v1.HttpStorageRpc.get(HttpStorageRpc.java:340)
    at com.google.cloud.storage.StorageImpl$5.call(StorageImpl.java:197)
    at com.google.cloud.storage.StorageImpl$5.call(StorageImpl.java:194)
    at shaded.cloud_nio.com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:91)
    at com.google.cloud.RetryHelper.run(RetryHelper.java:74)
    at com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:51)
    at com.google.cloud.storage.StorageImpl.get(StorageImpl.java:194)
    at com.google.cloud.storage.contrib.nio.CloudStorageFileSystemProvider.checkAccess(CloudStorageFileSystemProvider.java:614)
    at java.nio.file.Files.exists(Files.java:2385)
    at repro_package.Main.exists(Main.java:40)

For what it's worth, it seems I'm correctly using the new auth library:

$ gradle dependencies --configuration compile | grep 'auth' 
|    |    |    |    +--- com.google.auth:google-auth-library-oauth2-http:0.9.0
|    |    |    |    |    +--- com.google.auth:google-auth-library-credentials:0.9.0
|    |    |    +--- com.google.auth:google-auth-library-credentials:0.9.0
|    |    |    +--- com.google.auth:google-auth-library-oauth2-http:0.9.0 (*)
|    |    |    +--- com.google.oauth-client:google-oauth-client:1.23.0
|    |    |    |    +--- com.google.oauth-client:google-oauth-client:1.23.0 (*)
|    |    |    +--- org.apache.hadoop:hadoop-auth:2.2.0

and the new version of google-cloud:

$ gradle dependencies --configuration compile | grep 'google-cloud'
+--- com.google.cloud:google-cloud-nio:0.27.0-alpha
|    +--- com.google.cloud:google-cloud-storage:1.9.0
|    |    +--- com.google.cloud:google-cloud-core:1.9.0
|    |    +--- com.google.cloud:google-cloud-core-http:1.9.0
|    |    |    +--- com.google.cloud:google-cloud-core:1.9.0 (*)

If you'd like to suggest a way for me to share the whole program in a way that you can run it as well and see for yourself, I'm open to suggestions.
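In case a stale transitive auth version were still sneaking in somewhere, one way to rule that out in Gradle (Groovy DSL; a sketch only, using the versions shown in the tree above):

```groovy
configurations.all {
  resolutionStrategy {
    // Force every configuration to resolve the fixed auth library versions
    force 'com.google.auth:google-auth-library-oauth2-http:0.9.0'
    force 'com.google.auth:google-auth-library-credentials:0.9.0'
  }
}
```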

jean-philippe-martin commented 6 years ago

@anthmgoogle I don't seem to be able to reopen this issue myself, but perhaps it should be reopened.

lbergelson commented 6 years ago

So our most recent assumption was that this error was being caused by a configuration issue in our projects, but we've now tested with different google projects that don't share the same owners or configuration and the error reproduces in both of them.

See https://github.com/broadinstitute/gatk/pull/3855 for the complete discussion. It's unclear what we should do to continue debugging this. @anthmgoogle Any suggestions?

anthmgoogle commented 6 years ago

If we can no longer reproduce, I suggest we close.

jean-philippe-martin commented 6 years ago

@anthmgoogle

If we can no longer reproduce, I suggest we close.

That is literally the opposite of what was said. We reproduced the bug and ran out of ideas for how to make the test not fail.

anthmgoogle commented 6 years ago

Apologies. Reactivating and reassigning.

yihanzhen commented 6 years ago

Hi @jean-philippe-martin -

I assume you are still using the repro steps from your earlier comment, and getting the same result described above with google-cloud-nio version 0.27.0-alpha, but please correct me if I am wrong.

It seems your repro throws the exception when calling Files.exists(), so could you please remove the dependency on Spark (since it appears to be irrelevant to where the code fails) and create a simpler repro, which might look similar to the code below:

import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.BucketInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import com.google.cloud.storage.contrib.nio.CloudStorageFileSystem;

import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Random;

public static void main(String[] args) {
  Storage storage = StorageOptions.getDefaultInstance().getService();
  String BUCKET_NAME = "this-is-my-bucket-for-testing";
  String FILE_NAME = "test-file-name.txt";
  int FILE_SIZE = 10;
  String FAKE_FILE_NAME = "fake-file-name.txt";

  BlobInfo blobInfo = BlobInfo.newBuilder(BUCKET_NAME, FILE_NAME).build();
  storage.create(BucketInfo.of(BUCKET_NAME));
  storage.create(blobInfo, randomContents(FILE_SIZE));

  CloudStorageFileSystem csfs = CloudStorageFileSystem.forBucket(BUCKET_NAME);
  Path path = csfs.getPath(FILE_NAME);
  Path fakePath = csfs.getPath(FAKE_FILE_NAME);
  System.out.println(FILE_NAME + "exists: " + Files.exists(path));
  System.out.println(FAKE_FILE_NAME + "exists: " + Files.exists(fakePath));

  storage.get(blobInfo.getBlobId()).delete();
  storage.delete(BUCKET_NAME);
}

private static byte[] randomContents(int size) {
  byte[] bytes = new byte[size];
  new Random(size).nextBytes(bytes);
  return bytes;
}

I can run this code without an exception, and it prints the correct result, when I am using google-cloud-nio version 0.27.0-alpha. Can you please try this code snippet and share your outcome here? If you don't mind, can you also make sure that you have set up the project ID and authentication correctly, and share how you specified them in your repro/project? Thanks for taking the time to look into this.

jean-philippe-martin commented 6 years ago

@hzyi-google in my latest attempt I actually ran full-on GATK4 with the error as described in this comment.

I was able to run your provided code with google-cloud-nio version 0.27.0-alpha and it worked just fine, as expected. I also tried it with com.google.cloud:google-cloud-nio:0.30.0-alpha:shaded and again it worked exactly as expected. Here's the output, for reference:

test-file-name.txtexists: true
fake-file-name.txtexists: false

We only run into trouble when we try to run code (even something as basic as Files.exists) from within a Spark node on Google Cloud Dataproc. The dependency on Spark is very much relevant.

jean-philippe-martin commented 6 years ago

For what it's worth, I also reran the repro from this thread and ran into the same issue: "com.google.cloud.storage.StorageException: Error code 404 trying to get security access token from Compute Engine metadata for the default service account. This may be because the virtual machine instance does not have permission scopes specified."

I set up the project and auth like this:

$ export GOOGLE_CLOUD_PROJECT=jps-testing-project
$ export GOOGLE_APPLICATION_CREDENTIALS=~/secrets/jp-testing-account@jps-testing-project.json 

That was with com.google.cloud:google-cloud-nio:0.30.0-alpha:shaded.

I also tried with com.google.cloud:google-cloud-nio:0.20.0-alpha:shaded and it worked fine, as it did before with that version. I tried a few more.

Version  Outcome
0.30.0   failure
0.27.0   failure
0.23.0   failure
0.22.0   success
0.20.0   success

yihanzhen commented 6 years ago

Hi @jean-philippe-martin Thanks for the information.

One thing to note: if your project uses the Dataflow Java SDK, please stick to google-cloud-java 0.22 for the moment. The reason is that google-api-services, a library that both the Dataflow Java SDK and google-cloud-java depend on, currently has compatibility issues between the two versions involved. Dataflow will be updated in the near future to support the newer version of google-api-services, and you will then be able to upgrade google-cloud-java for your project.

For your repro from this thread, unfortunately I was not able to get it to run on my local machine. Spark complains about "Unable to load YARN support: org.apache.spark.deploy.yarn.YarnSparkHadoopUtil". I am not familiar with Spark, but if you can provide more context about how to get the repro running, I can take a look at the endpoint it hits from the server side.

Also, if I change to final SparkConf sparkConf = new SparkConf().setAppName("repro_package").setMaster("local"); I am able to run it without error. Can you take a look at that and see if it is relevant or not?

jean-philippe-martin commented 6 years ago

Thank you @hzyi-google! We are not using Dataflow; we're using Google Cloud Dataproc instead. Given what we've seen, though, the same compatibility issue may also be present between these two Google products. I suggest your team contact their team to address the issue.

For the repro, did you start a dataproc cluster? You have to go to cloud.google.com, click on "dataproc" on the left menu, select "clusters", then "create". A minimal cluster suffices (say, one master and two workers, with a single CPU each).

Then start the program like this:

gcloud dataproc jobs submit spark --cluster $NAME_YOU_CHOSE_FOR_THE_CLUSTER \
 --jar build/libs/nio-auth-repro-package-1.0-spark.jar -- --sparkMaster yarn

yihanzhen commented 6 years ago

@jean-philippe-martin Thanks for the repro steps. The Dataproc team has been notified, and the problem will be fixed in their next version.

jean-philippe-martin commented 6 years ago

Thank you! You say they found the problem and already have a fix lined up for the next version? Wow, that's fast!

yihanzhen commented 6 years ago

Closing this for now since nothing needs to be fixed in this repository. Feel free to comment and ask for it to be reactivated if there are other problems related to this issue.

lbergelson commented 6 years ago

Thank you for the help!

droazen commented 6 years ago

@hzyi-google Unfortunately this issue still persists for us running on Google Dataproc using the latest google-cloud-java release (0.47.0-alpha:shaded) and the latest Dataproc image (1.2.34). Could you please re-open this issue? I don't think the compatibility issues between google-cloud-java and Google Dataproc have been resolved. Our project is stuck on an ancient fork of google-cloud-java 0.20.4-alpha because of it, and it's starting to cause major issues for us.

Thanks!

(@jean-philippe-martin pinging you as well on this)

yihanzhen commented 6 years ago

Apologies that the issue still exists. I'll take a look again.

yihanzhen commented 6 years ago

Here is what Dataproc team mentioned from an internal thread:

... can make use of the following init action to install newer versions of the GCS and BigQuery connectors onto their cluster. The latest versions of the connectors, 1.8.1 for the GCS connector and 0.12.1 for the BigQuery Connector, better isolate the API client libraries used by the connectors so that customers can bring their own versions of the API client libraries without interfering with the connector itself.

Please let me know if this helps.

droazen commented 6 years ago

@hzyi-google Thanks for the suggestion. I tried running with the updated GCS connector version by creating a Dataproc cluster using the following command:

gcloud dataproc clusters create droazen-test-cluster --initialization-actions gs://dataproc-initialization-actions/connectors/connectors.sh --metadata 'gcs-connector-version=1.8.1' --metadata 'bigquery-connector-version=0.12.1' --num-workers 4 --num-masters 1 --image-version 1.2

I then built a version of our toolkit that depends on com.google.cloud:google-cloud-nio:0.47.0-alpha:shaded (the newest release of google-cloud-java, which is what we're trying to update to). I still got an error, but it's different from the 404 error we've been getting all along:

=========== Cloud Dataproc Agent Error ===========
java.io.IOException: Error accessing: bucket: dataproc-8cbe9d51-94fb-4ad4-9c34-a283212c2ae6-us
    at com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl.wrapException(GoogleCloudStorageImpl.java:1857)
    at com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl.getBucket(GoogleCloudStorageImpl.java:1814)
    at com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl.getItemInfo(GoogleCloudStorageImpl.java:1763)
    at com.google.cloud.hadoop.gcsio.GoogleCloudStorageFileSystem.getFileInfo(GoogleCloudStorageFileSystem.java:1144)
    at com.google.cloud.hadoop.gcsio.GoogleCloudStorageFileSystem.exists(GoogleCloudStorageFileSystem.java:448)
    at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.configureBuckets(GoogleHadoopFileSystemBase.java:1967)
    at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem.configureBuckets(GoogleHadoopFileSystem.java:69)
    at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.configure(GoogleHadoopFileSystemBase.java:1915)
    at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.initialize(GoogleHadoopFileSystemBase.java:1064)
    at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.initialize(GoogleHadoopFileSystemBase.java:1027)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2812)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2849)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2831)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
    at com.google.cloud.hadoop.services.agent.util.HadoopUtil.getFs(HadoopUtil.java:63)
    at com.google.cloud.hadoop.services.agent.util.HadoopUtil.download(HadoopUtil.java:70)
    at com.google.cloud.hadoop.services.agent.job.AbstractJobHandler.downloadResources(AbstractJobHandler.java:424)
    at com.google.cloud.hadoop.services.agent.job.AbstractJobHandler$StartDriver.call(AbstractJobHandler.java:543)
    at com.google.cloud.hadoop.services.agent.job.AbstractJobHandler$StartDriver.call(AbstractJobHandler.java:532)
    at com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
    at com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
    at com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: javax.net.ssl.SSLException: java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty
    at sun.security.ssl.Alerts.getSSLException(Alerts.java:208)
    at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1964)
    at sun.security.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1921)
    at sun.security.ssl.SSLSocketImpl.handleException(SSLSocketImpl.java:1904)
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1420)
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1397)
    at sun.net.www.protocol.https.HttpsClient.afterConnect(HttpsClient.java:559)
    at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:185)
    at sun.net.www.protocol.https.HttpsURLConnectionImpl.connect(HttpsURLConnectionImpl.java:162)
    at com.google.api.client.http.javanet.NetHttpRequest.execute(NetHttpRequest.java:93)
    at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:972)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:419)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
    at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
    at com.google.cloud.hadoop.gcsio.GoogleCloudStorageImpl.getBucket(GoogleCloudStorageImpl.java:1808)
    ... 29 more
Caused by: java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty
    at sun.security.validator.PKIXValidator.<init>(PKIXValidator.java:91)
    at sun.security.validator.Validator.getInstance(Validator.java:179)
    at sun.security.ssl.X509TrustManagerImpl.getValidator(X509TrustManagerImpl.java:312)
    at sun.security.ssl.X509TrustManagerImpl.checkTrustedInit(X509TrustManagerImpl.java:171)
    at sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:184)
    at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:124)
    at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1596)
    at sun.security.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:216)
    at sun.security.ssl.Handshaker.processLoop(Handshaker.java:1052)
    at sun.security.ssl.Handshaker.process_record(Handshaker.java:987)
    at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1072)
    at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1385)
    at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1413)
    ... 39 more
Caused by: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty
    at java.security.cert.PKIXParameters.setTrustAnchors(PKIXParameters.java:200)
    at java.security.cert.PKIXParameters.<init>(PKIXParameters.java:120)
    at java.security.cert.PKIXBuilderParameters.<init>(PKIXBuilderParameters.java:104)
    at sun.security.validator.PKIXValidator.<init>(PKIXValidator.java:89)
    ... 51 more
======== End of Cloud Dataproc Agent Error ========
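Incidentally, the root-cause line above is the generic JDK error for an SSL context whose truststore contains no trusted certificates at all (for example, javax.net.ssl.trustStore pointing at a missing or empty file), rather than a permissions problem. A minimal standalone reproduction of just that exception, independent of any Google libraries:

```java
import java.security.InvalidAlgorithmParameterException;
import java.security.KeyStore;
import java.security.cert.PKIXParameters;

public class TrustAnchorsDemo {
    public static void main(String[] args) throws Exception {
        // An empty keystore has no trusted certificate entries,
        // so PKIX certificate validation has no trust anchors to start from.
        KeyStore ks = KeyStore.getInstance(KeyStore.getDefaultType());
        ks.load(null, null); // initialize as empty

        try {
            new PKIXParameters(ks);
        } catch (InvalidAlgorithmParameterException e) {
            // Same message as in the agent stack trace above
            System.out.println(e.getMessage());
        }
    }
}
```

That suggests the init action may have left the agent's JVM looking at a bad or empty truststore, which would be a different failure mode from the original 404.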

I noticed, however, that we do have a copy of the GCS connector in our classpath. I tried excluding it from our jar completely, and got this error instead:

=========== Cloud Dataproc Agent Error ===========
java.lang.NullPointerException
    at com.google.api.client.util.SecurityUtils.loadKeyStore(SecurityUtils.java:84)
    at com.google.api.client.googleapis.GoogleUtils.getCertificateTrustStore(GoogleUtils.java:76)
    at com.google.cloud.hadoop.util.HttpTransportFactory.createNetHttpTransport(HttpTransportFactory.java:136)
    at com.google.cloud.hadoop.util.HttpTransportFactory.newTrustedTransport(HttpTransportFactory.java:146)
    at com.google.cloud.hadoop.util.CredentialFactory.getStaticHttpTransport(CredentialFactory.java:200)
    at com.google.cloud.hadoop.util.CredentialFactory.getCredentialFromMetadataServiceAccount(CredentialFactory.java:215)
    at com.google.cloud.hadoop.util.CredentialConfiguration.getCredential(CredentialConfiguration.java:75)
    at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.configure(GoogleHadoopFileSystemBase.java:1875)
    at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.initialize(GoogleHadoopFileSystemBase.java:1064)
    at com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystemBase.initialize(GoogleHadoopFileSystemBase.java:1027)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2812)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2849)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2831)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
    at com.google.cloud.hadoop.services.agent.util.HadoopUtil.getFs(HadoopUtil.java:63)
    at com.google.cloud.hadoop.services.agent.util.HadoopUtil.download(HadoopUtil.java:70)
    at com.google.cloud.hadoop.services.agent.job.AbstractJobHandler.downloadResources(AbstractJobHandler.java:424)
    at com.google.cloud.hadoop.services.agent.job.AbstractJobHandler$StartDriver.call(AbstractJobHandler.java:543)
    at com.google.cloud.hadoop.services.agent.job.AbstractJobHandler$StartDriver.call(AbstractJobHandler.java:532)
    at com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
    at com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
    at com.google.cloud.hadoop.services.repackaged.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
======== End of Cloud Dataproc Agent Error ========

Did I create the cluster correctly? Are these errors at all revealing as to what might be going on?

droazen commented 6 years ago

@hzyi-google Any updates from the Dataproc team on this issue?