googleapis / java-bigtable-hbase

Java libraries and HBase client extensions for accessing Google Cloud Bigtable
https://cloud.google.com/bigtable/
Apache License 2.0

java.lang.IllegalStateException: Could not find an appropriate constructor for com.google.cloud.bigtable.hbase2_x.BigtableConnection #4309

Open sagarsitap596 opened 6 months ago



Environment details

  1. OS type and version:
  2. Java version:
  3. Version(s):

Steps to reproduce

  1. Create a Dataproc cluster with image version 2.1-debian11.
  2. In the Spark job, connect to Bigtable as shown in the code example below.
  3. Submit the Spark job to the Dataproc 2.1-debian11 cluster.

Code example

    Configuration config = HBaseConfiguration.create();
    config.set("google.bigtable.project.id", <projectId>);
    config.set("google.bigtable.instance.id", <Instance>);

    Credentials credentials = GoogleCredentials.fromStream(<Service account json file>);
    Connection connection = BigtableConfiguration.connect(
        BigtableConfiguration.withCredentials(config, credentials));

Stack trace

java.lang.IllegalStateException: Could not find an appropriate constructor for com.google.cloud.bigtable.hbase2_x.BigtableConnection
    at com.google.cloud.bigtable.hbase.BigtableConfiguration.connect(BigtableConfiguration.java:200) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
    at com.test.SparkJob.lambda$runSparkJob$0(SparkJob.java:88) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
    at java.util.Iterator.forEachRemaining(Iterator.java:133) ~[?:?]
    at scala.collection.convert.Wrappers$IteratorWrapper.forEachRemaining(Wrappers.scala:31) ~[scala-library-2.12.18.jar:?]
    at com.test.SparkJob.lambda$runSparkJob$e3b46054$1(SparkJob.java:73) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
    at org.apache.spark.api.java.JavaRDDLike.$anonfun$foreachPartition$1(JavaRDDLike.scala:219) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.api.java.JavaRDDLike.$anonfun$foreachPartition$1$adapted(JavaRDDLike.scala:219) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2(RDD.scala:1011) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2$adapted(RDD.scala:1011) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2333) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.scheduler.Task.run(Task.scala:136) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1505) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?]
    at java.lang.Thread.run(Thread.java:829) ~[?:?]
Caused by: java.lang.reflect.InvocationTargetException
    at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:?]
    at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:490) ~[?:?]
    at com.google.cloud.bigtable.hbase.BigtableConfiguration.connect(BigtableConfiguration.java:197) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
    ... 17 more
Caused by: java.lang.IllegalAccessError: class com.google.iam.v1.TestIamPermissionsRequest tried to access method 'com.google.protobuf.LazyStringArrayList com.google.protobuf.LazyStringArrayList.emptyList()' (com.google.iam.v1.TestIamPermissionsRequest is in unnamed module of loader org.apache.spark.util.MutableURLClassLoader @37468787; com.google.protobuf.LazyStringArrayList is in unnamed module of loader 'app')
    at com.google.iam.v1.TestIamPermissionsRequest.<init>(TestIamPermissionsRequest.java:127) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
    at com.google.iam.v1.TestIamPermissionsRequest.<clinit>(TestIamPermissionsRequest.java:918) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
    at com.google.bigtable.admin.v2.BigtableTableAdminGrpc.getTestIamPermissionsMethod(BigtableTableAdminGrpc.java:995) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
    at com.google.cloud.bigtable.grpc.BigtableTableAdminGrpcClient.<init>(BigtableTableAdminGrpcClient.java:186) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
    at com.google.cloud.bigtable.grpc.BigtableSession.<init>(BigtableSession.java:299) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
    at org.apache.hadoop.hbase.client.AbstractBigtableConnection.<init>(AbstractBigtableConnection.java:123) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
    at org.apache.hadoop.hbase.client.AbstractBigtableConnection.<init>(AbstractBigtableConnection.java:88) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
    at com.google.cloud.bigtable.hbase2_x.BigtableConnection.<init>(BigtableConnection.java:56) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
    at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:?]
    at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:490) ~[?:?]
    at com.google.cloud.bigtable.hbase.BigtableConfiguration.connect(BigtableConfiguration.java:197) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
    ... 17 more
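
The final IllegalAccessError says that com.google.iam.v1.TestIamPermissionsRequest and com.google.protobuf.LazyStringArrayList were loaded by two different classloaders (Spark's MutableURLClassLoader vs. the 'app' loader), which points at two incompatible protobuf-java copies being visible to the job. A small stdlib-only diagnostic (a hypothetical helper, not from the report; the class names are taken from the stack trace above) can be run inside the job to see which jar each class resolves from:

```java
import java.security.CodeSource;

public class WhichJar {
    // Returns a human-readable description of where a class was loaded from,
    // or a note if it is missing from the current classpath.
    static String locate(String name) {
        try {
            Class<?> c = Class.forName(name);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            return name + " -> "
                + (src == null ? "bootstrap/platform loader" : src.getLocation());
        } catch (ClassNotFoundException e) {
            return name + " -> not on classpath";
        }
    }

    public static void main(String[] args) {
        // Class names come from the stack trace; on the cluster this prints
        // which jar (and hence which classloader) each copy resolves from.
        System.out.println(locate("com.google.protobuf.LazyStringArrayList"));
        System.out.println(locate("com.google.iam.v1.TestIamPermissionsRequest"));
    }
}
```

If the two classes report different jar locations (one from the Dataproc image's Spark classpath, one from the job jar), that would confirm the version clash; this is an inference from the stack trace, not a confirmed diagnosis.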


Dependencies:

<!-- Spark Core -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <version>3.3.2</version>
</dependency>
<!-- Spark SQL -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.12</artifactId>
  <version>3.3.2</version>
</dependency>

<!-- Hadoop Common -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>3.3.6</version>
</dependency>

<!-- Hadoop HDFS -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-hdfs</artifactId>
  <version>3.3.6</version>
</dependency>

<!-- Google Cloud Bigtable HBase Client -->
<dependency>
  <groupId>com.google.cloud.bigtable</groupId>
  <artifactId>bigtable-hbase-2.x</artifactId>
  <version>1.17.0</version>
</dependency>

<!-- Google Cloud Storage (Optional) -->
<!-- https://mvnrepository.com/artifact/com.google.cloud/google-cloud-storage -->
<dependency>
  <groupId>com.google.cloud</groupId>
  <artifactId>google-cloud-storage</artifactId>
  <version>2.34.0</version>
</dependency>

Additional information

The same code works when we use Dataproc image version 2.2.6-debian12.
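
Not part of the original report: given that the failure looks like a protobuf classloader conflict, one common mitigation worth trying (an assumption to verify, not a confirmed fix) is the shaded variant of the HBase client, which bundles and relocates its own protobuf copy so it cannot collide with the one on Spark's classpath:

```xml
<!-- Hypothetical substitution for the bigtable-hbase-2.x dependency above;
     the shaded artifact relocates protobuf and other shared dependencies. -->
<dependency>
  <groupId>com.google.cloud.bigtable</groupId>
  <artifactId>bigtable-hbase-2.x-shaded</artifactId>
  <version>1.17.0</version>
</dependency>
```

There is also a bigtable-hbase-2.x-hadoop variant intended for classpaths that already provide Hadoop/HBase; which artifact fits depends on how the job jar is assembled.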


Thanks!