Thanks for stopping by to let us know something could be better!
PLEASE READ: If you have a support contract with Google, please create an issue in the support console instead of filing on GitHub. This will ensure a timely response.
Please run down the following list and make sure you've tried the usual "quick fixes":
java.lang.IllegalStateException: Could not find an appropriate constructor for com.google.cloud.bigtable.hbase2_x.BigtableConnection
at com.google.cloud.bigtable.hbase.BigtableConfiguration.connect(BigtableConfiguration.java:200) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
at com.test.SparkJob.lambda$runSparkJob$0(SparkJob.java:88) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
at java.util.Iterator.forEachRemaining(Iterator.java:133) ~[?:?]
at scala.collection.convert.Wrappers$IteratorWrapper.forEachRemaining(Wrappers.scala:31) ~[scala-library-2.12.18.jar:?]
at com.test.SparkJob.lambda$runSparkJob$e3b46054$1(SparkJob.java:73) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
at org.apache.spark.api.java.JavaRDDLike.$anonfun$foreachPartition$1(JavaRDDLike.scala:219) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at org.apache.spark.api.java.JavaRDDLike.$anonfun$foreachPartition$1$adapted(JavaRDDLike.scala:219) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2(RDD.scala:1011) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at org.apache.spark.rdd.RDD.$anonfun$foreachPartition$2$adapted(RDD.scala:1011) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at org.apache.spark.SparkContext.$anonfun$runJob$5(SparkContext.scala:2333) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at org.apache.spark.scheduler.Task.run(Task.scala:136) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1505) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?]
at java.lang.Thread.run(Thread.java:829) ~[?:?]
Caused by: java.lang.reflect.InvocationTargetException
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Constructor.java:490) ~[?:?]
at com.google.cloud.bigtable.hbase.BigtableConfiguration.connect(BigtableConfiguration.java:197) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
... 17 more
Caused by: java.lang.IllegalAccessError: class com.google.iam.v1.TestIamPermissionsRequest tried to access method 'com.google.protobuf.LazyStringArrayList com.google.protobuf.LazyStringArrayList.emptyList()' (com.google.iam.v1.TestIamPermissionsRequest is in unnamed module of loader org.apache.spark.util.MutableURLClassLoader @37468787; com.google.protobuf.LazyStringArrayList is in unnamed module of loader 'app')
at com.google.iam.v1.TestIamPermissionsRequest.<init>(TestIamPermissionsRequest.java:127) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
at com.google.iam.v1.TestIamPermissionsRequest.<clinit>(TestIamPermissionsRequest.java:918) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
at com.google.bigtable.admin.v2.BigtableTableAdminGrpc.getTestIamPermissionsMethod(BigtableTableAdminGrpc.java:995) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
at com.google.cloud.bigtable.grpc.BigtableTableAdminGrpcClient.<init>(BigtableTableAdminGrpcClient.java:186) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
at com.google.cloud.bigtable.grpc.BigtableSession.<init>(BigtableSession.java:299) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
at org.apache.hadoop.hbase.client.AbstractBigtableConnection.<init>(AbstractBigtableConnection.java:123) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
at org.apache.hadoop.hbase.client.AbstractBigtableConnection.<init>(AbstractBigtableConnection.java:88) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
at com.google.cloud.bigtable.hbase2_x.BigtableConnection.<init>(BigtableConnection.java:56) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Constructor.java:490) ~[?:?]
at com.google.cloud.bigtable.hbase.BigtableConfiguration.connect(BigtableConfiguration.java:197) ~[gcp-service-impl-0.0.1-SNAPSHOT.jar:?]
... 17 more
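The IllegalAccessError above is characteristic of two different protobuf-java versions being visible to different classloaders: LazyStringArrayList.emptyList() only exists in newer protobuf-java releases, while Spark's 'app' loader is supplying an older copy. As a diagnostic, a small reflection probe (a hypothetical helper, not part of any library involved here) can report which jar a class was loaded from and whether the missing method exists. On an executor you would probe com.google.protobuf.LazyStringArrayList; the standalone demo below uses a JDK class so the snippet runs anywhere:

```java
import java.security.CodeSource;

public class ClassProbe {
    /** Returns the jar/path a class was loaded from, or a note for JDK classes. */
    static String locationOf(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        return src == null ? "(bootstrap/JDK class)" : src.getLocation().toString();
    }

    /** Returns true if the class itself declares a no-arg method with the given name. */
    static boolean hasNoArgMethod(Class<?> cls, String name) {
        try {
            cls.getDeclaredMethod(name);
            return true;
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) throws Exception {
        // On a Spark executor, probe the class named in the IllegalAccessError:
        //   Class<?> c = Class.forName("com.google.protobuf.LazyStringArrayList");
        //   System.out.println(locationOf(c) + " emptyList? " + hasNoArgMethod(c, "emptyList"));
        // Demonstrated here with a JDK class so the probe is self-contained:
        Class<?> c = java.util.ArrayList.class;
        System.out.println("loaded from: " + locationOf(c));
        System.out.println("has trimToSize(): " + hasNoArgMethod(c, "trimToSize"));
    }
}
```

Running the probe inside a foreachPartition lambda shows which copy of protobuf the executor actually resolves, independent of what the driver sees.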
If you are still having issues, please include as much information as possible:
Environment details
Steps to reproduce
In the Spark job, connect to Bigtable (project and instance IDs redacted):

    Configuration config = HBaseConfiguration.create();
    config.set("google.bigtable.project.id", );
    config.set("google.bigtable.instance.id", );
    Credentials credentials = GoogleCredentials.fromStream();
    Connection connection = BigtableConfiguration.connect(
        BigtableConfiguration.withCredentials(config, credentials));
Dependencies:
Any additional information below
The same code works when we run on the Dataproc 2.2.6-debian12 image.
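That the code works on Dataproc 2.2.6 but not here suggests a protobuf-java version clash between the Spark runtime and the job jar. One common workaround, sketched under the assumption that the job is built with Maven, is relocating protobuf inside the shaded job jar so the executor's bundled copy is never consulted:

```xml
<!-- Hypothetical maven-shade-plugin fragment; adjust to the actual build. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.protobuf</pattern>
            <shadedPattern>repackaged.com.google.protobuf</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Alternatively, the client ships a pre-shaded artifact (bigtable-hbase-2.x-shaded) that bundles relocated dependencies, which may avoid the conflict without custom shading.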