Closed: saragluna closed this issue 6 years ago
cc @TimeExceed
It seems the protobuf jar in your classpath conflicts with the one required by tablestore. You should download the jar-with-dependencies, e.g. tablestore-4.1.0-jar-with-dependencies.jar, which inlines protobuf and does not depend on an external protobuf jar.
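To confirm which protobuf the JVM is actually picking up, a quick check like the following can help. This is a minimal diagnostic sketch, not part of the TableStore SDK, and the class name ProtobufOrigin is only illustrative: it prints the jar that com.google.protobuf.GeneratedMessage was loaded from, which should be the tablestore jar-with-dependencies rather than a standalone protobuf jar.

```java
import com.google.protobuf.GeneratedMessage;

// Diagnostic sketch: print the jar that the protobuf GeneratedMessage class
// is loaded from, to confirm (or rule out) a classpath conflict.
public class ProtobufOrigin {
    public static void main(String[] args) {
        // getCodeSource() can be null for bootstrap classes, but protobuf
        // is always loaded from a jar on the application classpath.
        java.security.CodeSource source =
                GeneratedMessage.class.getProtectionDomain().getCodeSource();
        System.out.println(source == null ? "bootstrap classpath" : source.getLocation());
    }
}
```

Run it with the same classpath as the failing job; if the printed location is not the tablestore jar-with-dependencies, that jar is the source of the conflict.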
@uncleGen @TimeExceed
I ran into a similar problem: whether or not I bundle the protobuf dependency into my jar, I get the same error.
spark + tablestore
https://help.aliyun.com/document_detail/52880.html?spm=5176.product28066.6.656.G8QA0X
Exception in thread "main" java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
    at com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
    at com.alicloud.openservices.tablestore.core.protocol.OtsInternalApi$DescribeTableRequest.getSerializedSize(OtsInternalApi.java:9710)
    at com.google.protobuf.AbstractMessageLite.toByteArray(AbstractMessageLite.java:62)
    at com.alicloud.openservices.tablestore.core.OperationLauncher.asyncInvokePost(OperationLauncher.java:116)
    at com.alicloud.openservices.tablestore.core.DescribeTableLauncher.fire(DescribeTableLauncher.java:52)
    at com.alicloud.openservices.tablestore.InternalClient.describeTable(InternalClient.java:233)
    at com.alicloud.openservices.tablestore.SyncClient.describeTable(SyncClient.java:114)
    at com.aliyun.openservices.tablestore.hadoop.TableStoreInputFormat.fetchSplits(TableStoreInputFormat.java:250)
    at com.aliyun.openservices.tablestore.hadoop.TableStoreInputFormat.getSplits(TableStoreInputFormat.java:181)
    at com.aliyun.openservices.tablestore.hadoop.TableStoreInputFormat.getSplits(TableStoreInputFormat.java:167)
    at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:125)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:252)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:250)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:250)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2094)
    at org.apache.spark.rdd.RDD.count(RDD.scala:1158)
    at org.apache.spark.api.java.JavaRDDLike$class.count(JavaRDDLike.scala:455)
    at org.apache.spark.api.java.AbstractJavaRDDLike.count(JavaRDDLike.scala:45)
    at com.laiye.tools.RowCounter.main(RowCounter.java:39)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
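This UnsupportedOperationException thrown from GeneratedMessage.getUnknownFields typically indicates that an older protobuf runtime sits on the classpath than the one the TableStore generated code expects (Spark and Hadoop distributions often bundle their own protobuf-java). A rough way to list every protobuf jar the driver JVM sees is sketched below; the class name ProtobufOnClasspath is made up for this example.

```java
import java.io.File;

// Hedged sketch: list classpath entries whose name mentions protobuf, to spot an
// older protobuf-java jar shadowing the version inlined in the tablestore jar.
public class ProtobufOnClasspath {
    public static void main(String[] args) {
        String classpath = System.getProperty("java.class.path");
        for (String entry : classpath.split(File.pathSeparator)) {
            if (entry.toLowerCase().contains("protobuf")) {
                System.out.println(entry);
            }
        }
    }
}
```

Running this with the same classpath as the failing Spark job shows which external protobuf jars could still be shadowing the classes bundled in tablestore-4.1.0-jar-with-dependencies.jar.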
Encountered an exception while following the Hive + TableStore instructions (https://github.com/aliyun/aliyun-emapreduce-sdk/blob/master/docs/Hive-SparkSQL-on-TableStore.md).
Here is the stacktrace:
Caused by: java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
    at com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
    at com.google.protobuf.TextFormat$Printer.print(TextFormat.java:275)
    at com.google.protobuf.TextFormat$Printer.access$400(TextFormat.java:248)
    at com.google.protobuf.TextFormat.print(TextFormat.java:71)
    at com.google.protobuf.TextFormat.printToString(TextFormat.java:118)
    at com.google.protobuf.AbstractMessage.toString(AbstractMessage.java:106)
    at com.alicloud.openservices.tablestore.core.OperationLauncher.asyncInvokePost(OperationLauncher.java:112)
    at com.alicloud.openservices.tablestore.core.DescribeTableLauncher.fire(DescribeTableLauncher.java:52)
    at com.alicloud.openservices.tablestore.InternalClient.describeTable(InternalClient.java:229)
    at com.alicloud.openservices.tablestore.SyncClient.describeTable(SyncClient.java:110)
    at com.aliyun.openservices.tablestore.hive.TableStoreInputFormat.fetchTableMeta(TableStoreInputFormat.java:139)
    at com.aliyun.openservices.tablestore.hive.TableStoreInputFormat.getSplits(TableStoreInputFormat.java:71)