Open · rashid-1989 opened this issue 6 years ago
@weiqingy Could you please suggest if i am missing something here?
I came across the same issue.
Could you resolve it?
Can you attach your latest code?
Please refer to the above snippet, followed by the exception. That's the code I am trying to execute.
@louisliu318 do you need any additional details on the above?
I am stuck here. Can someone please suggest?
Same error in Scala:
at scala.None$.get(Option.scala:347)
at scala.None$.get(Option.scala:345)
at org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog$.apply(HBaseTableCatalog.scala:277)
at org.apache.spark.sql.execution.datasources.hbase.HBaseRelation.<init>(HBaseRelation.scala:60)
at org.apache.spark.sql.execution.datasources.hbase.DefaultSource.createRelation(DefaultSource.scala:24)
at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:518)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:215)
at org.apache.spark.sql.execution.datasources.hbase.examples.HBaseSource$.main(HBaseSource.scala:107)
at org.apache.spark.sql.execution.datasources.hbase.examples.HBaseSource.main(HBaseSource.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:744)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
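For context: a None.get thrown from HBaseTableCatalog.apply generally means the catalog parser could not find an entry it requires in the catalog JSON (typically the table or rowkey section), or the catalog option was never set on the writer at all. A minimal sketch of a complete shc catalog string; the namespace, table name, and key column here are assumptions for illustration, only the REGION column comes from this thread:

```java
import java.util.HashMap;
import java.util.Map;

public class CatalogSketch {
    // Hypothetical catalog: "default", "test_table", and the KEY column are
    // placeholders; a real catalog must name your own table and row key.
    static final String CATALOG = "{"
            + "\"table\":{\"namespace\":\"default\", \"name\":\"test_table\"},"
            + "\"rowkey\":\"key\","
            + "\"columns\":{"
            + "\"KEY\":{\"cf\":\"rowkey\", \"col\":\"key\", \"type\":\"string\"},"
            + "\"REGION\":{\"cf\":\"general\", \"col\":\"region\", \"type\":\"string\"}"
            + "}}";

    public static void main(String[] args) {
        // The parser looks up the "table", "rowkey", and "columns" sections;
        // a missing one surfaces as scala.None$.get, as in the trace above.
        Map<String, String> options = new HashMap<>();
        options.put("catalog", CATALOG); // key that HBaseTableCatalog.tableCatalog() resolves to
        System.out.println(options.containsKey("catalog")); // true
    }
}
```

Checking the catalog string for all three top-level sections before submitting the job is a cheap way to rule this failure mode out.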
Same error!
Map<String, String> putMap = new HashMap();
map.put(HBaseTableCatalog.tableCatalog(), newHbaseCatalog);
map.put(HBaseTableCatalog.newTable(), "1");

df.write().options(putMap)
    .format("org.apache.spark.sql.execution.datasources.hbase")
    .save();
You should change putMap to map, @rashid-1989 — the map is declared as putMap but populated via map, so the two names need to match.
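Concretely, the fix is just to declare and populate the same variable. A sketch of the options-map part in isolation (the literal keys "catalog" and "newtable" are what HBaseTableCatalog.tableCatalog() and HBaseTableCatalog.newTable() resolve to in shc; the "5" is a placeholder region count, not from this thread):

```java
import java.util.HashMap;
import java.util.Map;

public class OptionsFix {
    public static Map<String, String> buildOptions(String catalog) {
        // One variable, declared and populated under the same name; in the
        // snippet above, the map was declared as putMap but the puts went
        // to map, so the code does not compile as posted.
        Map<String, String> map = new HashMap<>();
        map.put("catalog", catalog);   // same key as HBaseTableCatalog.tableCatalog()
        map.put("newtable", "5");      // same key as HBaseTableCatalog.newTable(); value is a sketch
        return map;
    }

    public static void main(String[] args) {
        Map<String, String> opts = buildOptions("{...catalog json...}");
        // df.write().options(opts)
        //   .format("org.apache.spark.sql.execution.datasources.hbase")
        //   .save();   // needs a live SparkSession; shown for context only
        System.out.println(opts.size()); // 2
    }
}
```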
Hi, I am using the below code to write a DataFrame to an HBase table:
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.SparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog;

public class connTest {

    String newHbaseCatalog = "{\r\n"
            + "\"REGION\":{\"cf\":\"general\", \"col\":\"region\", \"type\":\"string\"}\r\n"
            + "}\r\n"
            + "}";

    Map<String, String> putMap = new HashMap();
    map.put(HBaseTableCatalog.tableCatalog(), newHbaseCatalog);
    map.put(HBaseTableCatalog.newTable(), "1");

    df.write().options(putMap)
        .format("org.apache.spark.sql.execution.datasources.hbase")
        .save();

    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("test")
                .master("local[*]")
                .config("spark.sql.warehouse.dir", "file:///c:/tmp/spark-warehouse")
                .config("hbase.client.retries.number", "2")
                .getOrCreate();
        SparkContext sc = spark.sparkContext();
        System.out.println("printing table contents");
        withCatalog(hbaseCatalog);
    }
}
And I am getting the below exception: