I am having a problem while loading data into a partitioned table.

1) I want to load data from an existing table enodeb into a partitioned table enodebpartition. The schema for enodeb is

2) I created the following partitioned table enodebpartition as shown:
create table enodebpartition (circle string, businessranking string,
enodebstatus string, shape binary) partitioned by (state string)
ROW FORMAT SERDE 'com.esri.hadoop.hive.serde.JsonSerde'
STORED AS INPUTFORMAT 'com.esri.json.hadoop.UnenclosedJsonInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat';
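As an aside, the Esri JsonSerde and UnenclosedJsonInputFormat come from the Esri Spatial Framework for Hadoop, and the same SerDe is used again at write time by the insert, so the jars must be registered in the session that runs the insert, not only the one that created the table. A minimal sketch, where the jar names and paths are assumptions to be matched against your installation:

```sql
-- Assumed jar names from the Esri Spatial Framework for Hadoop;
-- adjust the paths to wherever the jars live on your cluster.
ADD JAR /path/to/esri-geometry-api.jar;
ADD JAR /path/to/spatial-sdk-hadoop.jar;
```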
3) Then I loaded the data into the table from the already existing enodeb table:
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;
insert overwrite table enodebpartition partition (state) select * from
enodeb;
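One thing worth checking first: with dynamic partitioning, Hive takes the dynamic partition column from the last column of the SELECT. `select *` is only safe if `state` happens to be the last column of enodeb, and the failing row in the error below prints `state` first, which suggests it may not be. A sketch with the columns listed explicitly (the column order is taken from the enodebpartition definition above):

```sql
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- List the columns explicitly so the dynamic partition column (state)
-- is guaranteed to come last, instead of relying on SELECT *.
INSERT OVERWRITE TABLE enodebpartition PARTITION (state)
SELECT circle, businessranking, enodebstatus, shape, state
FROM enodeb;
```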
4) However, I am getting the following error:
Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {
  "state": "Punjab",
  "circle": "PUNJAB",
  "businessranking": "2",
  "enodebstatus": "SCFT Samsung Acceptance Initiated",
  "shape": ף0Kc#������ IA
}
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:172)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {
  "state": "Punjab",
  "circle": "PUNJAB",
  "businessranking": "2",
  "enodebstatus": "SCFT Samsung Acceptance Initiated",
  "shape": ף0Kc#������ IA
}
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:545)
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:163)
	... 8 more
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.objectinspector.SubStructObjectInspector cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector
	at com.esri.hadoop.hive.serde.JsonSerde.serialize(Unknown Source)
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:712)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:838)
	at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:88)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:838)
	at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:97)
	at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:164)
	at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:535)
	... 9 more
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Stage-Stage-1: Map: 5  HDFS Read: 0  HDFS Write: 0  FAIL
Total MapReduce CPU Time Spent: 0 msec
5) In short, the root cause of the failure is:
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.objectinspector.SubStructObjectInspector cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector
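If the dynamic-partition insert keeps failing, a possible diagnostic step is to load one partition at a time with a static partition spec. Since the partition value is then fixed and excluded from the SELECT, Hive may not wrap the row in the SubStructObjectInspector that the Esri SerDe chokes on. This is a sketch rather than a confirmed fix; 'Punjab' is taken from the failing row above, and the statement would need to be repeated per distinct state:

```sql
-- Static partition spec: state is fixed, so it is not selected.
-- Repeat (or script) this for each distinct value of state in enodeb.
INSERT OVERWRITE TABLE enodebpartition PARTITION (state = 'Punjab')
SELECT circle, businessranking, enodebstatus, shape
FROM enodeb
WHERE state = 'Punjab';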
I need help as I am not able to resolve this issue.
Thanks