aftnix opened 8 years ago
Turns out my table didn't have `id` as the first field. I fixed that, but now the INSERT never finishes (I waited a couple of hours, reduced the dataset, etc., and the query still never completes).
The YARN logs contain this:
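For context, a sketch of what "id as the first field" looks like in the table definition. The column names, location, and Solr URL below are placeholders, and the storage-handler class and table properties should be verified against the hive-solr README for your serde version:

```sql
-- Hypothetical sketch of a Solr-backed external table with `id` as the
-- FIRST column. Names and properties are assumptions, not taken from the
-- reporter's actual schema.
CREATE EXTERNAL TABLE solr_events (
  id STRING,        -- unique key: first column, declared as STRING
  title STRING,
  created TIMESTAMP
)
STORED BY 'com.lucidworks.hadoop.hive.LWStorageHandler'
LOCATION '/tmp/solr'
TBLPROPERTIES (
  'solr.server.url' = 'http://localhost:8888/solr',
  'solr.collection' = 'events'
);
```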
2016-09-26 17:24:21,082 [INFO] [Dispatcher thread {Central}] |history.HistoryEventHandler|: [HISTORY][DAG:dag_1474881768573_0002_2][Event:VERTEX_FINISHED]: vertexName=Map 1, vertexId=vertex_1474881768573_0002_2_00, initRequestedTime=1474883498501, initedTime=1474883499061, startRequestedTime=1474883498577, startedTime=1474883499061, finishTime=1474889061031, timeTaken=5561970, status=KILLED, diagnostics=Vertex received Kill while in RUNNING state.
Vertex did not succeed due to DAG_KILL, failedTasks:0 killedTasks:3
Vertex vertex_1474881768573_0002_2_00 [Map 1] killed/failed due to:DAG_KILL, counters=Counters: 0, vertexStats=firstTaskStartTime=1474883503313, firstTasksToStart=[ task_1474881768573_0002_2_00_000001 ], lastTaskFinishTime=1474889061030, lastTasksToFinish=[ task_1474881768573_0002_2_00_000002,task_1474881768573_0002_2_00_000001 ], minTaskDuration=-1, maxTaskDuration=-1, avgTaskDuration=-1.0, numSuccessfulTasks=0, shortestDurationTasks=[ ], longestDurationTasks=[ ], vertexTaskStats={numFailedTaskAttempts=0, numKilledTaskAttempts=0, numCompletedTasks=3, numSucceededTasks=0, numKilledTasks=3, numFailedTasks=0}
I don't know what's going wrong here :(
Sorry for the delay of a few days to get back to you.
Are there any errors besides those messages?
Can you also share a little bit about your environment - it seems you're using Tez? What version/distro of Hive?
I am able to load data into a Solr external table from another managed Hive table, but when I try to retrieve data from the Solr table, it throws "Failed with exception java.io.IOException:java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.String". I am using solr-hive-serde-2.2.6.jar on Hive 1.1.0-cdh5.4.5.
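The ClassCastException above suggests the serde treats the unique-key field as a String while the Hive schema declares it as an INT. A possible workaround, offered only as a hedged sketch (the table and column names are placeholders, and whether this helps depends on how the serde maps types):

```sql
-- Sketch (assumption): redeclare the unique-key column as STRING so the
-- serde never has to cast an Integer to a String on read. For an external
-- table this changes only the Hive metadata, not the data in Solr.
ALTER TABLE solr_table CHANGE id id STRING;
```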
@vishnucg can you please open a new issue with your question?
Did this issue get resolved? I'm getting the same error.
I'm getting:
Caused by: java.lang.NullPointerException
    at com.lucidworks.hadoop.io.impl.LWSolrDocument.getId(LWSolrDocument.java:46)
    at com.lucidworks.hadoop.io.LucidWorksWriter.write(LucidWorksWriter.java:190)
    ... 22 more ], TaskAttempt 3 failed, info=[Error: Failure while running task:java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{},"value":{"_col0":null,"_col1":null,"_col2":null,"_col3":null,"_col4":null,"_col5":null,"_col6":null,"_col7":null,"_col8":null,"_col9":null,"_col10":null,"_col11":null,"_col12":null,"_col13":null,"_col14":null,"_col15":null,"_col16
The Hive table and the hive_solr table have exactly the same schema.
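One diagnostic worth trying, offered as a sketch: the NullPointerException in `LWSolrDocument.getId`, together with the all-null row in the error message, is consistent with source rows whose `id` is NULL. The table and column names below are placeholders for the reporter's actual names:

```sql
-- Diagnostic sketch (assumed names): count rows with a NULL unique key
-- in the source table before inserting into the Solr-backed table.
SELECT COUNT(*) FROM source_table WHERE id IS NULL;

-- If any exist, filtering them out during the insert may avoid the NPE:
INSERT INTO TABLE solr_table
SELECT * FROM source_table WHERE id IS NOT NULL;
```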