spring-attic / spring-hadoop-samples

Spring Hadoop Samples
Apache License 2.0

I need a Hive 13, HiveServer2 sample #18

Closed sagpid closed 10 years ago

sagpid commented 10 years ago

I am currently working with a server running HiveServer2 and Hive 13. I tried modifying the samples to use Hive 13 and executed sh ./target/appassembler/bin/hiveBatchApp

I get the following exception. How should I get around this? Alternatively, if you have a sample that works with Spring Batch, Hive 13, and HiveServer2, please send the link for that. Thanks for your help

localSourceFile = /home/saga/Downloads/spring-hadoop-samples/hive-batch/target/appassembler/data/nbatweets-small.txt
inputDir = /tweets/input
about to execute the file copying
exiting
23:24:10,564  INFO amework.samples.hadoop.hive.HiveBatchApp:  37 - Batch Tweet Influencers Hive Job Running
23:24:10,667  INFO ch.core.launch.support.SimpleJobLauncher: 133 - Job: [FlowJob: [name=hiveJob]] launched with the following parameters: [{}]
23:24:10,727  INFO amework.batch.core.job.SimpleStepHandler: 146 - Executing step: [influencer-step]
23:24:10,812 ERROR ngframework.batch.core.step.AbstractStep: 225 - Encountered an error executing step influencer-step in job hiveJob
org.springframework.dao.DataAccessResourceFailureException: Invalid method name: 'execute'; nested exception is org.apache.thrift.TApplicationException: Invalid method name: 'execute'
    at org.springframework.data.hadoop.hive.HiveUtils.convert(HiveUtils.java:69)
    at org.springframework.data.hadoop.hive.HiveTemplate.convertHiveAccessException(HiveTemplate.java:99)
    at org.springframework.data.hadoop.hive.HiveTemplate.execute(HiveTemplate.java:82)
    at org.springframework.data.hadoop.hive.HiveTemplate.executeScript(HiveTemplate.java:261)
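The TApplicationException suggests a Thrift protocol mismatch rather than a configuration typo: HiveTemplate speaks the original HiveServer Thrift API (which exposes an `execute` method), while HiveServer2 serves a different Thrift interface, so the call is rejected with "Invalid method name". A common workaround is to talk to HiveServer2 over JDBC instead of the Thrift client. A minimal configuration sketch, assuming the hive-jdbc driver (org.apache.hive.jdbc.HiveDriver) is on the classpath; the bean names here are illustrative, not from the sample:

```xml
<!-- Sketch: reach HiveServer2 via JDBC instead of HiveTemplate's
     legacy Thrift client. Assumes hive-jdbc is on the classpath. -->
<bean id="hiveDriver" class="org.apache.hive.jdbc.HiveDriver"/>

<!-- SimpleDriverDataSource(Driver, url) wraps the Hive JDBC driver -->
<bean id="hiveDataSource"
      class="org.springframework.jdbc.datasource.SimpleDriverDataSource">
    <constructor-arg ref="hiveDriver"/>
    <constructor-arg value="jdbc:hive2://${hive.host}:${hive.port}/default"/>
</bean>

<!-- Plain JdbcTemplate can then run HiveQL statements -->
<bean id="hiveJdbcTemplate" class="org.springframework.jdbc.core.JdbcTemplate">
    <constructor-arg ref="hiveDataSource"/>
</bean>
```

With this in place, batch steps can issue HiveQL through the JdbcTemplate rather than HiveTemplate.executeScript.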
sagpid commented 10 years ago

I use Hadoop 2.4.1 from Hortonworks, and Spring 4.0.6.

I have the following properties:

hive.exec.drop.ignorenonexistent=true
hive.host=aa.bb.cc.dd
hive.port=10000
# hive.url=jdbc:hive2://${hive.host}:${hive.port}/default;auth=noSasl
hive.table=passwords
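If switching to JDBC, the commented-out hive.url property above is the piece to enable; a sketch, assuming the server accepts unauthenticated (noSasl) connections — the auth setting must match the server's actual configuration:

```properties
# JDBC endpoint for HiveServer2 (auth mode is an assumption)
hive.url=jdbc:hive2://${hive.host}:${hive.port}/default;auth=noSasl
```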