spring-attic / spring-hadoop-samples

Spring Hadoop Samples
Apache License 2.0

Exceptions for Running HiveApp in Mac #26

Open xhe opened 9 years ago

xhe commented 9 years ago

I started my local HDFS and then tried to run HiveApp. The "show tables" command runs successfully, but when the HiveQL script runs, exceptions are thrown. This is the output in the console:

```
2015-07-26 21:56:51.744 java[5042:457346] Unable to load realm mapping info from SCDynamicStore
OK
[grpshell, passwords]
OK
OK
OK
Copying data from file:/etc/passwd
Copying file: file:/etc/passwd
Loading data to table default.passwords
Table default.passwords stats: [numFiles=1, numRows=0, totalSize=5581, rawDataSize=0]
OK
OK
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks not specified. Estimated from input data size: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=
In order to set a constant number of reducers:
  set mapreduce.job.reduces=
org.apache.hive.com.esotericsoftware.kryo.KryoException: java.lang.IllegalArgumentException: Unable to create serializer "org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer" for class: org.apache.hadoop.hive.ql.exec.FileSinkOperator
Serialization trace:
childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
childOperators (org.apache.hadoop.hive.ql.exec.GroupByOperator)
reducer (org.apache.hadoop.hive.ql.plan.ReduceWork)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.write(ObjectField.java:82)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:474)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:614)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.write(CollectionSerializer.java:78)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.write(CollectionSerializer.java:18)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:538)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.write(ObjectField.java:61)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:474)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:538)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.write(ObjectField.java:61)
	at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:474)
	at org.apache.hive.com.esotericsoftware.kryo.Kryo.writeObject(Kryo.java:520)
	at org.apache.hadoop.hive.ql.exec.Utilities.serializeObjectByKryo(Utilities.java:895)
	at org.apache.hadoop.hive.ql.exec.Utilities.serializePlan(Utilities.java:799)
	at org.apache.hadoop.hive.ql.exec.Utilities.serializePlan(Utilities.java:811)
	at org.apache.hadoop.hive.ql.exec.Utilities.setBaseWork(Utilities.java:601)
	at org.apache.hadoop.hive.ql.exec.Utilities.setReduceWork(Utilities.java:578)
	at org.apache.hadoop.hive.ql.exec.Utilities.setMapRedWork(Utilities.java:569)
	at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:372)
	at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
	at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:198)
	at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
	at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:628)
	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
	at java.lang.Thread.run(Thread.java:695)
java.lang.OutOfMemoryError: PermGen space
	at java.lang.Throwable.getStackTraceElement(Native Method)
	at java.lang.Throwable.getOurStackTrace(Throwable.java:591)
	at java.lang.Throwable.printStackTraceAsCause(Throwable.java:481)
	at java.lang.Throwable.printStackTrace(Throwable.java:468)
	at java.lang.Throwable.printStackTrace(Throwable.java:451)
	at org.apache.hadoop.hive.ql.exec.Utilities.setBaseWork(Utilities.java:626)
	at org.apache.hadoop.hive.ql.exec.Utilities.setReduceWork(Utilities.java:578)
	at org.apache.hadoop.hive.ql.exec.Utilities.setMapRedWork(Utilities.java:569)
	at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:372)
	at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1503)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1270)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1088)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:901)
	at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:198)
	at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
	at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:628)
	at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
	at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
	at java.lang.Thread.run(Thread.java:695)
FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. PermGen space
Exception in thread "org.apache.hadoop.hdfs.PeerCache@4db323af" java.lang.OutOfMemoryError: PermGen space
Exception in thread "main" java.lang.OutOfMemoryError: PermGen space
Exception in thread "LeaseRenewer:frankhe@localhost:9000" java.lang.OutOfMemoryError: PermGen space
```
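
Note that the bottom of the trace places the `java.lang.OutOfMemoryError: PermGen space` inside the standalone Hive Thrift server (`HiveServer$HiveServerHandler` / `TThreadPoolServer` frames), not in the client JVM that runs HiveApp. If that is the case here, the permanent generation would have to be raised on the Hive server process itself. A minimal sketch, assuming the server was started with the `hive` launcher script (which picks up `HADOOP_OPTS`) — the exact sizes are assumptions, not values from this issue:

```shell
# Raise heap and PermGen for the JVM the hive launcher script will start.
# HADOOP_OPTS is honored by $HIVE_HOME/bin/hive (assumption about the setup).
export HADOOP_OPTS="-Xmx1024m -XX:MaxPermSize=256m"
echo "$HADOOP_OPTS"

# Then restart the Thrift server on the port the sample expects, e.g.:
#   $HIVE_HOME/bin/hive --service hiveserver -p 10000
```

After restarting the server with these options, re-running the query should show whether PermGen on the server side was the limit.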

The config is as follows:

```
hd.fs=hdfs://localhost:9000
hd.rm=localhost:50070
hd.jh=localhost:8088

hive.host=localhost
hive.port=10000
hive.url=jdbc:hive://${hive.host}:${hive.port}
hive.table=passwords
```

My OS is Mac. I already tried updating STS.ini to:

```
-Xms128m -Xmx768m -XX:MaxPermSize=4096m
```

but the exception is the same.
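
One thing worth checking: `STS.ini` only sizes the Eclipse/STS JVM itself, so a `-XX:MaxPermSize` set there never reaches a forked JVM running the sample, and certainly not the separate Hive server process. If the sample is instead run from the command line with Maven (an assumption about the setup, since these samples build with Maven), the flags would go to the launched JVM through `MAVEN_OPTS` — sketched below with a deliberately modest PermGen, since 256m is typically ample for PermGen and very large values like 4096m can themselves fail to reserve on some JVMs:

```shell
# Pass memory flags to the JVM that mvn forks for the sample,
# rather than to the IDE JVM configured in STS.ini.
export MAVEN_OPTS="-Xms128m -Xmx768m -XX:MaxPermSize=256m"
echo "$MAVEN_OPTS"
```

With `MAVEN_OPTS` exported in the same shell, a subsequent `mvn` invocation of the sample inherits the settings.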

Any idea on how to fix the exception?

Thanks