datasalt / splout-db

A web-latency SQL spout for Hadoop.

Exception in thread "main" com.splout.db.hadoop.TablespaceGenerator$TablespaceGeneratorException: Error executing generation Job #40

Closed: suolemen closed this issue 10 years ago

suolemen commented 10 years ago

attempt_1409329200275_1810_r_000001_1, Status : FAILED
Container [pid=6989,containerID=container_1409329200275_1810_01_000097] is running beyond physical memory limits. Current usage: 1.0 GB of 1 GB physical memory used; 1.6 GB of 2.1 GB virtual memory used. Killing container.
Dump of the process-tree for container_1409329200275_1810_01_000097 :
|- PID PPID PGRPID SESSID CMD_NAME USER_MODE_TIME(MILLIS) SYSTEM_TIME(MILLIS) VMEM_USAGE(BYTES) RSSMEM_USAGE(PAGES) FULL_CMD_LINE
|- 6989 9703 6989 6989 (bash) 1 1 108605440 330 /bin/bash -c /usr/java/jdk1.7.0_45-cloudera/bin/java -Djava.net.preferIPv4Stack=true -Dhadoop.metrics.log.level=WARN -Djava.net.preferIPv4Stack=true -Xmx825955249 -Djava.io.tmpdir=/data/yarn/nm/usercache/root/appcache/application_1409329200275_1810/container_1409329200275_1810_01_000097/tmp -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/data/var/log/hadoop-yarn/container/application_1409329200275_1810/container_1409329200275_1810_01_000097 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA org.apache.hadoop.mapred.YarnChild 192.168.50.53 41943 attempt_1409329200275_1810_r_000001_1 97 1>/data/var/log/hadoop-yarn/container/application_1409329200275_1810/container_1409329200275_1810_01_000097/stdout 2>/data/var/log/hadoop-yarn/container/application_1409329200275_1810/container_1409329200275_1810_01_000097/stderr
|- 6999 6989 6989 6989 (java) 21615 2976 1656119296 264691 /usr/java/jdk1.7.0_45-cloudera/bin/java -Djava.net.preferIPv4Stack=true -Dhadoop.metrics.log.level=WARN -Djava.net.preferIPv4Stack=true -Xmx825955249 -Djava.io.tmpdir=/data/yarn/nm/usercache/root/appcache/application_1409329200275_1810/container_1409329200275_1810_01_000097/tmp -Dlog4j.configuration=container-log4j.properties -Dyarn.app.container.log.dir=/data/var/log/hadoop-yarn/container/application_1409329200275_1810/container_1409329200275_1810_01_000097 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA org.apache.hadoop.mapred.YarnChild 192.168.50.53 41943 attempt_1409329200275_1810_r_000001_1 97

Exception in thread "main" com.splout.db.hadoop.TablespaceGenerator$TablespaceGeneratorException: Error executing generation Job
    at com.splout.db.hadoop.TablespaceGenerator.executeViewGeneration(TablespaceGenerator.java:482)
    at com.splout.db.hadoop.TablespaceGenerator.generateView(TablespaceGenerator.java:143)
    at com.splout.db.hadoop.SimpleGeneratorCMD.run(SimpleGeneratorCMD.java:253)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at com.splout.db.hadoop.SimpleGeneratorCMD.main(SimpleGeneratorCMD.java:261)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.datasalt.pangool.PangoolDriver$ProgramDescription.invoke(PangoolDriver.java:55)
    at com.datasalt.pangool.PangoolDriver.driver(PangoolDriver.java:128)
    at com.splout.db.hadoop.Driver.main(Driver.java:49)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
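To put the figures in the log in context (a hedged reading, assuming stock Hadoop 2 defaults on this cluster and a 4 KB memory page size):

1 GB physical limit  = mapreduce.reduce.memory.mb (default 1024 MB)
2.1 GB virtual limit = 1024 MB x yarn.nodemanager.vmem-pmem-ratio (default 2.1)
Reducer JVM heap     = -Xmx825955249 bytes ≈ 788 MB
Reducer JVM RSS      = 264691 pages x 4 KB ≈ 1034 MB, just over the 1 GB limit

In other words, the reducer's total resident memory (heap plus JVM and native overhead) appears to have outgrown the 1 GB container, which is why the NodeManager killed it.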

suolemen commented 10 years ago

command: hadoop jar splout-hadoop-0.2.5-hadoop-mr2.jar simple-generate -it HIVE -hdb default -htn inventory_filter_yes_key -o out-hive-simple -pby areacode -p 2 -t inventory_filter_yes_key_of_me -tb hive_simple_example

The inventory_filter_yes_key table is about 10 GB in size. Please help!

suolemen commented 10 years ago

Solved it! I changed the command to:

hadoop jar splout-hadoop-0.2.5-hadoop-mr2.jar simple-generate -it HIVE -hdb default -htn inventory_filter_yes_key -o out-hive-simple -pby areacode -p 90 -t inventory_filter_yes_key_of_me -tb hive_simple_example

With -p 90 the job now runs with 90 reducers instead of 2.
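A rough calculation of why raising -p helps, assuming the ~10 GB table is spread more or less evenly across areacode partitions and each reducer builds one partition:

-p 2:   ~10 GB / 2  ≈ 5 GB handled by each reducer
-p 90:  ~10 GB / 90 ≈ 114 MB handled by each reducer

so each 1 GB container only has to process a small slice of the data, which presumably keeps the per-partition generation footprint well under the limit.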

ivanprado commented 10 years ago

The problem is the memory assigned to the containers:

Container [pid=6989,containerID=container_1409329200275_1810_01_000097] is running beyond physical memory limits. Current usage: 1.0 GB of 1 GB physical memory used; 1.6 GB of 2.1 GB virtual memory used. Killing container.

You would probably need to assign more physical and virtual memory to the containers used for Splout.
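For reference, a sketch of how larger reducer containers could be requested without changing the partition count. These are standard Hadoop 2 MapReduce properties; the exact values are illustrative, and whether simple-generate forwards Hadoop's generic -D options is an assumption (it does run through ToolRunner according to the stack trace above):

hadoop jar splout-hadoop-0.2.5-hadoop-mr2.jar simple-generate -D mapreduce.reduce.memory.mb=2048 -D mapreduce.reduce.java.opts=-Xmx1638m -it HIVE -hdb default -htn inventory_filter_yes_key -o out-hive-simple -pby areacode -p 2 -t inventory_filter_yes_key_of_me -tb hive_simple_example

Here mapreduce.reduce.memory.mb raises the YARN container size and mapreduce.reduce.java.opts keeps the JVM heap at roughly 80% of it; the same properties can also be set cluster-wide in mapred-site.xml. Raising memory, increasing -p, or doing both are all valid, depending on how large each partition should end up.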

Regards.
