hbutani / SQLWindowing

SQL Windowing Functions for Hadoop

The temp table cannot be found #21

Closed penghao122 closed 12 years ago

penghao122 commented 12 years ago

Hi, I tested the SQL below and it raised the following error. How can I fix it?

hive> wmode windowing;
hive> from <select starttime, lac, ci, callerno, count(1) callCount
            from ranap
            group by starttime, lac, ci, callerno
            limit 1000>
      partition by lac
      order by lac
      with rank() as r
      select starttime, lac, ci, callerno, callCount, r
      where <r <= 3>;

Executing Embedded Hive Query:
Create table WindowingTempTable_1345105836343 as select starttime, lac, ci, callerno, count(1) callCount from ranap group by starttime, lac, ci, callerno limit 1000

Total MapReduce jobs = 2
Launching Job 1 out of 2
Number of reduce tasks not specified. Defaulting to jobconf value of: 8
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
Cannot run job locally: Number of reducers (= 8) is more than 1
Starting Job = job_201208161534_0030, Tracking URL = http://server52:50030/jobdetails.jsp?jobid=job_201208161534_0030
Kill Command = /home/hadhive/hadoop-0.20.2/bin/hadoop job -Dmapred.job.tracker=server52:9001 -kill job_201208161534_0030
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 8
2012-08-16 16:30:41,123 Stage-1 map = 0%, reduce = 0%
2012-08-16 16:30:47,145 Stage-1 map = 100%, reduce = 0%
2012-08-16 16:30:56,180 Stage-1 map = 100%, reduce = 13%
2012-08-16 16:30:57,186 Stage-1 map = 100%, reduce = 25%
2012-08-16 16:30:58,191 Stage-1 map = 100%, reduce = 38%
2012-08-16 16:30:59,197 Stage-1 map = 100%, reduce = 50%
2012-08-16 16:31:00,203 Stage-1 map = 100%, reduce = 63%
2012-08-16 16:31:01,213 Stage-1 map = 100%, reduce = 75%
2012-08-16 16:31:05,230 Stage-1 map = 100%, reduce = 88%
2012-08-16 16:31:06,236 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201208161534_0030
Launching Job 2 out of 2
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
Cannot run job locally: Number of Input Files (= 8) is larger than hive.exec.mode.local.auto.input.files.max(= 4)
Starting Job = job_201208161534_0031, Tracking URL = http://server52:50030/jobdetails.jsp?jobid=job_201208161534_0031
Kill Command = /home/hadhive/hadoop-0.20.2/bin/hadoop job -Dmapred.job.tracker=server52:9001 -kill job_201208161534_0031
Hadoop job information for Stage-2: number of mappers: 3; number of reducers: 1
2012-08-16 16:31:14,011 Stage-2 map = 0%, reduce = 0%
2012-08-16 16:31:17,022 Stage-2 map = 67%, reduce = 0%
2012-08-16 16:31:18,027 Stage-2 map = 100%, reduce = 0%
2012-08-16 16:31:26,057 Stage-2 map = 100%, reduce = 100%
Ended Job = job_201208161534_0031
Moving data to: hdfs://server52:9000/smp/hive/warehouse/test_js.db/windowingtemptable_1345105836343
Table test_js.windowingtemptable_1345105836343 stats: [num_partitions: 0, num_files: 1, num_rows: 0, total_size: 33224, raw_data_size: 0]
1000 Rows loaded to hdfs://server52:9000/tmp/hive-hadhive/hive_2012-08-16_16-30-36_344_8638175632607957910/-ext-10000
MapReduce Jobs Launched:
Job 0: Map: 1  Reduce: 8  HDFS Read: 37276978  HDFS Write: 348674  SUCCESS
Job 1: Map: 3  Reduce: 1  HDFS Read: 350178  HDFS Write: 33224  SUCCESS
Total MapReduce CPU Time Spent: 0 msec
OK
Time taken: 53.257 seconds
Finished Executing Embedded Hive Query

com.sap.hadoop.windowing.WindowingException: com.sap.hadoop.windowing.WindowingException: NoSuchObjectException(message:big.WindowingTempTable_1345105836343 table not found)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:77)
    at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:102)
    at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:190)
    at com.sap.hadoop.windowing.runtime.mr.MRTranslator.setupWindowingInput(MRTranslator.groovy:45)
    at com.sap.hadoop.windowing.runtime.mr.MRTranslator$setupWindowingInput.callCurrent(Unknown Source)
    at com.sap.hadoop.windowing.query.Translator.setupQueryInput(Translator.groovy:92)
    at com.sap.hadoop.windowing.query.Translator$setupQueryInput.callCurrent(Unknown Source)
    at com.sap.hadoop.windowing.query.Translator.translate(Translator.groovy:48)
    at com.sap.hadoop.windowing.query.Translator$translate.call(Unknown Source)
    at com.sap.hadoop.windowing.runtime.WindowingShell.execute(WindowingShell.groovy:127)
    at com.sap.hadoop.windowing.runtime.WindowingShell$execute.call(Unknown Source)
    at com.sap.hadoop.windowing.cli.WindowingClient3.executeQuery(WindowingClient3.groovy:28)
    at com.sap.hadoop.windowing.cli.WindowingClient3$executeQuery.call(Unknown Source)
    at com.sap.hadoop.windowing.WindowingHiveCliDriver.processCmd(WindowingHiveCliDriver.groovy:117)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:341)
    at org.apache.hadoop.hive.cli.CliDriver$processLine.call(Unknown Source)
    at com.sap.hadoop.windowing.WindowingHiveCliDriver.main(WindowingHiveCliDriver.groovy:235)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: com.sap.hadoop.windowing.WindowingException: NoSuchObjectException(message:big.WindowingTempTable_1345105836343 table not found)
    at com.sap.hadoop.HiveUtils.getTable(HiveUtils.java:190)
    at com.sap.hadoop.HiveUtils.getTable(HiveUtils.java:198)
    at com.sap.hadoop.HiveUtils$getTable.call(Unknown Source)
    at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
    at com.sap.hadoop.HiveUtils$getTable.call(Unknown Source)
    at com.sap.hadoop.windowing.runtime.mr.MRTranslator.setupWindowingInput(MRTranslator.groovy:35)
    ... 19 more
Caused by: NoSuchObjectException(message:big.WindowingTempTable_1345105836343 table not found)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:976)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:735)
    at com.sap.hadoop.HiveUtils.getTable(HiveUtils.java:186)
    ... 24 more

Failed windowing query:
from <select starttime, lac, ci, callerno, count(1) callCount
      from ranap
      group by starttime, lac, ci, callerno
      limit 1000>
partition by lac
order by lac
with rank() as r
select starttime, lac, ci, callerno, callCount, r
where <r <= 3>

com.sap.hadoop.windowing.WindowingException: NoSuchObjectException(message:big.WindowingTempTable_1345105836343 table not found)
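For context, the log shows the mechanism SQLWindowing uses here: the embedded Hive query (the `<select ...>` part of the FROM clause) is first materialized into a temp table whose name carries a millisecond timestamp ("Create table WindowingTempTable_1345105836343 as ..."), and the windowing phase then reads that table back. A minimal Python sketch of that naming step, purely illustrative (the function name and exact behavior are assumptions, not SQLWindowing's actual code):

```python
import time

def materialize_embedded_query(embedded_select):
    # Build a timestamped temp-table name and the CTAS statement that
    # materializes the embedded query, mirroring the pattern seen in
    # the log above. Illustrative only.
    temp_table = "WindowingTempTable_%d" % int(time.time() * 1000)
    ddl = "Create table %s as %s" % (temp_table, embedded_select)
    return temp_table, ddl

name, ddl = materialize_embedded_query(
    "select starttime, lac, ci, callerno, count(1) callCount "
    "from ranap group by starttime, lac, ci, callerno limit 1000")
```

The failure below is then a lookup of this generated name against the wrong database.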

hbutani commented 12 years ago

Looks like the Temp Table for the embedded Hive Query got created in the default DB, whereas you are using the 'big' db in your session. This looks like a bug; I will look into it in a couple of days. Can you confirm by trying your example in the default db?
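The mismatch being described can be sketched in a few lines of Python: the temp table gets created under one database, but the windowing runtime resolves it against another ('big'), which is exactly the NoSuchObjectException in the report. The metastore model below is a toy stand-in for illustration, not Hive's API:

```python
def get_table(metastore, db, table):
    # Toy stand-in for a metastore lookup: fail if db.table does not exist.
    if table not in metastore.get(db, set()):
        raise LookupError(
            "NoSuchObjectException(message:%s.%s table not found)" % (db, table))
    return (db, table)

# The temp table lands in the session db ('test' per the reporter), but
# the runtime looks it up under 'big'. Database names are from the thread.
metastore = {"test": {"windowingtemptable_1345105836343"}, "big": set()}

get_table(metastore, "test", "windowingtemptable_1345105836343")  # found
try:
    get_table(metastore, "big", "windowingtemptable_1345105836343")
except LookupError as e:
    print(e)  # reproduces the "table not found" message
```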

penghao122 commented 12 years ago

It's not the default db. I created a database named test and then ran use test. I did not change the db during my session, so I think it is a bug.

penghao122 commented 12 years ago

> Looks like the Temp Table for the embedded Hive Query got created in the default DB,

It's not the default DB; the temp table was created in my test db, because I typed use test before testing the SQL windowing function.

> whereas you are using the 'big' db in your session. This looks like a bug; will look into it in a couple of days. Can you confirm by trying your example in the default db.

I don't know why it is using the big db; I never ran any command to use the big db. If you have any concerns, please let me know. Thanks.

penghao122 commented 12 years ago

I tried the show databases command. Maybe big is being used as the default db? That is my assumption.

hive> show databases;
OK
big
default
hqg
hsq_mc
liwei
overclock
ph
test
test_js
xhh
xhh50w
xhh_big
xhh_js
Time taken: 0.051 seconds
hive>

hbutani commented 12 years ago

Ok, I know the bug you are running into. It is actually fixed in the code, but the fix is not in the download version. Can you do a build? If not, I can email you the latest jar.

penghao122 commented 12 years ago

Thanks for the reply. I have no environment to build the code, so can you send me the latest jar? My mail: penghao122@gmail.com. Thanks.

hbutani commented 12 years ago

It's attached.

regards, Harish.


penghao122 commented 12 years ago

The latest jar fixes the above issue. Thanks, Harish. Note that the SQL must specify the output path.

If you need to insert into a table, use SQL like the following:

from part
partition by p_mfgr
order by p_mfgr, p_name desc
with rank() as r
select p_partkey, p_mfgr, p_name, p_size, r
into path='/tmp/wout2'
load overwrite into table part_win