Closed: suolemen closed this issue 10 years ago.
Command changed to:

```
hadoop jar splout-hadoop-0.3.0-hadoop.jar simple-generate -libjars /usr/lib/hive/lib/hive-ant-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-beeline-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-cli-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-common-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-contrib-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-exec-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-hbase-handler-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-hwi-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-jdbc-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-metastore-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-serde-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-service-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-shims-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-shims-0.23-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-shims-common-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-shims-common-secure-0.12.0-cdh5.0.0.jar,/usr/lib/hive/lib/hive-testutils-0.12.0-cdh5.0.0.jar -it HIVE -hdb default -htn liyg_not_parquet -o out-hive-simple -pby cityname_name -p 2 -t liyg_not_parquet_of_me -tb liyg_not_parquet_sqlout
```
That solved the problem!! ha-ha
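For the record, the same `-libjars` list can be built with a shell glob instead of typing every jar by hand. A small sketch, assuming all the Hive jars sit under /usr/lib/hive/lib as in the command above (adjust the path for your install):

```
#!/bin/sh
# Collect every Hive jar into a comma-separated list for -libjars.
# ls prints one path per line; tr joins them with commas; sed drops the trailing comma.
LIBJARS=$(ls /usr/lib/hive/lib/hive-*.jar | tr '\n' ',' | sed 's/,$//')

hadoop jar splout-hadoop-0.3.0-hadoop.jar simple-generate \
  -libjars "$LIBJARS" \
  -it HIVE -hdb default -htn liyg_not_parquet \
  -o out-hive-simple -pby cityname_name -p 2 \
  -t liyg_not_parquet_of_me -tb liyg_not_parquet_sqlout
```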
But when I run the deploy command:

```
hadoop jar splout-hadoop-0.3.0-hadoop.jar deploy -root out-hive-simple -ts liyg_not_parquetsqlout -q http://192.*.*.*:4412
```

it reports an exception:

```
INFO [pool-2-thread-1] qnode.Deployer (Deployer.java:run(114)) - 4412 Executing deploy for version [1411530526]
2014-09-24 11:48:47,260 ERROR [pool-2-thread-1] qnode.Deployer (Deployer.java:explainErrors(243)) - Deployment of version [1411530526] failed in DNode[192.168.50.6:4422] - it failed with the error
[java.lang.IllegalArgumentException: Wrong FS: hdfs://***._.com:8020/user/root/out-hive-simple/liyg_not_parquet_sqlout/store/2.db, expected: file:///
	at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:643)
	at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:79)
	at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:506)
	at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:724)
	at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:501)
	at org.apache.hadoop.fs.FileSystem.getContentSummary(FileSystem.java:1441)
	at org.apache.hadoop.fs.FilterFileSystem.getContentSummary(FilterFileSystem.java:379)
	at com.splout.db.dnode.Fetcher.sizeOf(Fetcher.java:357)
	at com.splout.db.dnode.DNodeHandler$2$1.run(DNodeHandler.java:516)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:744)
]
```
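For anyone hitting the same "Wrong FS: hdfs://... , expected: file:///" error: the trace shows the DNode resolving an hdfs:// path with the local filesystem (RawLocalFileSystem). A common cause is that the Hadoop core-site.xml is not on the DNode's classpath, so fs.defaultFS falls back to file:///. A minimal sketch of the property; the NameNode address below is a placeholder, use your cluster's own value:

```
<?xml version="1.0"?>
<!-- core-site.xml that must be visible to the DNode process; without it,
     Hadoop defaults to file:/// and hdfs:// paths fail with "Wrong FS". -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- placeholder NameNode address; replace with your cluster's value -->
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>
```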
It has been solved!!
@suolemen, can I know how you resolved this problem?
Hadoop cluster installed with: cloudera-manager-installer.bin
Hadoop version: Hadoop 2.3.0-cdh5.0.0
Splout version: splout-distribution-0.3.0
env:

```
export SPLOUT_HADOOP_COMMON_HOME=/usr/lib/hadoop
export SPLOUT_HADOOP_HDFS_HOME=/usr/lib/hadoop-hdfs
export SPLOUT_HADOOP_MAPRED_HOME=/usr/lib/hadoop-mapreduce
export SPLOUT_HIVE_HOME=/usr/lib/hive
export SPLOUT_HADOOP_CLASSPATH=$SPLOUT_HIVE_HOME/conf:$SPLOUT_HIVE_HOME/lib/
export HIVE_HOME=/usr/lib/hive
export HADOOP_CLASSPATH=$HIVE_HOME/conf:$HIVE_HOME/lib/
```
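Note that HADOOP_CLASSPATH only affects the JVM that submits the job; it is not shipped to the map tasks running on the cluster, which is what `-libjars` does in the fixed command earlier in this thread. A quick way to check what the client JVM actually sees (standard Hadoop CLI, nothing Splout-specific):

```
# Print the client-side classpath one entry per line and look for the Hive jars.
hadoop classpath | tr ':' '\n' | grep -i hive
```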
Command:

```
hadoop jar splout-hadoop-0.3.0-hadoop.jar simple-generate -it HIVE -hdb default -htn liyg_not_parquet -o out-hive-simple -pby cityname_name -p 2 -t liyg_not_parquet_of_me -tb liyg_not_parquet_sqlout
```
Error:

```
hadoop.SimpleGeneratorCMD: Generating view with Hadoop...
14/09/23 16:57:27 INFO mapred.FileInputFormat: Total input paths to process : 1
14/09/23 16:57:27 INFO client.RMProxy: Connecting to ResourceManager at mcmaster.infobird.com/192.168.50.6:8032
14/09/23 16:57:28 INFO mapred.FileInputFormat: Total input paths to process : 1
14/09/23 16:57:28 INFO mapreduce.JobSubmitter: number of splits:1
14/09/23 16:57:28 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1411437225960_0036
14/09/23 16:57:29 INFO impl.YarnClientImpl: Submitted application application_1411437225960_0036
14/09/23 16:57:29 INFO mapreduce.Job: The url to track the job: http://mcmaster.*****.com:8088/proxy/application_1411437225960_0036/
14/09/23 16:57:29 INFO mapreduce.Job: Running job: job_1411437225960_0036
14/09/23 16:57:43 INFO mapreduce.Job: Job job_1411437225960_0036 running in uber mode : false
14/09/23 16:57:43 INFO mapreduce.Job: map 0% reduce 0%
14/09/23 16:57:50 INFO mapreduce.Job: Task Id : attempt_1411437225960_0036_m_000000_0, Status : FAILED
Error: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.metadata.HiveStorageHandler
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.apache.hcatalog.mapreduce.HCatSplit.readFields(HCatSplit.java:142)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
	at com.datasalt.pangool.tuplemr.mapred.lib.input.TaggedInputSplit.readFields(TaggedInputSplit.java:109)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
	at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
	at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:371)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:731)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
```
What is the problem? I hope someone can help me! Thanks.