cloudera / hue

Open source SQL Query Assistant service for Databases/Warehouses
https://cloudera.com
Apache License 2.0

Hue hangs when Hive/HDFS are accessed through the hue-web-ui #29

Closed rameshrk closed 11 years ago

rameshrk commented 11 years ago

Hi,

Firstly, I would like to acknowledge your work! I was trying to install Hue on Mac OS X 10.7.4 (I hope Hue can work outside the Cloudera VMs as well). I sailed through building Hue, fixing the MySQL errors, and configuring the hue user/group, and I changed hue.ini to point to my HADOOP_HOME/HIVE_HOME; Oozie is working. However, I was getting the following errors. Let me know if I made a mistake, and a ton of thanks in advance.

```
13/05/19 06:10:00 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
13/05/19 06:10:00 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MDatabase.parameters [Table : DATABASE_PARAMS]
13/05/19 06:10:00 INFO Datastore.Schema: Validating 2 index(es) for table DBS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 0 foreign key(s) for table DBS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 2 unique key(s) for table DBS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 2 index(es) for table DATABASE_PARAMS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 1 foreign key(s) for table DATABASE_PARAMS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 1 unique key(s) for table DATABASE_PARAMS
13/05/19 06:10:01 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MDatabase
13/05/19 06:10:01 INFO beeswax.Server: Started new Beeswax Thrift metaserver on port [8003]...
13/05/19 06:10:01 INFO beeswax.Server: minWorkerThreads = 5
13/05/19 06:10:01 INFO beeswax.Server: maxWorkerThreads = 2147483647
13/05/19 06:17:49 INFO metastore.HiveMetaStore: 1: get_all_databases
13/05/19 06:17:49 INFO metastore.HiveMetaStore: 1: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
13/05/19 06:17:49 INFO metastore.ObjectStore: ObjectStore, initialize called
13/05/19 06:17:49 INFO metastore.ObjectStore: Initialized ObjectStore
Exception in thread "pool-1-thread-1" java.lang.NoClassDefFoundError: org/apache/thrift/scheme/StandardScheme
    at com.cloudera.beeswax.api.BeeswaxService$get_default_configuration_args.<init>(BeeswaxService.java:8805)
    at com.cloudera.beeswax.api.BeeswaxService$Processor$get_default_configuration.getEmptyArgsInstance(BeeswaxService.java:1115)
    at com.cloudera.beeswax.api.BeeswaxService$Processor$get_default_configuration.getEmptyArgsInstance(BeeswaxService.java:1109)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:19)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:680)
Caused by: java.lang.ClassNotFoundException: org.apache.thrift.scheme.StandardScheme
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    ... 9 more
13/05/19 06:20:11 INFO metastore.HiveMetaStore: 1: get_all_databases
Exception in thread "pool-1-thread-2" java.lang.NoClassDefFoundError: Could not initialize class com.cloudera.beeswax.api.BeeswaxService$get_default_configuration_args
    at com.cloudera.beeswax.api.BeeswaxService$Processor$get_default_configuration.getEmptyArgsInstance(BeeswaxService.java:1115)
    at com.cloudera.beeswax.api.BeeswaxService$Processor$get_default_configuration.getEmptyArgsInstance(BeeswaxService.java:1109)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:19)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:680)
13/05/19 06:22:11 INFO metastore.HiveMetaStore: 1: get_all_databases
Exception in thread "pool-1-thread-3" java.lang.NoClassDefFoundError: org/apache/thrift/scheme/StandardScheme
    at com.cloudera.beeswax.api.BeeswaxService$query_args.<init>(BeeswaxService.java:1184)
    at com.cloudera.beeswax.api.BeeswaxService$Processor$query.getEmptyArgsInstance(BeeswaxService.java:905)
    at com.cloudera.beeswax.api.BeeswaxService$Processor$query.getEmptyArgsInstance(BeeswaxService.java:899)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:19)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:680)
Caused by: java.lang.ClassNotFoundException: org.apache.thrift.scheme.StandardScheme
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    ... 9 more
```
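For what it's worth (my reading, not stated in the thread): a NoClassDefFoundError for org.apache.thrift.scheme.StandardScheme usually means the libthrift jar on the classpath predates the org.apache.thrift.scheme package, and Hive 0.9 ships libthrift-0.7.0.jar. One way to confirm is to check whether that jar actually bundles the class; here is a small sketch (the jar_contains_class helper is illustrative, not part of Hue, and the demo uses an in-memory stand-in jar):

```python
import io
import zipfile

def jar_contains_class(jar_file, class_name):
    """Return True if the jar (a zip archive) bundles the given Java class."""
    entry = class_name.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_file) as jar:
        return entry in jar.namelist()

# Build an in-memory stand-in jar containing only one Thrift 0.7-era class.
# Against the real /Users/hadoop/Works/hive-0.9.0/lib/libthrift-0.7.0.jar,
# the same lookup for org.apache.thrift.scheme.StandardScheme would show
# whether the class is available at all.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as jar:
    jar.writestr("org/apache/thrift/TBase.class", b"")

print(jar_contains_class(buf, "org.apache.thrift.TBase"))                  # True
print(jar_contains_class(buf, "org.apache.thrift.scheme.StandardScheme"))  # False
```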

rameshrk commented 11 years ago

Adding the whole stack trace here before it vanishes. BTW, I read that you had said somewhere: "To sum up: Hue 2.1 works with Hive 0.9; Hue 2.2 works with Hive 0.10."

I use Hue 2.3.0 with Hive 0.9, since I took the stable version of each.

```
L-IDC42MDV7M-M:hue-2.3.0 hadoop$ sudo /usr/share/hue/build/env/bin/supervisor
$HADOOP_HOME=
$HADOOP_BIN=/Users/hadoop/Works/hadoop-1.0.4/bin/hadoop
$HIVE_CONF_DIR=/Users/hadoop/Works/hive-0.9.0/conf
$HIVEHOME=/Users/hadoop/Works/hive-0.9.0
find: illegal option -- n
find: illegal option -- a
find: illegal option -- m
find: illegal option -- e
find: hue-plugins.jar: No such file or directory
$HADOOP_CLASSPATH=:/Users/hadoop/Works/hive-0.9.0/lib/antlr-runtime-3.0.1.jar:/Users/hadoop/Works/hive-0.9.0/lib/commons-cli-1.2.jar:/Users/hadoop/Works/hive-0.9.0/lib/commons-codec-1.3.jar:/Users/hadoop/Works/hive-0.9.0/lib/commons-collections-3.2.1.jar:/Users/hadoop/Works/hive-0.9.0/lib/commons-dbcp-1.4.jar:/Users/hadoop/Works/hive-0.9.0/lib/commons-lang-2.4.jar:/Users/hadoop/Works/hive-0.9.0/lib/commons-logging-1.0.4.jar:/Users/hadoop/Works/hive-0.9.0/lib/commons-logging-api-1.0.4.jar:/Users/hadoop/Works/hive-0.9.0/lib/commons-pool-1.5.4.jar:/Users/hadoop/Works/hive-0.9.0/lib/datanucleus-connectionpool-2.0.3.jar:/Users/hadoop/Works/hive-0.9.0/lib/datanucleus-core-2.0.3.jar:/Users/hadoop/Works/hive-0.9.0/lib/datanucleus-enhancer-2.0.3.jar:/Users/hadoop/Works/hive-0.9.0/lib/datanucleus-rdbms-2.0.3.jar:/Users/hadoop/Works/hive-0.9.0/lib/derby-10.4.2.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/guava-r09.jar:/Users/hadoop/Works/hive-0.9.0/lib/hbase-0.92.0-tests.jar:/Users/hadoop/Works/hive-0.9.0/lib/hbase-0.92.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/hive-builtins-0.9.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/hive-cli-0.9.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/hive-common-0.9.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/hive-contrib-0.9.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/hive-exec-0.9.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/hive-hbase-handler-0.9.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/hive-hwi-0.9.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/hive-jdbc-0.9.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/hive-metastore-0.9.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/hive-pdk-0.9.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/hive-serde-0.9.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/hive-service-0.9.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/hive-shims-0.9.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/hive_contrib.jar:/Users/hadoop/Works/hive-0.9.0/lib/jackson-core-asl-1.8.8.jar:/Users/hadoop/Works/hive-0.9.0/lib/jackson-jaxrs-1.8.8.jar:/Users/hadoop/Works/hive-0.9.0/lib/jackson-mapper-asl-1.8.8.jar:/Users/hadoop/Works/hive-0.9.0/lib/jackson-xc-1.8.8.jar:/Users/hadoop/Works/hive-0.9.0/lib/JavaEWAH-0.3.2.jar:/Users/hadoop/Works/hive-0.9.0/lib/jdo2-api-2.3-ec.jar:/Users/hadoop/Works/hive-0.9.0/lib/jline-0.9.94.jar:/Users/hadoop/Works/hive-0.9.0/lib/json-20090211.jar:/Users/hadoop/Works/hive-0.9.0/lib/libfb303-0.7.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/libfb303.jar:/Users/hadoop/Works/hive-0.9.0/lib/libthrift-0.7.0.jar:/Users/hadoop/Works/hive-0.9.0/lib/libthrift.jar:/Users/hadoop/Works/hive-0.9.0/lib/log4j-1.2.16.jar:/Users/hadoop/Works/hive-0.9.0/lib/mysql-connector-java-5.1.25-bin.jar:/Users/hadoop/Works/hive-0.9.0/lib/slf4j-api-1.6.1.jar:/Users/hadoop/Works/hive-0.9.0/lib/slf4j-log4j12-1.6.1.jar:/Users/hadoop/Works/hive-0.9.0/lib/stringtemplate-3.1-b1.jar:/Users/hadoop/Works/hive-0.9.0/lib/zookeeper-3.4.3.jar:
$HADOOP_OPTS=-Dlog4j.configuration=log4j.properties
$HADOOP_CONF_DIR=/Users/hadoop/Works/hive-0.9.0/conf:/Users/hadoop/Works/hadoop-1.0.4/conf
$HADOOP_MAPREDHOME=/Users/hadoop/Works/hadoop-1.0.4
CWD=/Users/hadoop/Works/hue-2.3.0
Executing /Users/hadoop/Works/hadoop-1.0.4/bin/hadoop jar /usr/share/hue/apps/beeswax/src/beeswax/../../java-lib/BeeswaxServer.jar --beeswax 8002 --desktop-host 127.0.0.1 --desktop-port 8888 --query-lifetime 604800000 --metastore 8003 (58958)
** Controller starting at Sun May 19 06:09:47 2013
Should start 1 new children
Controller.spawn_children(number=1)
13/05/19 06:09:56 INFO beeswax.Server: Starting metastore at port 8003
13/05/19 06:09:56 INFO beeswax.Server: Starting beeswaxd at port 8002
13/05/19 06:09:56 INFO beeswax.Server: Parsed core-default.xml sucessfully. Learned 55 descriptions.
13/05/19 06:09:56 INFO beeswax.Server: Parsed hdfs-default.xml sucessfully. Learned 51 descriptions.
13/05/19 06:09:56 INFO beeswax.Server: Parsed mapred-default.xml sucessfully. Learned 121 descriptions.
13/05/19 06:09:56 WARN beeswax.Server: Could not parse or find: hive-default.xml. Learned 0 description, this is not a problem.
13/05/19 06:09:56 INFO beeswax.Server: Starting beeswax server on port 8002, talking back to Desktop at 127.0.0.1:8888
13/05/19 06:09:56 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
13/05/19 06:09:56 INFO metastore.ObjectStore: ObjectStore, initialize called
13/05/19 06:09:57 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
13/05/19 06:09:57 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
13/05/19 06:09:57 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
13/05/19 06:09:57 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
13/05/19 06:09:57 INFO DataNucleus.Persistence: Property javax.jdo.option.NonTransactionalRead unknown - will be ignored
13/05/19 06:09:57 INFO DataNucleus.Persistence: ================= Persistence Configuration ===============
13/05/19 06:09:57 INFO DataNucleus.Persistence: DataNucleus Persistence Factory - Vendor: "DataNucleus" Version: "2.0.3"
13/05/19 06:09:57 INFO DataNucleus.Persistence: DataNucleus Persistence Factory initialised for datastore URL="jdbc:mysql://localhost/metastore_db?createDatabaseIfNotExist=true" driver="com.mysql.jdbc.Driver" userName="hadoop"
13/05/19 06:09:57 INFO DataNucleus.Persistence: ===========================================================
13/05/19 06:09:58 INFO Datastore.Schema: Creating table DELETEME1368968998709
13/05/19 06:09:59 INFO Datastore.Schema: Schema Name could not be determined for this datastore
13/05/19 06:09:59 INFO Datastore.Schema: Dropping table DELETEME1368968998709
13/05/19 06:09:59 INFO Datastore.Schema: Initialising Catalog "metastore_db", Schema "" using "None" auto-start option
13/05/19 06:09:59 INFO Datastore.Schema: Catalog "metastoredb", Schema "" initialised - managing 0 classes
13/05/19 06:09:59 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
13/05/19 06:09:59 INFO DataNucleus.MetaData: Registering listener for metadata initialisation
13/05/19 06:09:59 INFO metastore.ObjectStore: Initialized ObjectStore
13/05/19 06:10:00 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/Users/hadoop/Works/hive-0.9.0/lib/hive-metastore-0.9.0.jar!/package.jdo" at line 28, column 6 : cvc-elt.1: Cannot find the declaration of element 'jdo'. - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
13/05/19 06:10:00 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/Users/hadoop/Works/hive-0.9.0/lib/hive-metastore-0.9.0.jar!/package.jdo" at line 338, column 13 : The content of element type "class" must match "(extension,implements,datastore-identity?,primary-key?,inheritance?,version?,join,foreign-key,index,unique,column,field,property,query,fetch-group,extension)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
13/05/19 06:10:00 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/Users/hadoop/Works/hive-0.9.0/lib/hive-metastore-0.9.0.jar!/package.jdo" at line 385, column 13 : The content of element type "class" must match "(extension,implements,datastore-identity?,primary-key?,inheritance?,version?,join,foreign-key,index,unique,column,field,property,query,fetch-group,extension)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
13/05/19 06:10:00 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/Users/hadoop/Works/hive-0.9.0/lib/hive-metastore-0.9.0.jar!/package.jdo" at line 407, column 13 : The content of element type "class" must match "(extension,implements,datastore-identity?,primary-key?,inheritance?,version?,join,foreign-key,index,unique,column,field,property,query,fetch-group,extension)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
13/05/19 06:10:00 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/Users/hadoop/Works/hive-0.9.0/lib/hive-metastore-0.9.0.jar!/package.jdo" at line 442, column 13 : The content of element type "class" must match "(extension,implements,datastore-identity?,primary-key?,inheritance?,version?,join,foreign-key,index,unique,column,field,property,query,fetch-group,extension)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
13/05/19 06:10:00 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/Users/hadoop/Works/hive-0.9.0/lib/hive-metastore-0.9.0.jar!/package.jdo" at line 479, column 13 : The content of element type "class" must match "(extension,implements,datastore-identity?,primary-key?,inheritance?,version?,join,foreign-key,index,unique,column,field,property,query,fetch-group,extension)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
13/05/19 06:10:00 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/Users/hadoop/Works/hive-0.9.0/lib/hive-metastore-0.9.0.jar!/package.jdo" at line 520, column 13 : The content of element type "class" must match "(extension,implements,datastore-identity?,primary-key?,inheritance?,version?,join,foreign-key,index,unique,column,field,property,query,fetch-group,extension)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
13/05/19 06:10:00 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/Users/hadoop/Works/hive-0.9.0/lib/hive-metastore-0.9.0.jar!/package.jdo" at line 561, column 13 : The content of element type "class" must match "(extension,implements,datastore-identity?,primary-key?,inheritance?,version?,join,foreign-key,index,unique,column,field,property,query,fetch-group,extension)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
13/05/19 06:10:00 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/Users/hadoop/Works/hive-0.9.0/lib/hive-metastore-0.9.0.jar!/package.jdo" at line 602, column 13 : The content of element type "class" must match "(extension,implements,datastore-identity?,primary-key?,inheritance?,version?,join,foreign-key,index,unique,column,field,property,query,fetch-group,extension)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
13/05/19 06:10:00 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/Users/hadoop/Works/hive-0.9.0/lib/hive-metastore-0.9.0.jar!/package.jdo" at line 647, column 13 : The content of element type "class" must match "(extension,implements,datastore-identity?,primary-key?,inheritance?,version?,join,foreign-key,index,unique,column,field,property,query,fetch-group,extension)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
13/05/19 06:10:00 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/Users/hadoop/Works/hive-0.9.0/lib/hive-metastore-0.9.0.jar!/package.jdo" at line 692, column 13 : The content of element type "class" must match "(extension,implements,datastore-identity?,primary-key?,inheritance?,version?,join,foreign-key,index,unique,column,field,property,query,fetch-group,extension)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
13/05/19 06:10:00 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/Users/hadoop/Works/hive-0.9.0/lib/hive-metastore-0.9.0.jar!/package.jdo" at line 720, column 13 : The content of element type "class" must match "(extension,implements,datastore-identity?,primary-key?,inheritance?,version?,join,foreign-key,index,unique,column,field,property,query,fetch-group,extension)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
13/05/19 06:10:00 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
13/05/19 06:10:00 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MDatabase.parameters [Table : DATABASE_PARAMS]
13/05/19 06:10:00 INFO Datastore.Schema: Validating 2 index(es) for table DBS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 0 foreign key(s) for table DBS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 2 unique key(s) for table DBS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 2 index(es) for table DATABASE_PARAMS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 1 foreign key(s) for table DATABASE_PARAMS
13/05/19 06:10:00 INFO Datastore.Schema: Validating 1 unique key(s) for table DATABASE_PARAMS
13/05/19 06:10:01 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MDatabase
13/05/19 06:10:01 INFO beeswax.Server: Started new Beeswax Thrift metaserver on port [8003]...
13/05/19 06:10:01 INFO beeswax.Server: minWorkerThreads = 5
13/05/19 06:10:01 INFO beeswax.Server: maxWorkerThreads = 2147483647
13/05/19 06:17:49 INFO metastore.HiveMetaStore: 1: get_all_databases
13/05/19 06:17:49 INFO metastore.HiveMetaStore: 1: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
13/05/19 06:17:49 INFO metastore.ObjectStore: ObjectStore, initialize called
13/05/19 06:17:49 INFO metastore.ObjectStore: Initialized ObjectStore
Exception in thread "pool-1-thread-1" java.lang.NoClassDefFoundError: org/apache/thrift/scheme/StandardScheme
    at com.cloudera.beeswax.api.BeeswaxService$get_default_configuration_args.<init>(BeeswaxService.java:8805)
    at com.cloudera.beeswax.api.BeeswaxService$Processor$get_default_configuration.getEmptyArgsInstance(BeeswaxService.java:1115)
    at com.cloudera.beeswax.api.BeeswaxService$Processor$get_default_configuration.getEmptyArgsInstance(BeeswaxService.java:1109)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:19)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:680)
Caused by: java.lang.ClassNotFoundException: org.apache.thrift.scheme.StandardScheme
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    ... 9 more
13/05/19 06:20:11 INFO metastore.HiveMetaStore: 1: get_all_databases
Exception in thread "pool-1-thread-2" java.lang.NoClassDefFoundError: Could not initialize class com.cloudera.beeswax.api.BeeswaxService$get_default_configuration_args
    at com.cloudera.beeswax.api.BeeswaxService$Processor$get_default_configuration.getEmptyArgsInstance(BeeswaxService.java:1115)
    at com.cloudera.beeswax.api.BeeswaxService$Processor$get_default_configuration.getEmptyArgsInstance(BeeswaxService.java:1109)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:19)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:680)
13/05/19 06:22:11 INFO metastore.HiveMetaStore: 1: get_all_databases
Exception in thread "pool-1-thread-3" java.lang.NoClassDefFoundError: org/apache/thrift/scheme/StandardScheme
    at com.cloudera.beeswax.api.BeeswaxService$query_args.<init>(BeeswaxService.java:1184)
    at com.cloudera.beeswax.api.BeeswaxService$Processor$query.getEmptyArgsInstance(BeeswaxService.java:905)
    at com.cloudera.beeswax.api.BeeswaxService$Processor$query.getEmptyArgsInstance(BeeswaxService.java:899)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:19)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:680)
Caused by: java.lang.ClassNotFoundException: org.apache.thrift.scheme.StandardScheme
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    ... 9 more
```
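An incidental but telling detail in the output above (my reading, not stated in the thread): the "find: illegal option -- n/a/m/e" lines are characteristic of BSD find on Mac OS X rejecting a GNU-style invocation such as `find -name hue-plugins.jar` with no search path, parsing `-name` as the single-letter options n, a, m, e. The portable form puts the path first:

```shell
# BSD find (Mac OS X) requires the search path before any predicate;
# "find -name hue-plugins.jar" fails there with "illegal option -- n",
# matching the lines in the supervisor output above.
find . -name 'hue-plugins.jar'
```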

romainr commented 11 years ago

Hi, thanks for the feedback! Yes, currently Hue is tied to its version of Hive, namely 0.10 with Hue 2.3: http://cloudera.github.io/hue/docs-2.3.0/release-notes/release-notes-2.3.0.html

This is because Beeswax is tied to it. This will be fixed in Hue 3.0.
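To make the pairing concrete, here is the version table as stated in this thread, expressed as a tiny lookup (HUE_TO_HIVE and compatible are illustrative names, not a Hue API):

```python
# Hue-to-Hive pairing taken from this thread:
# Hue 2.1 -> Hive 0.9, Hue 2.2 -> Hive 0.10, Hue 2.3 -> Hive 0.10.
HUE_TO_HIVE = {"2.1": "0.9", "2.2": "0.10", "2.3": "0.10"}

def compatible(hue_version, hive_version):
    """True if this thread's table pairs the two versions."""
    return HUE_TO_HIVE.get(hue_version) == hive_version

print(compatible("2.3", "0.10"))  # True
print(compatible("2.3", "0.9"))   # False: the reporter's combination
```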

In the meantime, replacing /usr/share/hue/apps/beeswax/src/beeswax/../../java-lib/BeeswaxServer.jar with the Beeswax.jar from Hue 2.2 might work, but I can't guarantee it.

Any reason not to use Hive 0.10? (That release is good; the latest is even Hive 0.11.)

rameshrk commented 11 years ago

Hi Romain, thanks for your reply. I used Hive 0.9 as it is touted as the stable version, and I will try Hive 0.10 or 0.11.

BTW, similar issues occur when trying to access the "File Browser" and "Job Browser". Here are my configs in brief:
In hdfs-site.xml, "dfs.thrift.address" is "0.0.0.0:10090".
In mapred-site.xml, "jobtracker.thrift.address" is "0.0.0.0:9290" (and "mapred.job.tracker" is "localhost:9001").
In core-site.xml, "fs.default.name" is "hdfs://localhost:9000".
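For reference, the three properties quoted above would sit in the Hadoop config files roughly like this (a sketch assembled from the values in this comment, not a verified working configuration):

```xml
<!-- hdfs-site.xml -->
<property>
  <name>dfs.thrift.address</name>
  <value>0.0.0.0:10090</value>
</property>

<!-- mapred-site.xml -->
<property>
  <name>jobtracker.thrift.address</name>
  <value>0.0.0.0:9290</value>
</property>
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>

<!-- core-site.xml -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>
```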

And in my hue.ini (attached; rename to hue.ini), these are the configurations:

```ini
[[hdfs_clusters]]

  [[[default]]]
    # Enter the filesystem uri
    fs_defaultfs=hdfs://localhost:9000
    HADOOP_HOME=/Users/hadoop/Works/hadoop-1.0.4
    $HADOOP_HOME=/Users/hadoop/Works/hadoop-1.0.4

    # Change this if your HDFS cluster is Kerberos-secured
    ## security_enabled=false

    # Use WebHdfs/HttpFs as the communication mechanism.
    # This should be the web service root URL, such as
    # http://namenode:50070/webhdfs/v1
    webhdfs_url=http://127.0.0.1:50070

    # Settings about this HDFS cluster. If you install HDFS in a
    # different location, you need to set the following.

    # Defaults to $HADOOP_HDFS_HOME or /usr/lib/hadoop-hdfs
    hadoop_hdfs_home=/Users/hadoop/Works/hadoop-1.0.4

    # Defaults to $HADOOP_BIN or /usr/bin/hadoop
    hadoop_bin=/Users/hadoop/Works/hadoop-1.0.4/bin/hadoop

    # Defaults to $HADOOP_CONF_DIR or /etc/hadoop/conf
    hadoop_conf_dir=/Users/hadoop/Works/hadoop-1.0.4/conf

[[mapred_clusters]]

  [[[default]]]
    # Enter the host on which you are running the Hadoop JobTracker
    jobtracker_host=localhost
    # The port where the JobTracker IPC listens on
    jobtracker_port=50030
    # Thrift plug-in port for the JobTracker
    thrift_port=9290
    # Whether to submit jobs to this cluster
    ## submit_to=True

    # Change this if your MapReduce cluster is Kerberos-secured
    ## security_enabled=false

    # Settings about this MR1 cluster. If you install MR1 in a
    # different location, you need to set the following.

    # Defaults to $HADOOP_MR1_HOME or /usr/lib/hadoop-0.20-mapreduce
    hadoop_mapred_home=/Users/hadoop/Works/hadoop-1.0.4

    # Defaults to $HADOOP_BIN or /usr/bin/hadoop
    hadoop_bin=/Users/hadoop/Works/hadoop-1.0.4/bin/hadoop

    # Defaults to $HADOOP_CONF_DIR or /etc/hadoop/conf
    hadoop_conf_dir=/Users/hadoop/Works/hadoop-1.0.4/conf

[beeswax]
  # Host where Beeswax server Thrift daemon is running.
  # If Kerberos security is enabled, the fully-qualified domain name (FQDN) is
  # required, even if the Thrift daemon is running on the same host as Hue.
  beeswax_server_host=

  # Port where Beeswax Thrift server runs on.
  beeswax_server_port=8002

  # Host where internal metastore Thrift daemon is running.
  beeswax_meta_server_host=localhost

  # Configure the port the internal metastore daemon runs on.
  # Used only if hive.metastore.local is true.
  beeswax_meta_server_port=8003

  # Hive home directory
  hive_home_dir=/Users/hadoop/Works/hive-0.9.0

  # Hive configuration directory, where hive-site.xml is located
  hive_conf_dir=/Users/hadoop/Works/hive-0.9.0/conf

  # Timeout in seconds for thrift calls to beeswax service
  beeswax_server_conn_timeout=120

  # Timeout in seconds for thrift calls to the hive metastore
  metastore_conn_timeout=10

  # Maximum Java heapsize (in megabytes) used by Beeswax Server.
  # Note that the setting of HADOOP_HEAPSIZE in $HADOOP_CONF_DIR/hadoop-env.sh
  # may override this setting.
  beeswax_server_heapsize=1000

  # Share saved queries with all users. If set to false, saved queries are
  # visible only to the owner and administrators.
  share_saved_queries=true

  # The backend to contact for queries/metadata requests.
  # Choices are 'beeswax' (default), 'hiveserver2'.
  server_interface=beeswax

  # Time in milliseconds for Beeswax to persist queries in its cache.
  # 7_24_60_60_1000 = 1 week
  beeswax_running_query_lifetime=604800000L
```
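As a quick sanity check on the "7_24_60_60_1000 = 1 week" comment above:

```python
# One week in milliseconds, derived exactly as the hue.ini comment
# suggests: 7 days * 24 hours * 60 minutes * 60 seconds * 1000 ms.
week_ms = 7 * 24 * 60 * 60 * 1000
print(week_ms)  # 604800000, the beeswax_running_query_lifetime value
```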

romainr commented 11 years ago

Ok for Hive!

For HDFS and the JobTracker, the hue.ini default values are good. I guess what you are missing is the correct Hadoop configuration: http://www.cloudera.com/content/cloudera-content/cloudera-docs/CDH4/latest/CDH4-Installation-Guide/cdh4ig_topic_15_4.html

rameshrk commented 11 years ago

Hi Romain,

I appreciate the time you have devoted to answering my queries. I checked those settings, but to no avail; the exception is still thrown. I would like to know your thoughts on the correct way to install Hue: should we take the Cloudera Hadoop source distribution, build it, and then install Hue? Let me know.

regards Ramesh

romainr commented 11 years ago

Yes, you need to have Hive 0.10 installed on the same Hue machine.

In general it is much simpler to install a packaged version (I know some people who use Hue 2.3 with an Apache Hive tarball release; it works but requires more work), e.g.: Hue package http://www.cloudera.com/content/cloudera-content/cloudera-docs/CDH4/latest/CDH4-Installation-Guide/cdh4ig_topic_15_3.html

CDH http://www.cloudera.com/content/cloudera-content/cloudera-docs/CDH4/latest/CDH4-Installation-Guide/cdh4ig_topic_4_4.html

Note that in early June the next version of CDH will include Hue 2.3 plus a series of improvements.