-
Hi,
I "flattened" an Adam file - storing chromosome 1 of HG 1000genome. Then I wanted to start an SQL-query on this data:
That's what I tried ....
scala> val sqlRDD2 = sqlContext.parquetFile("hdfs:/…
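The command above is cut off; as a point of reference, a minimal sketch of loading a flattened ADAM Parquet file with Spark SQL and querying it might look like the following (assuming a Spark 1.3+ spark-shell where `sc` and `sqlContext` are predefined; the HDFS path, the table name `chr1`, and the count query are placeholders, and real column names depend on the flattened ADAM schema):
```scala
// Sketch only: run inside spark-shell, where sc and sqlContext already exist.
// The path below is a placeholder for the flattened ADAM Parquet output.
val flattened = sqlContext.parquetFile("hdfs:///user/me/hg.chr1.flat.adam")

// Register the data as a temporary table so it can be queried with SQL.
flattened.registerTempTable("chr1")

// Example query; replace with whatever projection or filter you need.
val result = sqlContext.sql("SELECT count(*) FROM chr1")
result.show()
```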
-
I installed Apache Hive 0.13, Apache HBase 0.98, Apache Hadoop 2.4.0, and
Kylin 0.7.1 on Linux (CentOS 6).
When I run kylin.sh start, I have an issue about the hcatalog lib not being found;
I didn't find the reaso…
-
Hi, we configured everything (apparently) correctly, and we get the following error when we run:
```
Sys.setenv(HADOOP_CMD="/opt/cloudera/parcels/CDH-5.4.5-1.cdh5.4.5.p0.7/bin/hadoop")
Sys.setenv(RHIVE_HIVESERVER_VER…
```
-
```
See:
http://c1n8.gbif.org:50060/tasklog?attemptid=attempt_201104132224_2696_m_000000_0&all=true
```
About 1/20 jobs fail for this reason, and it is not a WF bug. Something to investigate.
Subs…
-
root@pts00450-vm8:/home/hduser/HiBench# bin/run-all.sh
Prepare aggregation ...
Exec script: /home/hduser/HiBench/workloads/aggregation/prepare/prepare.sh
Parsing conf: /home/hduser/HiBench/conf/00-def…
-
Environment: CDH 3.5
HiBench: 4.0
When running any Spark benchmark, it fails when trying to perform the Hive aggregation step.
15/08/21 13:52:24 INFO cluster.SparkDeploySchedulerBackend: SchedulerBackend…
-
'es.resource' = 'apache-2014.09./apache-access' or
'es.resource' = 'apache-2014.09.29,apache-2014.09.30/apache-access'
do not work well with the HiveQL query 'select count(*) from test'.
The count …
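The Hive table definition is cut off above; as a hedged cross-check (not the Hive integration from the question), the same es.resource values can also be exercised from Spark SQL through the elasticsearch-spark connector. Apart from the index/type names quoted in the question, everything below is an assumption (Spark 1.4+ shell, es-hadoop 2.1+ jar on the classpath, Elasticsearch reachable on localhost:9200):
```scala
// Sketch using elasticsearch-spark, not the Hive integration from the question.
// Assumes spark-shell with the elasticsearch-hadoop connector on the classpath,
// where sc and sqlContext are predefined.
val df = sqlContext.read
  .format("org.elasticsearch.spark.sql")
  .option("es.nodes", "localhost:9200")   // assumed Elasticsearch endpoint
  .option("es.resource", "apache-2014.09.29,apache-2014.09.30/apache-access")
  .load()

df.registerTempTable("test")

// Same count as the HiveQL query above.
sqlContext.sql("SELECT count(*) FROM test").show()
```
If the count comes back as expected this way, the multi-index es.resource value itself is probably fine and the issue is more likely on the Hive side.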
-
Just cloned the repo and tried to build the assembly, resulting in 25 failed tests.
[error] Failed: Total 106, Failed 25, Errors 0, Passed 81
[error] Failed tests:
[error] spark.jobserver.JobManag…
-
We have recently upgraded a Hadoop cluster from CDH4 to CDH5.2 and also picked up a pretty big set of changes in Hue.
The environment currently has approx 17k tables with a variety of column complexi…
-
In previous versions of Hive (0.12.x and below), only CDH Hive had a getLog function.
So jshs2 detects whether the getLog function exists based on hiveType.
In recent versions of Hive (1.0.…