nagavarunkumarkolla opened this issue 8 years ago
I have another error:
scala> val tpcds = new TPCDS (sqlContext = sqlContext)
error: missing or invalid dependency detected while loading class file 'Benchmarkable.class'.
Could not access term typesafe in package com,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with -Ylog-classpath
to see the problematic classpath.)
A full rebuild may help if 'Benchmarkable.class' was compiled against an incompatible version of com.
error: missing or invalid dependency detected while loading class file 'Benchmarkable.class'.
Could not access term scalalogging in value com.typesafe,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with -Ylog-classpath
to see the problematic classpath.)
A full rebuild may help if 'Benchmarkable.class' was compiled against an incompatible version of com.typesafe.
@nagavarunkumarkolla Were you able to solve it? @lordk911 Did you try using version 0.4.3? That helped solve the error for me.
@lordk911 I saw the same problem as you with the master version; I solved it by adding the parameter "--packages com.typesafe.scala-logging:scala-logging-slf4j_2.10:2.1.2" to spark-shell.
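For anyone hitting the same thing, the full invocation would look roughly like this (the spark-sql-perf jar path is only a placeholder; point it at whatever assembly you built):

spark-shell --jars /path/to/spark-sql-perf-assembly.jar --packages com.typesafe.scala-logging:scala-logging-slf4j_2.10:2.1.2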
@orcguru I added "--packages com.typesafe.scala-logging:scala-logging-slf4j_2.10:2.1.2" as you suggested and it worked as expected, but while creating the DataFrame for the results I am getting:
loader constraint violation: when resolving method "org.apache.spark.sql.SQLContext.createDataFrame(Lscala/collection/Seq;Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/Dataset;" the class loader (instance of org/apache/spark/util/ChildFirstURLClassLoader) of the current class, com/databricks/spark/sql/perf/Benchmark$ExperimentStatus, and the class loader (instance of sun/misc/Launcher$AppClassLoader) for the method's defining class, org/apache/spark/sql/SQLContext, have different Class objects for the type scala/reflect/api/TypeTags$TypeTag used in the signature
Please help me resolve this. I changed the scope of the dependency to provided, but it still didn't help.
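In case it helps, this is roughly what marking Spark as provided in build.sbt looks like (versions here are only illustrative; match them to the Spark and Scala versions on your cluster):

// build.sbt sketch: keep Spark out of the assembly jar so the driver's own
// Spark/Scala classes are used instead of duplicate copies bundled in the jar
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.3" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.6.3" % "provided"
)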
Hi,
I am facing the issues below when I try to run this code. For this command:
1) val experiment = tpcds.runExperiment(tpcds.interactiveQueries)
WARN TaskSetManager: Stage 268 contains a task of very large size (331 KB). The maximum recommended task size is 100 KB.
// This takes a very long time. When I press Enter, it just returns to the scala prompt (scala>).
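Note that runExperiment is asynchronous, so getting the prompt back right away is expected; if I remember the spark-sql-perf API correctly, you can block until the run completes with something like:

scala> experiment.waitForFinish(60 * 60 * 10)  // wait up to 10 hours for the experiment to finish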
2) tpcds.createResultsTable()
error: value createResultsTable is not a member of com.databricks.spark.sql.perf.tpcds.TPCDS
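createResultsTable may simply not exist in the version of spark-sql-perf you built; as a workaround you can read the JSON results that the experiment writes out yourself (the path below is just a placeholder for whatever resultLocation you configured):

scala> val results = sqlContext.read.json("/path/to/your/resultLocation")
scala> results.printSchema()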