haojinIntel opened this issue 3 years ago
It seems that some dependency repositories have shut down, so you have to manage the dependencies for this jar manually.
We had to remove the sbt-spark-package plugin by erasing it from project/plugins.sbt, and include the Spark dependencies manually, as shown below. We changed a few things in the following files:
In build.sbt you must change this:
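The picture referenced above is not preserved, so this is only a rough sketch of the kind of change meant, assuming Spark 2.4.x on Scala 2.11 (the exact versions are not shown in the thread). It presumably replaces the plugin-provided settings (e.g. sparkVersion/sparkComponents) with explicit Spark dependencies:

```scala
// build.sbt -- sketch only; swap in the Spark/Scala versions you actually target.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "2.4.5" % "provided",
  "org.apache.spark" %% "spark-sql"   % "2.4.5" % "provided",
  "org.apache.spark" %% "spark-hive"  % "2.4.5" % "provided",
  "org.apache.spark" %% "spark-mllib" % "2.4.5" % "provided"
)
```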
And this in project/plugins.sbt:
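Likewise a sketch for project/plugins.sbt. The resolver line matches the one quoted later in this thread; the addSbtPlugin coordinates and the 0.2.6 version are assumptions, not copied from the repo:

```scala
// project/plugins.sbt -- delete (or comment out) the sbt-spark-package plugin
// and its Bintray resolver:
// resolvers += "Spark Packages repo" at "https://dl.bintray.com/spark-packages/maven/"
// addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")
```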
That worked for us!
Edit: I forgot to mention that we had to remove the sbt-spark-package plugin.
@haojinIntel I forgot to mention that you must remove the sbt-spark-package plugin; I've edited the previous message 😄
@evanye Where could I download those jars, and where should I put them? Sorry, I'm new to sbt.
You must build the jars as explained in https://github.com/databricks/spark-sql-perf#build
For anyone stumbling across this issue, it can be fixed by changing this line in project/plugins.sbt:

```scala
resolvers += "Spark Packages repo" at "https://dl.bintray.com/spark-packages/maven/"
```

to

```scala
resolvers += "Spark Packages repo" at "https://repos.spark-packages.org/"
```

As already noted in PRs #204 and #206.
I've installed sbt 0.13.15 in my environment. I ran `sbt package` and hit the following exceptions: Has anyone met a similar issue, and how can I fix it?