databricks / spark-sql-perf

Apache License 2.0

sbt package failed with unresolved dependency #203

Open haojinIntel opened 3 years ago

haojinIntel commented 3 years ago

I've installed sbt 0.13.15 in my environment. When I run "sbt package" I hit the following exceptions: [screenshot of the error, not recoverable]. Has anyone met a similar issue, and how can I fix this problem?

eavilaes commented 3 years ago

It seems that some dependencies repositories have shut down, so you have to manage dependencies for this jar manually.

We had to remove the sbt-spark-package plugin by erasing it from project/plugins.sbt, and include the Spark dependencies manually instead. We changed a few things in the following files:

On build.sbt you must change this:

[screenshot of the build.sbt changes, not recoverable]
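The screenshot above is lost, but based on the description (dropping sbt-spark-package and declaring the Spark dependencies by hand), the build.sbt change might look roughly like the sketch below. The Spark and Scala versions here are assumptions; match them to the versions your cluster actually uses:

```scala
// Sketch: replace the sparkVersion/sparkComponents settings that
// sbt-spark-package provided with explicit library dependencies.
// "2.4.5" and "2.11.12" are assumed versions, not taken from the thread.
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "2.4.5" % "provided",
  "org.apache.spark" %% "spark-sql"   % "2.4.5" % "provided",
  "org.apache.spark" %% "spark-mllib" % "2.4.5" % "provided",
  "org.apache.spark" %% "spark-hive"  % "2.4.5" % "provided"
)
```

Marking Spark as "provided" keeps it out of the packaged jar, since the Spark runtime supplies those classes when the benchmark is submitted to a cluster.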

And this on project/plugins.sbt

[screenshot of the project/plugins.sbt changes, not recoverable]
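This screenshot is also lost. Since the commenter says the sbt-spark-package plugin was removed, the plugins.sbt change was presumably just deleting its entries; a sketch of what that removal might look like (the plugin coordinates and resolver line shown as removed are assumptions about what the file originally contained):

```scala
// project/plugins.sbt — sketch of the described change.
// Lines like these would be DELETED (assumed original content):
//   resolvers += "Spark Packages repo" at "https://dl.bintray.com/spark-packages/maven/"
//   addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")
// Any remaining plugins stay as they were.
```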

That worked for us!

Edit: I forgot to mention that we had to remove the sbt-spark-package plugin.

eavilaes commented 3 years ago

@haojinIntel I forgot to mention that you must remove the sbt-spark-package plugin, I've edited the previous message 😄

pingsutw commented 3 years ago

@evanye Where could I download those jars, and where should I put them? Sorry, I'm an sbt beginner.

eavilaes commented 3 years ago

> @evanye Where could I download those jars, and where should I put them? Sorry, I'm an sbt beginner.

You must build the jars as explained in https://github.com/databricks/spark-sql-perf#build
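In case the build step itself is unclear, a sketch of the usual flow (the `+package` form and the output path are assumptions; the README's Build section is authoritative):

```shell
# From the repo root, after applying the dependency fixes above:
sbt +package
# The resulting jar typically lands under target/scala-2.xx/
# (the exact directory depends on the Scala version used).
```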

AlessandroPomponio commented 3 years ago

For anyone stumbling across this issue, it can be fixed by changing, in project/plugins.sbt, the line:

resolvers += "Spark Packages repo" at "https://dl.bintray.com/spark-packages/maven/"

to:

resolvers += "Spark Packages repo" at "https://repos.spark-packages.org/"

as already noted in PRs #204 and #206.