databricks / sbt-spark-package

Sbt plugin for Spark packages
Apache License 2.0

Unresolved Dependency: org.spark-packages#sbt-spark-package #15

Closed: FRosner closed this issue 8 years ago

FRosner commented 9 years ago

I added the plugin to project/plugin.sbt as described in the README:

addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.3")

However, sbt cannot resolve the dependency. Please find the error log below.

[info] Loading project definition from .../drunken-data-quality/project
[info] Updating {...}drunken-data-quality-build...
[info] Resolving org.spark-packages#sbt-spark-package;0.2.3 ...
[warn]  module not found: org.spark-packages#sbt-spark-package;0.2.3
[warn] ==== typesafe-ivy-releases: tried
[warn]   https://repo.typesafe.com/typesafe/ivy-releases/org.spark-packages/sbt-spark-package/scala_2.10/sbt_0.13/0.2.3/ivys/ivy.xml
[warn] ==== sbt-plugin-releases: tried
[warn]   https://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/org.spark-packages/sbt-spark-package/scala_2.10/sbt_0.13/0.2.3/ivys/ivy.xml
[warn] ==== local: tried
[warn]   /Users/frosner/.ivy2/local/org.spark-packages/sbt-spark-package/scala_2.10/sbt_0.13/0.2.3/ivys/ivy.xml
[warn] ==== jcenter: tried
[warn]   https://jcenter.bintray.com/org/spark-packages/sbt-spark-package_2.10_0.13/0.2.3/sbt-spark-package-0.2.3.pom
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/org/spark-packages/sbt-spark-package_2.10_0.13/0.2.3/sbt-spark-package-0.2.3.pom
[info] Resolving com.fasterxml.jackson.module#jackson-module-scala_2.10;2.5.1 ...
[info] Resolving org.scala-sbt.ivy#ivy;2.3.0-sbt-c5d1b95fdcc1e1007740ffbecf4eb ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.spark-packages#sbt-spark-package;0.2.3: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn]  Note: Some unresolved dependencies have extra attributes.  Check that these dependencies exist with the requested attributes.
[warn]      org.spark-packages:sbt-spark-package:0.2.3 (scalaVersion=2.10, sbtVersion=0.13)
[warn]
[warn]  Note: Unresolved dependencies path:
[warn]      org.spark-packages:sbt-spark-package:0.2.3 (scalaVersion=2.10, sbtVersion=0.13) (/Users/frosner/Documents/projects/drunken-data-quality/project/plugin.sbt#L1-2)
[warn]        +- default:drunken-data-quality-build:0.1-SNAPSHOT (scalaVersion=2.10, sbtVersion=0.13)
sbt.ResolveException: unresolved dependency: org.spark-packages#sbt-spark-package;0.2.3: not found
    at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:294)
    at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:191)
    at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:168)
    at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:155)
    at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:155)
    at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:132)
    at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:57)
    at sbt.IvySbt$$anon$4.call(Ivy.scala:65)
    at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
    at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
    at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
    at xsbt.boot.Using$.withResource(Using.scala:10)
    at xsbt.boot.Using$.apply(Using.scala:9)
    at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
    at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
    at xsbt.boot.Locks$.apply0(Locks.scala:31)
    at xsbt.boot.Locks$.apply(Locks.scala:28)
    at sbt.IvySbt.withDefaultLogger(Ivy.scala:65)
    at sbt.IvySbt.withIvy(Ivy.scala:127)
    at sbt.IvySbt.withIvy(Ivy.scala:124)
    at sbt.IvySbt$Module.withModule(Ivy.scala:155)
    at sbt.IvyActions$.updateEither(IvyActions.scala:168)
    at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1392)
    at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1388)
    at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$90.apply(Defaults.scala:1422)
    at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$90.apply(Defaults.scala:1420)
    at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:37)
    at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1425)
    at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1419)
    at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:60)
    at sbt.Classpaths$.cachedUpdate(Defaults.scala:1442)
    at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1371)
    at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1325)
    at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
    at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
    at sbt.std.Transform$$anon$4.work(System.scala:63)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
    at sbt.Execute.work(Execute.scala:235)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
[error] (*:update) sbt.ResolveException: unresolved dependency: org.spark-packages#sbt-spark-package;0.2.3: not found
FRosner commented 8 years ago

Can be resolved by adding

resolvers += "Spark Package Main Repo" at "https://dl.bintray.com/spark-packages/maven"

to plugin.sbt.
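For reference, a minimal sketch of the complete project/plugin.sbt after this fix; the resolver name is arbitrary, and 0.2.3 is the plugin version from the report above:

// project/plugin.sbt -- sketch: resolver for Spark Packages plus the plugin itself
resolvers += "Spark Package Main Repo" at "https://dl.bintray.com/spark-packages/maven"

addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.3")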

brkyvz commented 8 years ago

Handled by #12.

rpuch commented 3 years ago

According to https://spark.apache.org/news/new-repository-service.html, we should no longer use dl.bintray. I replaced the following

resolvers += "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven/"

with this

resolvers += "spark-packages" at "https://repos.spark-packages.org/"

and it seems to work now.
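Putting the pieces together, a sketch of the updated project/plugin.sbt (the plugin version is the one from the original report; newer releases may exist):

// project/plugin.sbt -- sketch using the new repository host
resolvers += "spark-packages" at "https://repos.spark-packages.org/"

addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.3")

One way to check that the new resolver is actually consulted (assuming a standard sbt launcher) is to re-run resolution for the plugin build and watch which repository URLs appear in the log:

sbt "reload plugins" update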