databricks / spark-deep-learning

Deep Learning Pipelines for Apache Spark
https://databricks.github.io/spark-deep-learning
Apache License 2.0

[NOT FOUND ] org.slf4j#slf4j-api;1.7.7!slf4j-api.jar #190

Closed: Liangmp closed this issue 5 years ago

Liangmp commented 5 years ago

I am trying to use the spark-deep-learning package in PySpark with the command line ./pyspark --master yarn --packages databricks:spark-deep-learning:0.1.0-spark2.1-s_2.11, but I get the following output:

Python 3.6.7 (default, Oct 22 2018, 11:32:17) 
[GCC 8.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
Ivy Default Cache set to: /home/hduser/.ivy2/cache
The jars for the packages stored in: /home/hduser/.ivy2/jars
:: loading settings :: url = jar:file:/opt/spark-2.4.0-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
databricks#spark-deep-learning added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-3f3bf06f-f9d5-4c06-a438-5366fbeae11c;1.0
    confs: [default]
    found databricks#spark-deep-learning;0.1.0-spark2.1-s_2.11 in spark-packages
    found databricks#tensorframes;0.2.8-s_2.11 in spark-packages
    found org.apache.commons#commons-proxy;1.0 in central
    found org.scalactic#scalactic_2.11;3.0.0 in central
    found org.scala-lang#scala-reflect;2.11.8 in local-m2-cache
    found org.apache.commons#commons-lang3;3.4 in central
    found com.typesafe.scala-logging#scala-logging-api_2.11;2.1.2 in central
    found com.typesafe.scala-logging#scala-logging-slf4j_2.11;2.1.2 in central
    found org.slf4j#slf4j-api;1.7.7 in local-m2-cache
    found org.tensorflow#tensorflow;1.1.0-rc1 in central
    found org.tensorflow#libtensorflow;1.1.0-rc1 in central
    found org.tensorflow#libtensorflow_jni;1.1.0-rc1 in central
:: resolution report :: resolve 370ms :: artifacts dl 9ms
    :: modules in use:
    com.typesafe.scala-logging#scala-logging-api_2.11;2.1.2 from central in [default]
    com.typesafe.scala-logging#scala-logging-slf4j_2.11;2.1.2 from central in [default]
    databricks#spark-deep-learning;0.1.0-spark2.1-s_2.11 from spark-packages in [default]
    databricks#tensorframes;0.2.8-s_2.11 from spark-packages in [default]
    org.apache.commons#commons-lang3;3.4 from central in [default]
    org.apache.commons#commons-proxy;1.0 from central in [default]
    org.scala-lang#scala-reflect;2.11.8 from local-m2-cache in [default]
    org.scalactic#scalactic_2.11;3.0.0 from central in [default]
    org.slf4j#slf4j-api;1.7.7 from local-m2-cache in [default]
    org.tensorflow#libtensorflow;1.1.0-rc1 from central in [default]
    org.tensorflow#libtensorflow_jni;1.1.0-rc1 from central in [default]
    org.tensorflow#tensorflow;1.1.0-rc1 from central in [default]
    :: evicted modules:
    org.scala-lang#scala-reflect;2.11.0 by [org.scala-lang#scala-reflect;2.11.8] in [default]
    ---------------------------------------------------------------------
    |                  |            modules            ||   artifacts   |
    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
    ---------------------------------------------------------------------
    |      default     |   13  |   0   |   0   |   1   ||   12  |   0   |
    ---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
        [NOT FOUND  ] org.slf4j#slf4j-api;1.7.7!slf4j-api.jar (0ms)

    ==== local-m2-cache: tried

      file:/home/hduser/.m2/repository/org/slf4j/slf4j-api/1.7.7/slf4j-api-1.7.7.jar

        ::::::::::::::::::::::::::::::::::::::::::::::

        ::              FAILED DOWNLOADS            ::

        :: ^ see resolution messages for details  ^ ::

        ::::::::::::::::::::::::::::::::::::::::::::::

        :: org.slf4j#slf4j-api;1.7.7!slf4j-api.jar

        ::::::::::::::::::::::::::::::::::::::::::::::

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [download failed: org.slf4j#slf4j-api;1.7.7!slf4j-api.jar]
    at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1306)
    at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:54)
    at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:315)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:143)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Traceback (most recent call last):
  File "/opt/spark-2.4.0-bin-hadoop2.7/python/pyspark/shell.py", line 38, in <module>
    SparkContext._ensure_initialized()
  File "/opt/spark-2.4.0-bin-hadoop2.7/python/pyspark/context.py", line 298, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "/opt/spark-2.4.0-bin-hadoop2.7/python/pyspark/java_gateway.py", line 94, in launch_gateway
    raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number

Can somebody tell me how to fix it? Thanks so much.
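For context, the [NOT FOUND] warning above appears to mean that Ivy resolved the metadata for org.slf4j#slf4j-api;1.7.7 from the local Maven cache (note the "found org.slf4j#slf4j-api;1.7.7 in local-m2-cache" line), but the jar file itself is missing from that directory, so resolution fails rather than falling back to a remote repository. A quick way to check for such a stale cache entry (a sketch only; the path assumes the default ~/.m2 layout shown in the log):

    # List the cached artifact directory; a pom file without a matching
    # jar indicates a stale or partially downloaded cache entry
    ls -l ~/.m2/repository/org/slf4j/slf4j-api/1.7.7/

    # Removing the directory forces Ivy to re-resolve the artifact from
    # a remote repository on the next pyspark/spark-submit run
    rm -rf ~/.m2/repository/org/slf4j/slf4j-api/1.7.7/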

Liangmp commented 5 years ago

I found a solution provided by DerekHanqingWang: remove the directories under ~/.ivy2/cache, ~/.ivy2/jars, and ~/.m2/repository/, then run ./pyspark --master yarn --packages databricks:spark-deep-learning:1.5.0-spark2.4-s_2.11 with the newest spark-deep-learning package. It works for me.
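For anyone hitting the same error, the workaround above boils down to clearing the stale local caches and pulling a newer package build. A sketch of the same steps, using the default cache paths from the log:

    # Clear the local Ivy and Maven caches holding the broken slf4j-api entry
    rm -rf ~/.ivy2/cache ~/.ivy2/jars ~/.m2/repository

    # Relaunch PySpark with the newer spark-deep-learning release
    ./pyspark --master yarn --packages databricks:spark-deep-learning:1.5.0-spark2.4-s_2.11

Removing all of ~/.m2/repository is the blunt option; deleting just the slf4j-api 1.7.7 directory, as sketched earlier, is usually enough.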