microsoft / SynapseML

Simple and Distributed Machine Learning
http://aka.ms/spark
MIT License

[BUG] Failed to install SynapseML on existing cluster (Windows OS), package commons-codec not found #1671

Open dylanw-oss opened 1 year ago

dylanw-oss commented 1 year ago

SynapseML version

0.10.1

System information

Scala 2.12.15, Spark 3.2.2, local spark-shell on Windows 11 (the issue does not occur on Ubuntu Linux)

Describe the problem

spark-shell --packages com.microsoft.azure:synapseml_2.12:0.10.1

got error: [NOT FOUND ] commons-codec#commons-codec;1.10!commons-codec.jar (0ms)

Code to reproduce issue

Execute: spark-shell --packages com.microsoft.azure:synapseml_2.12:0.10.1

Other info / logs

:: problems summary ::
:::: WARNINGS
        [NOT FOUND  ] commons-codec#commons-codec;1.10!commons-codec.jar (0ms)

    ==== local-m2-cache: tried

      file:/C:/Users/<name>/.m2/repository/commons-codec/commons-codec/1.10/commons-codec-1.10.jar

            ::::::::::::::::::::::::::::::::::::::::::::::

            ::              FAILED DOWNLOADS            ::

            :: ^ see resolution messages for details  ^ ::

            ::::::::::::::::::::::::::::::::::::::::::::::

            :: commons-codec#commons-codec;1.10!commons-codec.jar

            ::::::::::::::::::::::::::::::::::::::::::::::

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [download failed: commons-codec#commons-codec;1.10!commons-codec.jar]
    at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1447)
    at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185)
    at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:308)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:898)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
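The failure comes from Ivy's local-m2-cache resolver: the fuller logs below show commons-codec 1.10 being "found ... in local-m2-cache", but the jar is not at the path the resolver tries. A minimal Python sketch to check what is actually present at that path (the default ~/.m2 location is an assumption based on the log above):

```python
# Hedged diagnostic sketch: inspect the local Maven repository entry that the
# local-m2-cache resolver is serving commons-codec 1.10 from.
from pathlib import Path

m2_entry = Path.home() / ".m2" / "repository" / "commons-codec" / "commons-codec" / "1.10"
if not m2_entry.exists():
    print("no local commons-codec 1.10 entry; Ivy should resolve it from Maven Central")
else:
    for f in sorted(m2_entry.iterdir()):
        print(f.name)
    # If only a .pom is listed (no .jar), Ivy marks the module as found locally
    # and then fails with the [NOT FOUND] / download failed error shown above.
```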


github-actions[bot] commented 1 year ago

Hey @dylanw-oss :wave:! Thank you so much for reporting the issue/feature request :rotating_light:. Someone from SynapseML Team will be looking to triage this issue soon. We appreciate your patience.

dylanw-oss commented 1 year ago

There is a similar open issue, #771, "Error trying to install mmlspark", but no solution mentioned there works for me.

I used Eclipse Temurin JDK 8 (Windows x64): https://github.com/adoptium/temurin8-binaries/releases/tag/jdk8u345-b01

%JAVA_HOME% = 'c:\jdk8'
%SPARK_HOME% = 'C:\spark\spark-3.2.2-bin-hadoop3.2'
%HADOOP_HOME% = 'C:\spark\spark-3.2.2-bin-hadoop3.2'
%PYSPARK_DRIVER_PYTHON% = jupyter
%PYSPARK_DRIVER_PYTHON_OPTS% = notebook

eerga commented 1 year ago

I have the same bug with SynapseML 0.10.1.

System information:

Scala 2.12.17, Spark 3.2.0, local JupyterLab on macOS Monterey

Path Configurations

JAVA_HOME = '/Library/Java/JavaVirtualMachines/jdk1.8.0_341.jdk/Contents/Home'
SPARK_HOME = '/usr/local/Cellar/apache-spark/3.2.0'
PYSPARK_DRIVER_PYTHON = jupyter
PYSPARK_DRIVER_PYTHON_OPTS = notebook
PYTHONPATH=$SPARK_HOME/python:$PYTHONPATH
PYSPARK_PYTHON=python3
PYSPARK_SUBMIT_ARGS="--master local[3] pyspark-shell"
M2_HOME="/Users/username/Desktop/Maven/apache-maven-3.8.6"

Describe the problem

spark = pyspark.sql.SparkSession.builder.appName("MyApp").config("spark.jars.packages", "com.microsoft.azure:synapseml_2.12:0.10.1").config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven").getOrCreate()

got error: [NOT FOUND ] commons-codec#commons-codec;1.10!commons-codec.jar (4ms)

Code to reproduce issue

Execute:

spark = pyspark.sql.SparkSession.builder.appName("MyApp").config("spark.jars.packages", "com.microsoft.azure:synapseml_2.12:0.10.1").config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven").getOrCreate()

Other info / logs

https://mmlspark.azureedge.net/maven added as a remote repository with the name: repo-1
Ivy Default Cache set to: /Users/username/.ivy2/cache
The jars for the packages stored in: /Users/username/.ivy2/jars
com.microsoft.azure#synapseml_2.12 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-19351379-b0e5-483c-8245-9eda008e21fb;1.0
    confs: [default]
:: loading settings :: url = jar:file:/usr/local/Cellar/apache-spark/3.2.0/jars/ivy-2.5.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
    found com.microsoft.azure#synapseml_2.12;0.10.1 in central
    found com.microsoft.azure#synapseml-core_2.12;0.10.1 in central
    found org.scalactic#scalactic_2.12;3.0.5 in central
    found org.scala-lang#scala-reflect;2.12.4 in central
    found io.spray#spray-json_2.12;1.3.5 in central
    found com.jcraft#jsch;0.1.54 in central
    found org.apache.httpcomponents#httpclient;4.5.6 in central
    found org.apache.httpcomponents#httpcore;4.4.10 in central
    found commons-logging#commons-logging;1.2 in central
    found commons-codec#commons-codec;1.10 in local-m2-cache
    found org.apache.httpcomponents#httpmime;4.5.6 in central
    found com.linkedin.isolation-forest#isolation-forest_3.2.0_2.12;2.0.8 in central
    found com.chuusai#shapeless_2.12;2.3.2 in central
    found org.typelevel#macro-compat_2.12;1.1.1 in central
    found org.apache.spark#spark-avro_2.12;3.2.0 in central
    found org.tukaani#xz;1.8 in central
    found org.spark-project.spark#unused;1.0.0 in local-m2-cache
    found org.testng#testng;6.8.8 in central
    found org.beanshell#bsh;2.0b4 in central
    found com.beust#jcommander;1.27 in central
    found com.microsoft.azure#synapseml-deep-learning_2.12;0.10.1 in central
    found com.microsoft.azure#synapseml-opencv_2.12;0.10.1 in central
    found org.openpnp#opencv;3.2.0-1 in central
    found com.microsoft.cntk#cntk;2.4 in central
    found com.microsoft.onnxruntime#onnxruntime_gpu;1.8.1 in central
    found com.microsoft.azure#synapseml-cognitive_2.12;0.10.1 in central
    found com.microsoft.cognitiveservices.speech#client-jar-sdk;1.14.0 in central
    found com.azure#azure-storage-blob;12.14.4 in central
    found com.azure#azure-core;1.25.0 in central
    found com.fasterxml.jackson.core#jackson-annotations;2.13.1 in central
    found com.fasterxml.jackson.core#jackson-core;2.13.1 in central
    found com.fasterxml.jackson.core#jackson-databind;2.13.1 in central
    found com.fasterxml.jackson.datatype#jackson-datatype-jsr310;2.13.1 in central
    found com.fasterxml.jackson.dataformat#jackson-dataformat-xml;2.13.1 in central
    found org.codehaus.woodstox#stax2-api;4.2.1 in central
    found com.fasterxml.woodstox#woodstox-core;6.2.7 in central
    found org.slf4j#slf4j-api;1.7.32 in central
    found io.projectreactor#reactor-core;3.4.13 in central
    found org.reactivestreams#reactive-streams;1.0.3 in central
    found io.netty#netty-tcnative-boringssl-static;2.0.46.Final in central
    found io.netty#netty-tcnative-classes;2.0.46.Final in central
    found com.azure#azure-core-http-netty;1.11.7 in central
    found io.netty#netty-handler;4.1.72.Final in central
    found io.netty#netty-common;4.1.72.Final in central
    found io.netty#netty-resolver;4.1.72.Final in central
    found io.netty#netty-buffer;4.1.72.Final in central
    found io.netty#netty-transport;4.1.72.Final in central
    found io.netty#netty-codec;4.1.72.Final in central
    found io.netty#netty-handler-proxy;4.1.72.Final in central
    found io.netty#netty-codec-socks;4.1.72.Final in central
    found io.netty#netty-codec-http;4.1.72.Final in central
    found io.netty#netty-codec-http2;4.1.72.Final in central
    found io.netty#netty-transport-native-unix-common;4.1.72.Final in central
    found io.netty#netty-transport-native-epoll;4.1.72.Final in central
    found io.netty#netty-transport-classes-epoll;4.1.72.Final in central
    found io.netty#netty-transport-native-kqueue;4.1.72.Final in central
    found io.netty#netty-transport-classes-kqueue;4.1.72.Final in central
    found io.projectreactor.netty#reactor-netty-http;1.0.14 in central
    found io.netty#netty-resolver-dns;4.1.72.Final in central
    found io.netty#netty-codec-dns;4.1.72.Final in central
    found io.netty#netty-resolver-dns-native-macos;4.1.72.Final in central
    found io.netty#netty-resolver-dns-classes-macos;4.1.72.Final in central
    found io.projectreactor.netty#reactor-netty-core;1.0.14 in central
    found com.azure#azure-storage-common;12.14.3 in central
    found com.azure#azure-storage-internal-avro;12.1.4 in central
    found com.azure#azure-ai-textanalytics;5.1.6 in central
    found com.microsoft.azure#synapseml-vw_2.12;0.10.1 in central
    found com.github.vowpalwabbit#vw-jni;8.9.1 in central
    found com.microsoft.azure#synapseml-lightgbm_2.12;0.10.1 in central
    found com.microsoft.ml.lightgbm#lightgbmlib;3.2.110 in central
:: resolution report :: resolve 1080ms :: artifacts dl 59ms
    :: modules in use:
    com.azure#azure-ai-textanalytics;5.1.6 from central in [default]
    com.azure#azure-core;1.25.0 from central in [default]
    com.azure#azure-core-http-netty;1.11.7 from central in [default]
    com.azure#azure-storage-blob;12.14.4 from central in [default]
    com.azure#azure-storage-common;12.14.3 from central in [default]
    com.azure#azure-storage-internal-avro;12.1.4 from central in [default]
    com.beust#jcommander;1.27 from central in [default]
    com.chuusai#shapeless_2.12;2.3.2 from central in [default]
    com.fasterxml.jackson.core#jackson-annotations;2.13.1 from central in [default]
    com.fasterxml.jackson.core#jackson-core;2.13.1 from central in [default]
    com.fasterxml.jackson.core#jackson-databind;2.13.1 from central in [default]
    com.fasterxml.jackson.dataformat#jackson-dataformat-xml;2.13.1 from central in [default]
    com.fasterxml.jackson.datatype#jackson-datatype-jsr310;2.13.1 from central in [default]
    com.fasterxml.woodstox#woodstox-core;6.2.7 from central in [default]
    com.github.vowpalwabbit#vw-jni;8.9.1 from central in [default]
    com.jcraft#jsch;0.1.54 from central in [default]
    com.linkedin.isolation-forest#isolation-forest_3.2.0_2.12;2.0.8 from central in [default]
    com.microsoft.azure#synapseml-cognitive_2.12;0.10.1 from central in [default]
    com.microsoft.azure#synapseml-core_2.12;0.10.1 from central in [default]
    com.microsoft.azure#synapseml-deep-learning_2.12;0.10.1 from central in [default]
    com.microsoft.azure#synapseml-lightgbm_2.12;0.10.1 from central in [default]
    com.microsoft.azure#synapseml-opencv_2.12;0.10.1 from central in [default]
    com.microsoft.azure#synapseml-vw_2.12;0.10.1 from central in [default]
    com.microsoft.azure#synapseml_2.12;0.10.1 from central in [default]
    com.microsoft.cntk#cntk;2.4 from central in [default]
    com.microsoft.cognitiveservices.speech#client-jar-sdk;1.14.0 from central in [default]
    com.microsoft.ml.lightgbm#lightgbmlib;3.2.110 from central in [default]
    com.microsoft.onnxruntime#onnxruntime_gpu;1.8.1 from central in [default]
    commons-codec#commons-codec;1.10 from local-m2-cache in [default]
    commons-logging#commons-logging;1.2 from central in [default]
    io.netty#netty-buffer;4.1.72.Final from central in [default]
    io.netty#netty-codec;4.1.72.Final from central in [default]
    io.netty#netty-codec-dns;4.1.72.Final from central in [default]
    io.netty#netty-codec-http;4.1.72.Final from central in [default]
    io.netty#netty-codec-http2;4.1.72.Final from central in [default]
    io.netty#netty-codec-socks;4.1.72.Final from central in [default]
    io.netty#netty-common;4.1.72.Final from central in [default]
    io.netty#netty-handler;4.1.72.Final from central in [default]
    io.netty#netty-handler-proxy;4.1.72.Final from central in [default]
    io.netty#netty-resolver;4.1.72.Final from central in [default]
    io.netty#netty-resolver-dns;4.1.72.Final from central in [default]
    io.netty#netty-resolver-dns-classes-macos;4.1.72.Final from central in [default]
    io.netty#netty-resolver-dns-native-macos;4.1.72.Final from central in [default]
    io.netty#netty-tcnative-boringssl-static;2.0.46.Final from central in [default]
    io.netty#netty-tcnative-classes;2.0.46.Final from central in [default]
    io.netty#netty-transport;4.1.72.Final from central in [default]
    io.netty#netty-transport-classes-epoll;4.1.72.Final from central in [default]
    io.netty#netty-transport-classes-kqueue;4.1.72.Final from central in [default]
    io.netty#netty-transport-native-epoll;4.1.72.Final from central in [default]
    io.netty#netty-transport-native-kqueue;4.1.72.Final from central in [default]
    io.netty#netty-transport-native-unix-common;4.1.72.Final from central in [default]
    io.projectreactor#reactor-core;3.4.13 from central in [default]
    io.projectreactor.netty#reactor-netty-core;1.0.14 from central in [default]
    io.projectreactor.netty#reactor-netty-http;1.0.14 from central in [default]
    io.spray#spray-json_2.12;1.3.5 from central in [default]
    org.apache.httpcomponents#httpclient;4.5.6 from central in [default]
    org.apache.httpcomponents#httpcore;4.4.10 from central in [default]
    org.apache.httpcomponents#httpmime;4.5.6 from central in [default]
    org.apache.spark#spark-avro_2.12;3.2.0 from central in [default]
    org.beanshell#bsh;2.0b4 from central in [default]
    org.codehaus.woodstox#stax2-api;4.2.1 from central in [default]
    org.openpnp#opencv;3.2.0-1 from central in [default]
    org.reactivestreams#reactive-streams;1.0.3 from central in [default]
    org.scala-lang#scala-reflect;2.12.4 from central in [default]
    org.scalactic#scalactic_2.12;3.0.5 from central in [default]
    org.slf4j#slf4j-api;1.7.32 from central in [default]
    org.spark-project.spark#unused;1.0.0 from local-m2-cache in [default]
    org.testng#testng;6.8.8 from central in [default]
    org.tukaani#xz;1.8 from central in [default]
    org.typelevel#macro-compat_2.12;1.1.1 from central in [default]
    ---------------------------------------------------------------------
    |                  |            modules            ||   artifacts   |
    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
    ---------------------------------------------------------------------
    |      default     |   70  |   0   |   0   |   0   ||   70  |   0   |
    ---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
        [NOT FOUND  ] commons-codec#commons-codec;1.10!commons-codec.jar (4ms)

    ==== local-m2-cache: tried

      file:/Users/username/.m2/repository/commons-codec/commons-codec/1.10/commons-codec-1.10.jar

        ::::::::::::::::::::::::::::::::::::::::::::::

        ::              FAILED DOWNLOADS            ::

        :: ^ see resolution messages for details  ^ ::

        ::::::::::::::::::::::::::::::::::::::::::::::

        :: commons-codec#commons-codec;1.10!commons-codec.jar

        ::::::::::::::::::::::::::::::::::::::::::::::

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [download failed: commons-codec#commons-codec;1.10!commons-codec.jar]
    at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1447)
    at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185)
    at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:308)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:898)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
Input In [30], in <cell line: 3>()
      1 # Please use 0.10.1 version for Spark3.2 and 0.9.5-13-d1b51517-SNAPSHOT version for Spark3.1
----> 3 spark = pyspark.sql.SparkSession.builder.appName("MyApp").config("spark.jars.packages", "com.microsoft.azure:synapseml_2.12:0.10.1").config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven").getOrCreate()

File /usr/local/Cellar/apache-spark/3.2.0/python/pyspark/sql/session.py:228, in SparkSession.Builder.getOrCreate(self)
    226         sparkConf.set(key, value)
    227     # This SparkContext may be an existing one.
--> 228     sc = SparkContext.getOrCreate(sparkConf)
    229 # Do not update `SparkConf` for existing `SparkContext`, as it's shared
    230 # by all sessions.
    231 session = SparkSession(sc)

File /usr/local/Cellar/apache-spark/3.2.0/python/pyspark/context.py:392, in SparkContext.getOrCreate(cls, conf)
    390 with SparkContext._lock:
    391     if SparkContext._active_spark_context is None:
--> 392         SparkContext(conf=conf or SparkConf())
    393     return SparkContext._active_spark_context

File /usr/local/Cellar/apache-spark/3.2.0/python/pyspark/context.py:144, in SparkContext.__init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
    139 if gateway is not None and gateway.gateway_parameters.auth_token is None:
    140     raise ValueError(
    141         "You are trying to pass an insecure Py4j gateway to Spark. This"
    142         " is not allowed as it is a security risk.")
--> 144 SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
    145 try:
    146     self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
    147                   conf, jsc, profiler_cls)

File /usr/local/Cellar/apache-spark/3.2.0/python/pyspark/context.py:339, in SparkContext._ensure_initialized(cls, instance, gateway, conf)
    337 with SparkContext._lock:
    338     if not SparkContext._gateway:
--> 339         SparkContext._gateway = gateway or launch_gateway(conf)
    340         SparkContext._jvm = SparkContext._gateway.jvm
    342     if instance:

File /usr/local/Cellar/apache-spark/3.2.0/python/pyspark/java_gateway.py:108, in launch_gateway(conf, popen_kwargs)
    105     time.sleep(0.1)
    107 if not os.path.isfile(conn_info_file):
--> 108     raise RuntimeError("Java gateway process exited before sending its port number")
    110 with open(conn_info_file, "rb") as info:
    111     gateway_port = read_int(info)

RuntimeError: Java gateway process exited before sending its port number

eerga commented 1 year ago

I tried to run the same code with Spark 3.1.1, but I keep hitting jar issues.

System information:

Scala 2.12.17, Spark 3.1.1, local JupyterLab on macOS Monterey

Path Configurations

JAVA_HOME = '/Library/Java/JavaVirtualMachines/jdk1.8.0_341.jdk/Contents/Home'
SPARK_HOME = '/usr/local/Cellar/apache-spark/3.1.1'
PYSPARK_DRIVER_PYTHON = jupyter
PYSPARK_DRIVER_PYTHON_OPTS = notebook
PYTHONPATH=$SPARK_HOME/python:$PYTHONPATH
PYSPARK_PYTHON=python3
PYSPARK_SUBMIT_ARGS="--master local[3] pyspark-shell"
M2_HOME="/Users/username/Desktop/Maven/apache-maven-3.8.6"

Describe the problem

spark = pyspark.sql.SparkSession.builder.appName("MyApp").config("spark.jars.packages", "com.microsoft.azure:synapseml_2.12:0.9.5-13-d1b51517-SNAPSHOT").config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven").getOrCreate()

got error: [NOT FOUND ] commons-codec#commons-codec;1.10!commons-codec.jar (0ms)

Code to reproduce issue

Execute:

spark = pyspark.sql.SparkSession.builder.appName("MyApp").config("spark.jars.packages", "com.microsoft.azure:synapseml_2.12:0.9.5-13-d1b51517-SNAPSHOT").config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven").getOrCreate()

Other info / logs

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/usr/local/Cellar/apache-spark/3.1.1/jars/spark-unsafe_2.12-3.1.1.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
https://mmlspark.azureedge.net/maven added as a remote repository with the name: repo-1
:: loading settings :: url = jar:file:/usr/local/Cellar/apache-spark/3.1.1/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
Ivy Default Cache set to: /Users/username/.ivy2/cache
The jars for the packages stored in: /Users/username/.ivy2/jars
com.microsoft.azure#synapseml_2.12 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-276089cc-8d90-4cd4-8f87-f30cc41c9da7;1.0
    confs: [default]
    found com.microsoft.azure#synapseml_2.12;0.9.5-13-d1b51517-SNAPSHOT in repo-1
    found com.microsoft.azure#synapseml-core_2.12;0.9.5-13-d1b51517-SNAPSHOT in repo-1
    found com.fasterxml.jackson.module#jackson-module-scala_2.12;2.12.5 in central
    found com.fasterxml.jackson.core#jackson-core;2.12.5 in central
    found com.fasterxml.jackson.core#jackson-annotations;2.12.5 in central
    found com.fasterxml.jackson.core#jackson-databind;2.12.5 in central
    found com.thoughtworks.paranamer#paranamer;2.8 in local-m2-cache
    found org.scalactic#scalactic_2.12;3.0.5 in central
    found org.scala-lang#scala-reflect;2.12.4 in central
    found io.spray#spray-json_2.12;1.3.2 in central
    found com.jcraft#jsch;0.1.54 in central
    found org.apache.httpcomponents#httpclient;4.5.6 in central
    found org.apache.httpcomponents#httpcore;4.4.10 in central
    found commons-logging#commons-logging;1.2 in central
    found commons-codec#commons-codec;1.10 in local-m2-cache
    found org.apache.httpcomponents#httpmime;4.5.6 in central
    found com.linkedin.isolation-forest#isolation-forest_3.0.0_2.12;1.0.1 in central
    found com.chuusai#shapeless_2.12;2.3.2 in central
    found org.typelevel#macro-compat_2.12;1.1.1 in central
    found org.apache.spark#spark-avro_2.12;3.0.0 in central
    found org.spark-project.spark#unused;1.0.0 in local-m2-cache
    found org.testng#testng;6.8.8 in central
    found org.beanshell#bsh;2.0b4 in central
    found com.beust#jcommander;1.27 in central
    found com.microsoft.azure#synapseml-deep-learning_2.12;0.9.5-13-d1b51517-SNAPSHOT in repo-1
    found com.microsoft.azure#synapseml-opencv_2.12;0.9.5-13-d1b51517-SNAPSHOT in repo-1
    found org.openpnp#opencv;3.2.0-1 in central
    found com.microsoft.cntk#cntk;2.4 in central
    found com.microsoft.onnxruntime#onnxruntime_gpu;1.8.1 in central
    found com.microsoft.azure#synapseml-cognitive_2.12;0.9.5-13-d1b51517-SNAPSHOT in repo-1
    found com.microsoft.cognitiveservices.speech#client-jar-sdk;1.14.0 in central
    found com.azure#azure-storage-blob;12.14.2 in central
    found com.azure#azure-core;1.22.0 in central
    found com.fasterxml.jackson.datatype#jackson-datatype-jsr310;2.12.5 in central
    found com.fasterxml.jackson.dataformat#jackson-dataformat-xml;2.12.5 in central
    found com.fasterxml.jackson.module#jackson-module-jaxb-annotations;2.12.5 in central
    found jakarta.xml.bind#jakarta.xml.bind-api;2.3.2 in central
    found jakarta.activation#jakarta.activation-api;1.2.1 in central
    found org.codehaus.woodstox#stax2-api;4.2.1 in central
    found com.fasterxml.woodstox#woodstox-core;6.2.4 in central
    found org.slf4j#slf4j-api;1.7.32 in central
    found io.projectreactor#reactor-core;3.4.10 in central
    found org.reactivestreams#reactive-streams;1.0.3 in central
    found com.azure#azure-core-http-netty;1.11.2 in central
    found io.netty#netty-handler;4.1.68.Final in central
    found io.netty#netty-common;4.1.68.Final in central
    found io.netty#netty-resolver;4.1.68.Final in central
    found io.netty#netty-buffer;4.1.68.Final in central
    found io.netty#netty-transport;4.1.68.Final in central
    found io.netty#netty-codec;4.1.68.Final in central
    found io.netty#netty-handler-proxy;4.1.68.Final in central
    found io.netty#netty-codec-socks;4.1.68.Final in central
    found io.netty#netty-codec-http;4.1.68.Final in central
    found io.netty#netty-codec-http2;4.1.68.Final in central
    found io.netty#netty-transport-native-unix-common;4.1.68.Final in central
    found io.netty#netty-transport-native-epoll;4.1.68.Final in central
    found io.netty#netty-transport-native-kqueue;4.1.68.Final in central
    found io.projectreactor.netty#reactor-netty-http;1.0.11 in central
    found io.netty#netty-resolver-dns;4.1.68.Final in central
    found io.netty#netty-codec-dns;4.1.68.Final in central
    found io.netty#netty-resolver-dns-native-macos;4.1.68.Final in central
    found io.projectreactor.netty#reactor-netty-core;1.0.11 in central
    found com.azure#azure-storage-common;12.14.1 in central
    found com.azure#azure-storage-internal-avro;12.1.2 in central
    found com.azure#azure-ai-textanalytics;5.1.4 in central
    found com.microsoft.azure#synapseml-vw_2.12;0.9.5-13-d1b51517-SNAPSHOT in repo-1
    found com.github.vowpalwabbit#vw-jni;8.9.1 in central
    found com.microsoft.azure#synapseml-lightgbm_2.12;0.9.5-13-d1b51517-SNAPSHOT in repo-1
    found com.microsoft.ml.lightgbm#lightgbmlib;3.2.110 in central
downloading https://mmlspark.azureedge.net/maven/com/microsoft/azure/synapseml_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml_2.12-0.9.5-13-d1b51517-SNAPSHOT.jar ...
    [SUCCESSFUL ] com.microsoft.azure#synapseml_2.12;0.9.5-13-d1b51517-SNAPSHOT!synapseml_2.12.jar (35ms)
downloading https://mmlspark.azureedge.net/maven/com/microsoft/azure/synapseml-core_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-core_2.12-0.9.5-13-d1b51517-SNAPSHOT.jar ...
    [SUCCESSFUL ] com.microsoft.azure#synapseml-core_2.12;0.9.5-13-d1b51517-SNAPSHOT!synapseml-core_2.12.jar (236ms)
downloading https://mmlspark.azureedge.net/maven/com/microsoft/azure/synapseml-deep-learning_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-deep-learning_2.12-0.9.5-13-d1b51517-SNAPSHOT.jar ...
    [SUCCESSFUL ] com.microsoft.azure#synapseml-deep-learning_2.12;0.9.5-13-d1b51517-SNAPSHOT!synapseml-deep-learning_2.12.jar (43ms)
downloading https://mmlspark.azureedge.net/maven/com/microsoft/azure/synapseml-cognitive_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-cognitive_2.12-0.9.5-13-d1b51517-SNAPSHOT.jar ...
    [SUCCESSFUL ] com.microsoft.azure#synapseml-cognitive_2.12;0.9.5-13-d1b51517-SNAPSHOT!synapseml-cognitive_2.12.jar (123ms)
downloading https://mmlspark.azureedge.net/maven/com/microsoft/azure/synapseml-vw_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-vw_2.12-0.9.5-13-d1b51517-SNAPSHOT.jar ...
    [SUCCESSFUL ] com.microsoft.azure#synapseml-vw_2.12;0.9.5-13-d1b51517-SNAPSHOT!synapseml-vw_2.12.jar (45ms)
downloading https://mmlspark.azureedge.net/maven/com/microsoft/azure/synapseml-lightgbm_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-lightgbm_2.12-0.9.5-13-d1b51517-SNAPSHOT.jar ...
    [SUCCESSFUL ] com.microsoft.azure#synapseml-lightgbm_2.12;0.9.5-13-d1b51517-SNAPSHOT!synapseml-lightgbm_2.12.jar (46ms)
downloading https://mmlspark.azureedge.net/maven/com/microsoft/azure/synapseml-opencv_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-opencv_2.12-0.9.5-13-d1b51517-SNAPSHOT.jar ...
    [SUCCESSFUL ] com.microsoft.azure#synapseml-opencv_2.12;0.9.5-13-d1b51517-SNAPSHOT!synapseml-opencv_2.12.jar (40ms)
downloading https://repo1.maven.org/maven2/com/fasterxml/jackson/module/jackson-module-scala_2.12/2.12.5/jackson-module-scala_2.12-2.12.5.jar ...
    [SUCCESSFUL ] com.fasterxml.jackson.module#jackson-module-scala_2.12;2.12.5!jackson-module-scala_2.12.jar(bundle) (70ms)
downloading https://repo1.maven.org/maven2/com/linkedin/isolation-forest/isolation-forest_3.0.0_2.12/1.0.1/isolation-forest_3.0.0_2.12-1.0.1.jar ...
    [SUCCESSFUL ] com.linkedin.isolation-forest#isolation-forest_3.0.0_2.12;1.0.1!isolation-forest_3.0.0_2.12.jar (34ms)
downloading https://repo1.maven.org/maven2/org/apache/spark/spark-avro_2.12/3.0.0/spark-avro_2.12-3.0.0.jar ...
    [SUCCESSFUL ] org.apache.spark#spark-avro_2.12;3.0.0!spark-avro_2.12.jar (34ms)
:: resolution report :: resolve 11160ms :: artifacts dl 740ms
    :: modules in use:
    com.azure#azure-ai-textanalytics;5.1.4 from central in [default]
    com.azure#azure-core;1.22.0 from central in [default]
    com.azure#azure-core-http-netty;1.11.2 from central in [default]
    com.azure#azure-storage-blob;12.14.2 from central in [default]
    com.azure#azure-storage-common;12.14.1 from central in [default]
    com.azure#azure-storage-internal-avro;12.1.2 from central in [default]
    com.beust#jcommander;1.27 from central in [default]
    com.chuusai#shapeless_2.12;2.3.2 from central in [default]
    com.fasterxml.jackson.core#jackson-annotations;2.12.5 from central in [default]
    com.fasterxml.jackson.core#jackson-core;2.12.5 from central in [default]
    com.fasterxml.jackson.core#jackson-databind;2.12.5 from central in [default]
    com.fasterxml.jackson.dataformat#jackson-dataformat-xml;2.12.5 from central in [default]
    com.fasterxml.jackson.datatype#jackson-datatype-jsr310;2.12.5 from central in [default]
    com.fasterxml.jackson.module#jackson-module-jaxb-annotations;2.12.5 from central in [default]
    com.fasterxml.jackson.module#jackson-module-scala_2.12;2.12.5 from central in [default]
    com.fasterxml.woodstox#woodstox-core;6.2.4 from central in [default]
    com.github.vowpalwabbit#vw-jni;8.9.1 from central in [default]
    com.jcraft#jsch;0.1.54 from central in [default]
    com.linkedin.isolation-forest#isolation-forest_3.0.0_2.12;1.0.1 from central in [default]
    com.microsoft.azure#synapseml-cognitive_2.12;0.9.5-13-d1b51517-SNAPSHOT from repo-1 in [default]
    com.microsoft.azure#synapseml-core_2.12;0.9.5-13-d1b51517-SNAPSHOT from repo-1 in [default]
    com.microsoft.azure#synapseml-deep-learning_2.12;0.9.5-13-d1b51517-SNAPSHOT from repo-1 in [default]
    com.microsoft.azure#synapseml-lightgbm_2.12;0.9.5-13-d1b51517-SNAPSHOT from repo-1 in [default]
    com.microsoft.azure#synapseml-opencv_2.12;0.9.5-13-d1b51517-SNAPSHOT from repo-1 in [default]
    com.microsoft.azure#synapseml-vw_2.12;0.9.5-13-d1b51517-SNAPSHOT from repo-1 in [default]
    com.microsoft.azure#synapseml_2.12;0.9.5-13-d1b51517-SNAPSHOT from repo-1 in [default]
    com.microsoft.cntk#cntk;2.4 from central in [default]
    com.microsoft.cognitiveservices.speech#client-jar-sdk;1.14.0 from central in [default]
    com.microsoft.ml.lightgbm#lightgbmlib;3.2.110 from central in [default]
    com.microsoft.onnxruntime#onnxruntime_gpu;1.8.1 from central in [default]
    com.thoughtworks.paranamer#paranamer;2.8 from local-m2-cache in [default]
    commons-codec#commons-codec;1.10 from local-m2-cache in [default]
    commons-logging#commons-logging;1.2 from central in [default]
    io.netty#netty-buffer;4.1.68.Final from central in [default]
    io.netty#netty-codec;4.1.68.Final from central in [default]
    io.netty#netty-codec-dns;4.1.68.Final from central in [default]
    io.netty#netty-codec-http;4.1.68.Final from central in [default]
    io.netty#netty-codec-http2;4.1.68.Final from central in [default]
    io.netty#netty-codec-socks;4.1.68.Final from central in [default]
    io.netty#netty-common;4.1.68.Final from central in [default]
    io.netty#netty-handler;4.1.68.Final from central in [default]
    io.netty#netty-handler-proxy;4.1.68.Final from central in [default]
    io.netty#netty-resolver;4.1.68.Final from central in [default]
    io.netty#netty-resolver-dns;4.1.68.Final from central in [default]
    io.netty#netty-resolver-dns-native-macos;4.1.68.Final from central in [default]
    io.netty#netty-transport;4.1.68.Final from central in [default]
    io.netty#netty-transport-native-epoll;4.1.68.Final from central in [default]
    io.netty#netty-transport-native-kqueue;4.1.68.Final from central in [default]
    io.netty#netty-transport-native-unix-common;4.1.68.Final from central in [default]
    io.projectreactor#reactor-core;3.4.10 from central in [default]
    io.projectreactor.netty#reactor-netty-core;1.0.11 from central in [default]
    io.projectreactor.netty#reactor-netty-http;1.0.11 from central in [default]
    io.spray#spray-json_2.12;1.3.2 from central in [default]
    jakarta.activation#jakarta.activation-api;1.2.1 from central in [default]
    jakarta.xml.bind#jakarta.xml.bind-api;2.3.2 from central in [default]
    org.apache.httpcomponents#httpclient;4.5.6 from central in [default]
    org.apache.httpcomponents#httpcore;4.4.10 from central in [default]
    org.apache.httpcomponents#httpmime;4.5.6 from central in [default]
    org.apache.spark#spark-avro_2.12;3.0.0 from central in [default]
    org.beanshell#bsh;2.0b4 from central in [default]
    org.codehaus.woodstox#stax2-api;4.2.1 from central in [default]
    org.openpnp#opencv;3.2.0-1 from central in [default]
    org.reactivestreams#reactive-streams;1.0.3 from central in [default]
    org.scala-lang#scala-reflect;2.12.4 from central in [default]
    org.scalactic#scalactic_2.12;3.0.5 from central in [default]
    org.slf4j#slf4j-api;1.7.32 from central in [default]
    org.spark-project.spark#unused;1.0.0 from local-m2-cache in [default]
    org.testng#testng;6.8.8 from central in [default]
    org.typelevel#macro-compat_2.12;1.1.1 from central in [default]
    ---------------------------------------------------------------------
    |                  |            modules            ||   artifacts   |
    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
    ---------------------------------------------------------------------
    |      default     |   69  |   11  |   11  |   0   ||   69  |   10  |
    ---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
        [NOT FOUND  ] com.thoughtworks.paranamer#paranamer;2.8!paranamer.jar(bundle) (0ms)

    ==== local-m2-cache: tried

      file:/Users/username/.m2/repository/com/thoughtworks/paranamer/paranamer/2.8/paranamer-2.8.jar

        [NOT FOUND  ] commons-codec#commons-codec;1.10!commons-codec.jar (0ms)

    ==== local-m2-cache: tried

      file:/Users/username/.m2/repository/commons-codec/commons-codec/1.10/commons-codec-1.10.jar

        ::::::::::::::::::::::::::::::::::::::::::::::

        ::              FAILED DOWNLOADS            ::

        :: ^ see resolution messages for details  ^ ::

        ::::::::::::::::::::::::::::::::::::::::::::::

        :: com.thoughtworks.paranamer#paranamer;2.8!paranamer.jar(bundle)

        :: commons-codec#commons-codec;1.10!commons-codec.jar

        ::::::::::::::::::::::::::::::::::::::::::::::

:::: ERRORS
    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml_2.12/0.9.5-13-d1b51517-SNAPSHOT/maven-metadata.xml

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml_2.12-0.9.5-13-d1b51517-SNAPSHOT.pom

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml_2.12-0.9.5-13-d1b51517-SNAPSHOT.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml_2.12-0.9.5-13-d1b51517-SNAPSHOT-sources.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml_2.12-0.9.5-13-d1b51517-SNAPSHOT-javadoc.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-core_2.12/0.9.5-13-d1b51517-SNAPSHOT/maven-metadata.xml

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-core_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-core_2.12-0.9.5-13-d1b51517-SNAPSHOT.pom

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-core_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-core_2.12-0.9.5-13-d1b51517-SNAPSHOT.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-core_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-core_2.12-0.9.5-13-d1b51517-SNAPSHOT-sources.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-core_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-core_2.12-0.9.5-13-d1b51517-SNAPSHOT-javadoc.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/thoughtworks/paranamer/paranamer-parent/2.8/paranamer-parent-2.8.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/org/apache/spark/spark-parent_2.12/3.0.0/spark-parent_2.12-3.0.0.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-deep-learning_2.12/0.9.5-13-d1b51517-SNAPSHOT/maven-metadata.xml

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-deep-learning_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-deep-learning_2.12-0.9.5-13-d1b51517-SNAPSHOT.pom

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-deep-learning_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-deep-learning_2.12-0.9.5-13-d1b51517-SNAPSHOT.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-deep-learning_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-deep-learning_2.12-0.9.5-13-d1b51517-SNAPSHOT-sources.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-deep-learning_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-deep-learning_2.12-0.9.5-13-d1b51517-SNAPSHOT-javadoc.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-opencv_2.12/0.9.5-13-d1b51517-SNAPSHOT/maven-metadata.xml

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-opencv_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-opencv_2.12-0.9.5-13-d1b51517-SNAPSHOT.pom

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-opencv_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-opencv_2.12-0.9.5-13-d1b51517-SNAPSHOT.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-opencv_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-opencv_2.12-0.9.5-13-d1b51517-SNAPSHOT-sources.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-opencv_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-opencv_2.12-0.9.5-13-d1b51517-SNAPSHOT-javadoc.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-cognitive_2.12/0.9.5-13-d1b51517-SNAPSHOT/maven-metadata.xml

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-cognitive_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-cognitive_2.12-0.9.5-13-d1b51517-SNAPSHOT.pom

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-cognitive_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-cognitive_2.12-0.9.5-13-d1b51517-SNAPSHOT.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-cognitive_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-cognitive_2.12-0.9.5-13-d1b51517-SNAPSHOT-sources.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-cognitive_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-cognitive_2.12-0.9.5-13-d1b51517-SNAPSHOT-javadoc.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-vw_2.12/0.9.5-13-d1b51517-SNAPSHOT/maven-metadata.xml

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-vw_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-vw_2.12-0.9.5-13-d1b51517-SNAPSHOT.pom

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-vw_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-vw_2.12-0.9.5-13-d1b51517-SNAPSHOT.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-vw_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-vw_2.12-0.9.5-13-d1b51517-SNAPSHOT-sources.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-vw_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-vw_2.12-0.9.5-13-d1b51517-SNAPSHOT-javadoc.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-lightgbm_2.12/0.9.5-13-d1b51517-SNAPSHOT/maven-metadata.xml

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-lightgbm_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-lightgbm_2.12-0.9.5-13-d1b51517-SNAPSHOT.pom

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-lightgbm_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-lightgbm_2.12-0.9.5-13-d1b51517-SNAPSHOT.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-lightgbm_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-lightgbm_2.12-0.9.5-13-d1b51517-SNAPSHOT-sources.jar

    SERVER ERROR: Bad Gateway url=https://dl.bintray.com/spark-packages/maven/com/microsoft/azure/synapseml-lightgbm_2.12/0.9.5-13-d1b51517-SNAPSHOT/synapseml-lightgbm_2.12-0.9.5-13-d1b51517-SNAPSHOT-javadoc.jar

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [download failed: com.thoughtworks.paranamer#paranamer;2.8!paranamer.jar(bundle), download failed: commons-codec#commons-codec;1.10!commons-codec.jar]
    at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1420)
    at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:54)
    at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:308)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1030)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1039)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
---------------------------------------------------------------------------
Exception                                 Traceback (most recent call last)
Input In [8], in <cell line: 4>()
      1 # Bootstrap Spark Session
      2 # Please use 0.10.1 version for Spark3.2 and 0.9.5-13-d1b51517-SNAPSHOT version for Spark3.1
----> 4 spark = pyspark.sql.SparkSession.builder.appName("MyApp").config("spark.jars.packages", "com.microsoft.azure:synapseml_2.12:0.9.5-13-d1b51517-SNAPSHOT").config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven").getOrCreate()

File /usr/local/Cellar/apache-spark/3.1.1/python/pyspark/sql/session.py:228, in SparkSession.Builder.getOrCreate(self)
    226         sparkConf.set(key, value)
    227     # This SparkContext may be an existing one.
--> 228     sc = SparkContext.getOrCreate(sparkConf)
    229 # Do not update `SparkConf` for existing `SparkContext`, as it's shared
    230 # by all sessions.
    231 session = SparkSession(sc)

File /usr/local/Cellar/apache-spark/3.1.1/python/pyspark/context.py:384, in SparkContext.getOrCreate(cls, conf)
    382 with SparkContext._lock:
    383     if SparkContext._active_spark_context is None:
--> 384         SparkContext(conf=conf or SparkConf())
    385     return SparkContext._active_spark_context

File /usr/local/Cellar/apache-spark/3.1.1/python/pyspark/context.py:144, in SparkContext.__init__(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer, conf, gateway, jsc, profiler_cls)
    139 if gateway is not None and gateway.gateway_parameters.auth_token is None:
    140     raise ValueError(
    141         "You are trying to pass an insecure Py4j gateway to Spark. This"
    142         " is not allowed as it is a security risk.")
--> 144 SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
    145 try:
    146     self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
    147                   conf, jsc, profiler_cls)

File /usr/local/Cellar/apache-spark/3.1.1/python/pyspark/context.py:331, in SparkContext._ensure_initialized(cls, instance, gateway, conf)
    329 with SparkContext._lock:
    330     if not SparkContext._gateway:
--> 331         SparkContext._gateway = gateway or launch_gateway(conf)
    332         SparkContext._jvm = SparkContext._gateway.jvm
    334     if instance:

File /usr/local/Cellar/apache-spark/3.1.1/python/pyspark/java_gateway.py:108, in launch_gateway(conf, popen_kwargs)
    105     time.sleep(0.1)
    107 if not os.path.isfile(conn_info_file):
--> 108     raise Exception("Java gateway process exited before sending its port number")
    110 with open(conn_info_file, "rb") as info:
    111     gateway_port = read_int(info)

Exception: Java gateway process exited before sending its port number
spark
---------------------------------------------------------------------------
NameError                                 Traceback (most recent call last)
Input In [9], in <cell line: 1>()
----> 1 spark

NameError: name 'spark' is not defined
df = spark.read.format("csv").option("header", True).load("https://raw.githubusercontent.com/jr-MS/MVAD-in-Synapse/main/spark-demo-data.csv")

df = df.withColumn("sensor_1", col("sensor_1").cast(DoubleType())) \
    .withColumn("sensor_2", col("sensor_2").cast(DoubleType())) \
    .withColumn("sensor_3", col("sensor_3").cast(DoubleType()))

df.show(10)
---------------------------------------------------------------------------
Py4JJavaError                             Traceback (most recent call last)
Input In [40], in <cell line: 1>()
----> 1 df = spark.read.format("csv").option("header", True).load("https://raw.githubusercontent.com/jr-MS/MVAD-in-Synapse/main/spark-demo-data.csv")
      3 df = df.withColumn("sensor_1", col("sensor_1").cast(DoubleType())) \
      4     .withColumn("sensor_2", col("sensor_2").cast(DoubleType())) \
      5     .withColumn("sensor_3", col("sensor_3").cast(DoubleType()))
      7 df.show(10)

File /usr/local/Cellar/apache-spark/3.1.1/python/pyspark/sql/readwriter.py:204, in DataFrameReader.load(self, path, format, schema, **options)
    202 self.options(**options)
    203 if isinstance(path, str):
--> 204     return self._df(self._jreader.load(path))
    205 elif path is not None:
    206     if type(path) != list:

File /opt/miniconda3/lib/python3.9/site-packages/py4j/java_gateway.py:1309, in JavaMember.__call__(self, *args)
   1303 command = proto.CALL_COMMAND_NAME +\
   1304     self.command_header +\
   1305     args_command +\
   1306     proto.END_COMMAND_PART
   1308 answer = self.gateway_client.send_command(command)
-> 1309 return_value = get_return_value(
   1310     answer, self.gateway_client, self.target_id, self.name)
   1312 for temp_arg in temp_args:
   1313     temp_arg._detach()

File /usr/local/Cellar/apache-spark/3.1.1/python/pyspark/sql/utils.py:111, in capture_sql_exception.<locals>.deco(*a, **kw)
    109 def deco(*a, **kw):
    110     try:
--> 111         return f(*a, **kw)
    112     except py4j.protocol.Py4JJavaError as e:
    113         converted = convert_exception(e.java_exception)

File /opt/miniconda3/lib/python3.9/site-packages/py4j/protocol.py:326, in get_return_value(answer, gateway_client, target_id, name)
    324 value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
    325 if answer[1] == REFERENCE_TYPE:
--> 326     raise Py4JJavaError(
    327         "An error occurred while calling {0}{1}{2}.\n".
    328         format(target_id, ".", name), value)
    329 else:
    330     raise Py4JError(
    331         "An error occurred while calling {0}{1}{2}. Trace:\n{3}\n".
    332         format(target_id, ".", name, value))

Py4JJavaError: An error occurred while calling o49.load.
: java.io.IOException: No FileSystem for scheme: https
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2660)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2667)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
    at org.apache.spark.sql.execution.streaming.FileStreamSink$.hasMetadata(FileStreamSink.scala:46)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:376)
    at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:326)
    at org.apache.spark.sql.DataFrameReader.$anonfun$load$3(DataFrameReader.scala:308)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:308)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:240)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:282)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:238)
    at java.lang.Thread.run(Thread.java:750)
mhamilton723 commented 1 year ago

Hey @dylanw-oss, thanks for lodging this issue. Is this transient or always happening? It seems like an issue in the resolver, but I'm not sure, because commons-codec doesn't come from us.

dylanw-oss commented 1 year ago

This is always happening for me.

eerga commented 1 year ago

@dylanw-oss, there is a jar that's missing. Just download https://archive.apache.org/dist/commons/codec/binaries/commons-codec-1.10-bin.zip, extract commons-codec-1.10.jar, and put it into the appropriate location. For me, that was the /Users/username/.m2/repository/commons-codec/commons-codec/1.10/ folder. For you it should be C:/Users/<name>/.m2/repository/commons-codec/commons-codec/1.10/.
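A minimal sketch of that manual workaround (the Maven Central URL and the default ~/.m2 location are assumptions, not part of the original report; on Windows the same path lives under C:/Users/<name>/.m2/...):

```python
# Hedged sketch of the workaround described above: fetch the missing
# commons-codec 1.10 jar and drop it into the local Maven repository so the
# local-m2-cache resolver can serve it.
import urllib.request
from pathlib import Path

jar_url = "https://repo1.maven.org/maven2/commons-codec/commons-codec/1.10/commons-codec-1.10.jar"
dest_dir = Path.home() / ".m2" / "repository" / "commons-codec" / "commons-codec" / "1.10"
dest_dir.mkdir(parents=True, exist_ok=True)

dest = dest_dir / "commons-codec-1.10.jar"
if not dest.exists():
    urllib.request.urlretrieve(jar_url, dest)
print(f"commons-codec jar present at: {dest}")
```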

mhamilton723 commented 1 year ago

This looks like an issue with a bad ivy cache state.

https://stackoverflow.com/questions/19751614/unresolved-dependencies-for-commons-codec

Please follow the top answer there and delete this entry from the Ivy cache to get it out of its bad state.
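A minimal sketch of that cleanup, assuming the default cache locations; clearing the matching ~/.m2 entry as well can help when it holds only the POM and not the jar:

```python
# Hedged sketch: remove stale commons-codec entries so Ivy re-resolves the
# artifact from Maven Central on the next spark-shell / SparkSession start.
import shutil
from pathlib import Path

candidates = [
    Path.home() / ".ivy2" / "cache" / "commons-codec" / "commons-codec",
    # The local Maven entry can also be stale (POM present, jar missing):
    Path.home() / ".m2" / "repository" / "commons-codec" / "commons-codec" / "1.10",
]
for path in candidates:
    if path.exists():
        shutil.rmtree(path)
        print(f"removed {path}")
    else:
        print(f"not present: {path}")
```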

Closing this issue for now, but you can comment to re-open!

dylanw-oss commented 1 year ago

@mhamilton723, deleting all subfolders of the Ivy cache does not help; I still hit the same issue. @eerga's solution works for me: there were no jar files in the .m2/repository/commons-codec/commons-codec/1.10/ folder, so I manually downloaded the zip and placed the jar in that folder, and the issue is gone. BTW, I tested on Ubuntu and there is no such issue there at all.