Closed: rrekapalli closed this issue 4 years ago
@rrekapalli it looks like no one has any suggestions for this. Moreover, this library is not actively maintained, and it was definitely not tested with .NET for Spark. I am closing this as out of scope for this library. Thank you for your consideration and understanding.
I am getting an error while trying to use this library to write data to SQL Server from the .NET for Apache Spark library. Below are the command I ran and the resulting error output.
```
spark-shell --packages com.microsoft.azure:azure-sqldb-spark:1.0.2.

Ivy Default Cache set to: C:\Users\rajar\.ivy2\cache
The jars for the packages stored in: C:\Users\rajar\.ivy2\jars
:: loading settings :: url = jar:file:/C:/Spark/bin/spark-2.4.5-bin-hadoop2.7/jars/azure-sqldb-spark-1.0.1-jar-with-dependencies.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.microsoft.azure#azure-sqldb-spark added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
:: resolution report :: resolve 3177ms :: artifacts dl 0ms
        :: modules in use:

:: problems summary ::
:::: WARNINGS
        module not found: com.microsoft.azure#azure-sqldb-spark;1.0.2.

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.microsoft.azure#azure-sqldb-spark;1.0.2.: not found]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1197)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:304)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
```
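One detail visible in the output above is that the unresolved coordinate ends with a trailing period (`1.0.2.`), so Ivy appears to be resolving a version string that includes that period; that is only an observation from the log, not a confirmed fix. For comparison, below is a minimal sketch of writing a DataFrame to SQL Server through Spark's built-in JDBC data source rather than through this library. The server URL, database, table name, and credentials are placeholders I invented, not values from the report, and it assumes the Microsoft JDBC driver (mssql-jdbc) jar is available on the classpath.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

// Sketch: write a small DataFrame to SQL Server using Spark's built-in
// JDBC data source instead of the azure-sqldb-spark connector that
// failed to resolve above. Connection details are placeholders.
object SqlServerJdbcWriteSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sql-server-jdbc-write-sketch")
      .getOrCreate()
    import spark.implicits._

    // Toy DataFrame standing in for the data to be persisted.
    val df = Seq((1, "alpha"), (2, "beta")).toDF("id", "name")

    df.write
      .format("jdbc")
      .mode(SaveMode.Append)
      // Hypothetical connection details; the mssql-jdbc driver jar must be
      // on the classpath (for example via --jars or --packages).
      .option("url", "jdbc:sqlserver://localhost:1433;databaseName=TestDb")
      .option("dbtable", "dbo.TestTable")
      .option("user", "<user>")
      .option("password", "<password>")
      .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
      .save()

    spark.stop()
  }
}
```

The same options should translate to .NET for Apache Spark's DataFrameWriter, since the JDBC data source ships with Spark itself rather than with this library.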