microsoft / SynapseML

Simple and Distributed Machine Learning
http://aka.ms/spark
MIT License

java.lang.NoSuchMethodError: spray.json.package$.enrichAny(Ljava/lang/Object;)Lspray/json/RichAny; Issue #1679

Open sibyl1956 opened 2 years ago

sibyl1956 commented 2 years ago

SynapseML version

0.10.1

System information

Describe the problem

Installed the package through: com.microsoft.azure:synapseml_2.12:0.10.1 with the resolver: https://mmlspark.azureedge.net/maven
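
For reference, a minimal sketch of attaching that coordinate and resolver from a plain PySpark session (outside the Databricks library UI; the app name is illustrative and not from the original report):

from pyspark.sql import SparkSession

# Sketch: pull SynapseML 0.10.1 from the resolver mentioned above.
spark = (SparkSession.builder
    .appName("synapseml-lightgbm")
    .config("spark.jars.packages", "com.microsoft.azure:synapseml_2.12:0.10.1")
    .config("spark.jars.repositories", "https://mmlspark.azureedge.net/maven")
    .getOrCreate())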

But got this error message when trying to instantiate a LightGBM classifier: java.lang.NoSuchMethodError: spray.json.package$.enrichAny(Ljava/lang/Object;)Lspray/json/RichAny;

      3 lgbmClassifier = (LightGBMClassifier()
      4     .setFeaturesCol("features")
      5     .setRawPredictionCol("rawPrediction")

/databricks/spark/python/pyspark/__init__.py in wrapper(self, *args, **kwargs)
    112             raise TypeError("Method %s forces keyword arguments." % func.__name__)
    113         self._input_kwargs = kwargs
--> 114         return func(self, **kwargs)
    115     return wrapper
    116

/local_disk0/spark-36f90ef6-a68d-4c36-ba04-d72f939344e4/userFiles-842368d3-9f91-4dec-8c2d-38752346d587/addedFile462316793479631003synapseml_lightgbm_2_12_0_10_1-c15ba.jar/synapse/ml/lightgbm/LightGBMClassifier.py in __init__(self, java_obj, baggingFraction, baggingFreq, baggingSeed, binSampleCount, boostFromAverage, boostingType, catSmooth, categoricalSlotIndexes, categoricalSlotNames, catl2, chunkSize, dataRandomSeed, defaultListenPort, deterministic, driverListenPort, dropRate, dropSeed, earlyStoppingRound, executionMode, extraSeed, featureFraction, featureFractionByNode, featureFractionSeed, featuresCol, featuresShapCol, fobj, improvementTolerance, initScoreCol, isEnableSparse, isProvideTrainingMetric, isUnbalance, labelCol, lambdaL1, lambdaL2, leafPredictionCol, learningRate, matrixType, maxBin, maxBinByFeature, maxCatThreshold, maxCatToOnehot, maxDeltaStep, maxDepth, maxDrop, metric, microBatchSize, minDataInLeaf, minDataPerBin, minDataPerGroup, minGainToSplit, minSumHessianInLeaf, modelString, monotoneConstraints, monotoneConstraintsMethod, monotonePenalty, negBaggingFraction, numBatches, numIterations, numLeaves, numTasks, numThreads, objective, objectiveSeed, otherRate, parallelism, passThroughArgs, posBaggingFraction, predictDisableShapeCheck, predictionCol, probabilityCol, rawPredictionCol, repartitionByGroupingColumn, seed, skipDrop, slotNames, thresholds, timeout, topK, topRate, uniformDrop, useBarrierExecutionMode, useMissing, useSingleDatasetMode, validationIndicatorCol, verbosity, weightCol, xGBoostDartMode, zeroAsMissing)
    387         super(LightGBMClassifier, self).__init__()
    388         if java_obj is None:
--> 389             self._java_obj = self._new_java_obj("com.microsoft.azure.synapse.ml.lightgbm.LightGBMClassifier", self.uid)
    390         else:
    391             self._java_obj = java_obj

/databricks/spark/python/pyspark/ml/wrapper.py in _new_java_obj(java_class, *args)
     64             java_obj = getattr(java_obj, name)
     65         java_args = [_py2java(sc, arg) for arg in args]
---> 66         return java_obj(*java_args)
     67
     68     @staticmethod

/databricks/spark/python/lib/py4j-0.10.9.1-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1566
   1567         answer = self._gateway_client.send_command(command)
-> 1568         return_value = get_return_value(
   1569             answer, self._gateway_client, None, self._fqn)
   1570

/databricks/spark/python/pyspark/sql/utils.py in deco(*a, **kw)
    115     def deco(*a, **kw):
    116         try:
--> 117             return f(*a, **kw)
    118         except py4j.protocol.Py4JJavaError as e:
    119             converted = convert_exception(e.java_exception)

/databricks/spark/python/lib/py4j-0.10.9.1-src.zip/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    324             value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
    325             if answer[1] == REFERENCE_TYPE:
--> 326                 raise Py4JJavaError(
    327                     "An error occurred while calling {0}{1}{2}.\n".
    328                     format(target_id, ".", name), value)

Py4JJavaError: An error occurred while calling None.com.microsoft.azure.synapse.ml.lightgbm.LightGBMClassifier.
: java.lang.NoSuchMethodError: spray.json.package$.enrichAny(Ljava/lang/Object;)Lspray/json/RichAny;
    at com.microsoft.azure.synapse.ml.logging.BasicLogging.logBase(BasicLogging.scala:30)
    at com.microsoft.azure.synapse.ml.logging.BasicLogging.logBase$(BasicLogging.scala:29)
    at com.microsoft.azure.synapse.ml.lightgbm.LightGBMClassifier.logBase(LightGBMClassifier.scala:27)
    at com.microsoft.azure.synapse.ml.logging.BasicLogging.logClass(BasicLogging.scala:40)
    at com.microsoft.azure.synapse.ml.logging.BasicLogging.logClass$(BasicLogging.scala:39)
    at com.microsoft.azure.synapse.ml.lightgbm.LightGBMClassifier.logClass(LightGBMClassifier.scala:27)
    at com.microsoft.azure.synapse.ml.lightgbm.LightGBMClassifier.<init>(LightGBMClassifier.scala:30)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:380)
    at py4j.Gateway.invoke(Gateway.java:250)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:251)
    at java.lang.Thread.run(Thread.java:748)

Code to reproduce issue

from synapse.ml.lightgbm import LightGBMClassifier

lgbmClassifier = (LightGBMClassifier()
    .setFeaturesCol("features")
    .setRawPredictionCol("rawPrediction")
    .setDefaultListenPort(12402)
    .setNumLeaves(5)
    .setNumIterations(10)
    .setObjective("binary")
    .setLabelCol("labels")
    .setLeafPredictionCol("leafPrediction")
    .setFeaturesShapCol("featuresShap"))

Other info / logs

No response

What component(s) does this bug affect?

What language(s) does this bug affect?

What integration(s) does this bug affect?

github-actions[bot] commented 2 years ago

Hey @sibyl1956 :wave:! Thank you so much for reporting the issue/feature request :rotating_light:. Someone from SynapseML Team will be looking to triage this issue soon. We appreciate your patience.

pengliangml commented 2 years ago

"io.spray" %% "spray-json" % "1.3.5" There may be a problem with your spray-json version, it should be 1.3.5. Please confirm.

mhamilton723 commented 2 years ago

Agreeing with @neptune05

sandeepgudla commented 1 year ago

Is this issue fixed?

yinguoqing123 commented 9 months ago

I'm hitting the same error, and I can't find any spray-json jars in .ivy2/jars. How can I fix this?

yinguoqing123 commented 9 months ago

Agreeing with @neptune05

I'm hitting the same error, and I can't find any spray-json jars in .ivy2/jars. How can I fix this?

austinzh commented 8 months ago

I have the same error. I think it's because another uber jar of mine brings in spray-json 1.3.6, even though spray.json.package$.enrichAny exists in both 1.3.5 and 1.3.6. My fix was to compile spray-json 1.3.5 into one of my uber jars, or to copy it into Spark's jars folder. Alternatively, you can try:

spark.driver.userClassPathFirst
spark.executor.userClassPathFirst
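
A minimal sketch of setting those two flags from PySpark, assuming you build the session yourself (on Databricks you would put them in the cluster's Spark config instead); note that Spark documents userClassPathFirst as experimental, so it can introduce other classpath conflicts:

from pyspark.sql import SparkSession

# Sketch: prefer user-supplied jars (e.g. one that bundles spray-json 1.3.5)
# over conflicting versions already on the cluster classpath.
spark = (SparkSession.builder
    .config("spark.driver.userClassPathFirst", "true")
    .config("spark.executor.userClassPathFirst", "true")
    .getOrCreate())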