databrickslabs / automl-toolkit

Toolkit for Apache Spark ML: feature clean-up, feature importance calculation suite, information gain selection, distributed SMOTE, model selection and training, hyperparameter optimization and selection, model interpretability.

java.lang.NoSuchMethodError: org.mlflow.api.proto.Service$CreateRun$Builder.setRunName #6

Closed. Steve-Lee-DG closed this issue 5 years ago.

Steve-Lee-DG commented 5 years ago

Hi, after the FamilyRunner completes, I get the error below:

java.lang.NoSuchMethodError: org.mlflow.api.proto.Service$CreateRun$Builder.setRunName(Ljava/lang/String;)Lorg/mlflow/api/proto/Service$CreateRun$Builder;

at com.databricks.labs.automl.tracking.MLFlowTracker.com$databricks$labs$automl$tracking$MLFlowTracker$$generateMlFlowRun(MLFlowTracker.scala:148)
    at com.databricks.labs.automl.tracking.MLFlowTracker.logBest(MLFlowTracker.scala:401)
    at com.databricks.labs.automl.tracking.MLFlowTracker.logMlFlowDataAndModels(MLFlowTracker.scala:352)
    at com.databricks.labs.automl.AutomationRunner.logResultsToMlFlow(AutomationRunner.scala:1291)
    at com.databricks.labs.automl.AutomationRunner.liftedTree1$1(AutomationRunner.scala:1439)
    at com.databricks.labs.automl.AutomationRunner.executeTuning(AutomationRunner.scala:1438)
    at com.databricks.labs.automl.executor.FamilyRunner$$anonfun$execute$1.apply(FamilyRunner.scala:129)
    at com.databricks.labs.automl.executor.FamilyRunner$$anonfun$execute$1.apply(FamilyRunner.scala:119)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
    at com.databricks.labs.automl.executor.FamilyRunner.execute(FamilyRunner.scala:119)
    at linea339e92b41aa489e83cc214c9c04f05540.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1020:5)
    at linea339e92b41aa489e83cc214c9c04f05540.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-1020:53)
    at linea339e92b41aa489e83cc214c9c04f05540.$read$$iw$$iw$$iw$$iw.<init>(command-1020:55)
    at linea339e92b41aa489e83cc214c9c04f05540.$read$$iw$$iw$$iw.<init>(command-1020:57)
    at linea339e92b41aa489e83cc214c9c04f05540.$read$$iw$$iw.<init>(command-1020:59)
    at linea339e92b41aa489e83cc214c9c04f05540.$read$$iw.<init>(command-1020:61)
    at linea339e92b41aa489e83cc214c9c04f05540.$read.<init>(command-1020:63)
    at linea339e92b41aa489e83cc214c9c04f05540.$read$.<init>(command-1020:67)
    at linea339e92b41aa489e83cc214c9c04f05540.$read$.<clinit>(command-1020)
    at linea339e92b41aa489e83cc214c9c04f05540.$eval$.$print$lzycompute(<notebook>:7)
    at linea339e92b41aa489e83cc214c9c04f05540.$eval$.$print(<notebook>:6)
    at linea339e92b41aa489e83cc214c9c04f05540.$eval.$print(<notebook>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
    at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply$mcV$sp(ScalaDriverLocal.scala:197)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:197)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:197)
    at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:679)
    at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:632)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:197)
    at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$8.apply(DriverLocal.scala:368)
    at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$8.apply(DriverLocal.scala:345)
    at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:238)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:233)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:48)
    at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:271)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:48)
    at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:345)
    at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
    at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
    at scala.util.Try$.apply(Try.scala:192)
    at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:639)
    at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:485)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:597)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:390)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
    at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
    at java.lang.Thread.run(Thread.java:748)

And my code is:

import com.databricks.labs.automl.executor.config.ConfigurationGenerator
import com.databricks.labs.automl.executor.FamilyRunner

val sourceData = spark.read.load("<DATA>")  // spark.read is a DataFrameReader; it needs a load (or format-specific) call

val overrides = Map(
  "labelCol" -> "is_attributed",
  "mlFlowExperimentName" -> "<User-Defined-Name>",
  "mlFlowTrackingURI" -> "<Databricks Host URI>",
  "mlFlowAPIToken" -> dbutils.notebook.getContext().apiToken.get,
  "mlFlowModelSaveDirectory" -> "<User-Defined-Directory>",
  "inferenceConfigSaveLocation" -> "<User-Defined-Directory>",
  "tunerParallelism" -> 30
)

val randomForestConfig = ConfigurationGenerator.generateConfigFromMap("RandomForest", "classifier", overrides)
val gbtConfig = ConfigurationGenerator.generateConfigFromMap("GBT", "classifier", overrides)
val logConfig = ConfigurationGenerator.generateConfigFromMap("LogisticRegression", "classifier", overrides)

val runner = FamilyRunner(sourceData, Array(randomForestConfig, gbtConfig, logConfig)).execute()
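A `NoSuchMethodError` like the one above usually means the jar that was present at compile time differs from the one on the runtime classpath. As a diagnostic sketch (the `JarLocator` name is hypothetical, not part of the toolkit), you can ask the JVM which jar a class was actually loaded from:

```scala
object JarLocator {
  /** Best-effort location (jar or directory) a class was loaded from;
    * None if the class is absent or has no code source (e.g. JDK classes). */
  def jarLocation(className: String): Option[String] =
    try {
      Option(Class.forName(className).getProtectionDomain.getCodeSource)
        .map(_.getLocation.toString)
    } catch {
      case _: ClassNotFoundException => None
    }
}
```

On the cluster, `JarLocator.jarLocation("org.mlflow.api.proto.Service")` would reveal which attached mlflow-client jar wins when several versions are installed.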

Additionally, the libraries installed on my cluster are listed below:

- automatedml_2_11_0_5_1.jar (JAR): Installed, from dbfs:/FileStore/jars/0391c7b8_92d3_4a41_92e4_1456ab5d4d54-automatedml_2_11_0_5_1-3990a.jar
- azureml (PyPI): Uninstall pending restart
- Hyperopt (PyPI): Installed
- keras (PyPI): Installed
- koalas (PyPI): Installed
- ml.combust.mleap:mleap-spark_2.11:0.14.0 (Maven): Installed
- mleap (PyPI): Installed
- mlflow (PyPI): Installed
- org.mlflow:mlflow-client:1.2.0 (Maven): Installed
- org.mlflow:mlflow-scoring:1.2.0 (Maven): Installed
- seaborn (PyPI): Installed
- sklearn (PyPI): Installed
- xgboost (PyPI): Installed
- xgboost4j_spark_0_90.jar (JAR): Installed, from dbfs:/FileStore/jars/2afc2977_6cc0_4511_8b70_555882caa8af-xgboost4j_spark_0_90-b50ca.jar

BenWilson2 commented 5 years ago

You need to use mlflow version 0.9.1. See the quickstart requirements on the main README for the cluster package configuration. We will be updating the code in a near-future release to be compatible with the new Java API for mlflow 1.2+, but we haven't had a chance to work on it yet.
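After swapping the cluster library, you can verify the fix before rerunning the FamilyRunner by probing for the exact method the stack trace complains about. This is a diagnostic sketch (the `MethodProbe` helper is hypothetical, not part of the toolkit):

```scala
object MethodProbe {
  /** True if `className` is on the classpath and declares a public
    * method `method` with the given parameter types. */
  def hasMethod(className: String, method: String, params: Class[_]*): Boolean =
    try {
      Class.forName(className).getMethod(method, params: _*)
      true
    } catch {
      case _: ClassNotFoundException | _: NoSuchMethodException => false
    }

  /** The call that fails in the stack trace above; false means the attached
    * mlflow-client lacks CreateRun.Builder.setRunName and the tracker will
    * throw NoSuchMethodError at runtime. */
  def toolkitCallIsSafe: Boolean =
    hasMethod("org.mlflow.api.proto.Service$CreateRun$Builder",
              "setRunName", classOf[String])
}
```

Running `MethodProbe.toolkitCallIsSafe` in a notebook cell gives a quick yes/no on whether the installed mlflow Java client matches what the toolkit was compiled against.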

Steve-Lee-DG commented 5 years ago

After downgrading the mlflow version to 0.9.1, it works. Thank you!!