jpurnell01 opened this issue 6 years ago
I'm using Spark 2.3.0, spark-deep-learning 1.1.0, and Scala 2.11.8. I'm currently working on a local cluster with the following setup:

```scala
val spark: SparkSession = SparkSession
  .builder()
  .appName("test")
  .master("local[*]")
  .config("spark.driver.memory", "4G")
  .config("spark.kryoserializer.buffer.max", "200M")
  .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .config("spark.network.timeout", "10001s")
  .config("spark.executor.heartbeatInterval", "10000s")
  .config("spark.kryo.registrator", "org.nd4j.Nd4jRegistrator")
  .getOrCreate()
```
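(For context, my understanding is that the `Nd4jRegistrator` configured above is just a standard Spark `KryoRegistrator`. A minimal registrator, sketched here with placeholder classes rather than the actual ND4J implementation, looks roughly like:)

```scala
import com.esotericsoftware.kryo.Kryo
import org.apache.spark.serializer.KryoRegistrator

// Illustrative sketch only: the real org.nd4j.Nd4jRegistrator registers
// ND4J's INDArray types, not these placeholder array classes.
class ExampleRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    // Register classes that cross the wire so Kryo can serialize them efficiently.
    kryo.register(classOf[Array[Float]])
    kryo.register(classOf[Array[Double]])
  }
}
```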
I've tried using `GraphModelFactory()` to load the estimator-based model exported in this regression example, but when I try to register the UDF like this:
I get an error like this:
I'm stumped on how to debug this. Do I need to take extra steps in exporting the model?