@ancasarb
It seems like I'm not able to successfully load the serialized model.
I ran the following in a test:
import java.net.URI
import java.nio.file.Paths
import ml.combust.bundle.BundleFile
import ml.combust.mleap.runtime.MleapSupport._
import resource._

describe("URIBundleFileOps") {
  it("can save/load a bundle using a URI") {
    // testDir is defined elsewhere in the suite
    val testFile = Paths.get(testDir.toString, "mymodel_pipeline.zip")
    testFile.toFile.deleteOnExit()
    val uri = new URI(s"jar:file:$testFile")
    println("URI=" + uri)
    val loadedBasicSqlTMleapTransformer = (
      for (bf <- managed(BundleFile(uri))) yield {
        bf.loadMleapBundle().get
      }).opt.get
  }
}
and I get this error:
[info] - can save/load a bundle using a URI *** FAILED ***
[info] - BasicSqlTMleapTransformer(***)),BasicSqlTMleapModel(closest_period)) did not equal Failure(java.nio.file.NoSuchFileException: bundle.json)
I checked the model file, and it does contain bundle.json.
Any suggestions as to how I can debug this?
@femibyte could you please try
val bundle = (for (bundle <- managed(BundleFile(bundlePath5))) yield {
  bundle.loadMleapBundle().get
}).tried.get
to see if that gives you more info?
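(The difference matters here: scala-arm's .opt discards the failure entirely, while .tried hands back the underlying Try, so you can see the real exception, e.g. the NoSuchFileException, rather than just that loading returned nothing.)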
Thanks for the help with the bundle deserialization. I was using the wrong filename in my code. I am now getting this error from my test:
[info] BasicSqlTMleapTransformerSpec:
[info] URIBundleFileOps
[info] - can save/load a bundle using a URI *** FAILED ***
[info] java.lang.reflect.InvocationTargetException:
[info] at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
[info] at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
[info] at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info] at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
[info] at ml.combust.mleap.bundle.ops.MleapOp.load(MleapOp.scala:24)
[info] at ml.combust.mleap.bundle.ops.MleapOp.load(MleapOp.scala:16)
[info] at ml.combust.bundle.serializer.NodeSerializer$$anonfun$read$2$$anonfun$apply$3$$anonfun$apply$4.apply(NodeSerializer.scala:106)
[info] at scala.util.Try$.apply(Try.scala:192)
[info] at ml.combust.bundle.serializer.NodeSerializer$$anonfun$read$2$$anonfun$apply$3.apply(NodeSerializer.scala:104)
[info] at ml.combust.bundle.serializer.NodeSerializer$$anonfun$read$2$$anonfun$apply$3.apply(NodeSerializer.scala:102)
[info] ...
[info] Cause: java.util.NoSuchElementException: None.get
[info] at scala.None$.get(Option.scala:347)
[info] at scala.None$.get(Option.scala:345)
[info] at ml.combust.mleap.core.types.NodeShape.input(NodeShape.scala:80)
[info] at ml.combust.mleap.runtime.frame.Transformer$$anonfun$1.apply(Transformer.scala:59)
[info] at ml.combust.mleap.runtime.frame.Transformer$$anonfun$1.apply(Transformer.scala:58)
[info] at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
[info] at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
[info] at scala.collection.immutable.List.foreach(List.scala:381)
[info] at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
[info] at scala.collection.immutable.List.map(List.scala:285)
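For anyone hitting the same trace: the None.get is thrown by NodeShape.input, meaning the runtime transformer asked the deserialized shape for an input socket by port name and the shape does not contain that port. This usually indicates the port names written by the Spark-side serialization op do not match the ones the runtime transformer expects. A minimal sketch of the runtime side, with illustrative names:

import ml.combust.mleap.core.types.NodeShape
import ml.combust.mleap.runtime.frame.{SimpleTransformer, Transformer}
import ml.combust.mleap.runtime.function.UserDefinedFunction

case class BasicSqlTMleapTransformer(
    override val uid: String = Transformer.uniqueName("basic_sql_t"),
    override val shape: NodeShape,
    override val model: BasicSqlTMleapModel) extends SimpleTransformer {
  // The input/output sockets are resolved from `shape` by port name when
  // the transformer is constructed; a port missing from the serialized
  // shape surfaces as the None.get in the trace above.
  // Assumes BasicSqlTMleapModel exposes apply(period: String).
  override val exec: UserDefinedFunction =
    (period: String) => model(period)
}

Comparing the ports recorded in the bundle's per-node node.json against the ports the transformer looks up is a quick way to spot the mismatch.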
@femibyte Did the custom transformer I sent through seem fine? Is it ok to close this issue?
@femibyte Closing this, please re-open if you have further questions. Thank you!
Hi, I am trying to run my Spark-trained, MLeap-serialized model using spring-boot, as documented here:
I'm getting the following error when I make the curl call to do the transform and score (Step 5).
The failing stage is a custom SQL transformer that I've written so that I can serialize the Spark-trained ML pipeline: the initial stage of the pipeline uses Spark's SQLTransformer, which MLeap does not support, hence the need for the custom transformer.
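For context, the custom transformer's core model is a plain Scala class along these lines (an illustrative sketch, not my actual code; mleap-runtime has no SparkSession, so the SQL expression has to be re-expressed as ordinary Scala):

// Illustrative stand-in for the real model: the SQLTransformer's SELECT
// expression is rewritten as plain Scala so it can run in mleap-runtime.
case class BasicSqlTMleapModel(periodCol: String) {
  def apply(period: String): String = {
    // placeholder for the real SQL logic (e.g. computing closest_period)
    period.trim
  }
}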
I use the following steps to build and start the Docker container:
I then send the following curl requests.
i. Construct the payload and upload the model:
ii. Read in the input data in JSON format and call transform without having to encode the leap frame:
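Concretely, the two requests look roughly like this (an illustrative sketch assuming the stock mleap-spring-boot JSON endpoints; model name, paths, and port are placeholders):

# i. load the model (the bundle path must be readable inside the container)
curl --request POST http://localhost:8080/models \
  --header "Content-Type: application/json" \
  --data '{"modelName": "my_model", "uri": "file:/models/mymodel_pipeline.zip"}'

# ii. POST the leap frame JSON directly to the model's transform endpoint
curl --request POST http://localhost:8080/models/my_model/transform \
  --header "Content-Type: application/json" \
  --data @input_frame.json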
After running step ii, I receive the error above. Can someone suggest a way of debugging this?
I am able to successfully run the custom transformer outside of spring-boot. Any suggestions are welcome.
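For reference, the standalone check that does work looks roughly like this (path, schema, and column name are illustrative placeholders, not my exact code):

// Roughly the standalone check that succeeds outside spring-boot.
import ml.combust.bundle.BundleFile
import ml.combust.mleap.core.types._
import ml.combust.mleap.runtime.MleapSupport._
import ml.combust.mleap.runtime.frame.{DefaultLeapFrame, Row}
import resource._

val bundle = (for (bf <- managed(BundleFile("jar:file:/tmp/mymodel_pipeline.zip"))) yield {
  bf.loadMleapBundle().get
}).tried.get

val schema = StructType(StructField("period", ScalarType.String)).get
val frame = DefaultLeapFrame(schema, Seq(Row("2019-01")))

// transform with the deserialized pipeline and print the scored rows
println(bundle.root.transform(frame).get.dataset)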