databricks / spark-deep-learning

Deep Learning Pipelines for Apache Spark
https://databricks.github.io/spark-deep-learning
Apache License 2.0

Deep Image Featurizer for both MLeap serializeToBundle and model.save #203

Open Karl-Keller opened 5 years ago

Karl-Keller commented 5 years ago

I've tried saving an InceptionV3-based model that uses DeepImageFeaturizer with both MLeap's serializeToBundle and model.save, and received the following errors:

p_model.serializeToBundle("jar:file:/tmp/Images/ParkingSpaces/Models/psinception.zip", tested_df)
java.util.NoSuchElementException: key not found: com.databricks.sparkdl.DeepImageFeaturizer

and

p_model.save('/dbfs://FileStore/psinception.pb')  # saves to the distributed (persistent) filesystem at the URL location accessible at /files...
ValueError: ('Pipeline write will fail on this pipeline because stage %s of type %s is not MLWritable', 'DeepImageFeaturizer_b717d8e93b47', <class 'sparkdl.transformers.named_image.DeepImageFeaturizer'>)
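For context, here is roughly how the pipeline is built and saved (a simplified sketch; the classifier stage, the train_df / tested_df DataFrames with an "image" column, and the exact paths are placeholders rather than my exact code):

```python
from pyspark.ml import Pipeline
from pyspark.ml.classification import LogisticRegression
from sparkdl import DeepImageFeaturizer

# DeepImageFeaturizer wraps the pretrained InceptionV3 network
featurizer = DeepImageFeaturizer(inputCol="image", outputCol="features",
                                 modelName="InceptionV3")
lr = LogisticRegression(labelCol="label", featuresCol="features")

# train_df has an "image" column loaded via spark.read.format("image")
p_model = Pipeline(stages=[featurizer, lr]).fit(train_df)

# Both persistence paths then fail with the errors shown above
# (MLeap's mleap.pyspark support is imported separately to provide serializeToBundle):
# p_model.serializeToBundle("jar:file:/tmp/Images/ParkingSpaces/Models/psinception.zip", tested_df)
# p_model.save('/dbfs://FileStore/psinception.pb')
```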

Since it fails with both approaches, and the issue resolved here seemed aimed directly at solving persistence for DeepImageFeaturizer, I thought this might be the best place to post. I've also posted on the forum but received no feedback.
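If it helps with triage, a quick check over the fitted pipeline (just a sketch, using the p_model from the snippet above) confirms that the featurizer is the stage flagged as not MLWritable:

```python
from pyspark.ml.util import MLWritable

# Print each stage's uid and whether it implements MLWritable;
# the DeepImageFeaturizer stage is the one that comes back False.
for stage in p_model.stages:
    print(stage.uid, isinstance(stage, MLWritable))
```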