databricks / spark-deep-learning

Deep Learning Pipelines for Apache Spark
https://databricks.github.io/spark-deep-learning
Apache License 2.0

AttributeError: module 'sparkdl' has no attribute 'graph' #209

Open CarlosVaquero opened 5 years ago

CarlosVaquero commented 5 years ago

sparkdl 0.2.2, the version available from pip, raises the following error when importing either DeepImageFeaturizer or KerasTransformer.

It seems there is an incompatibility around KerasImageFileTransformer: keras_image.py has been modified, but the graph attribute has not been updated to match.

```
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
in <module>
      2 from pyspark.ml.evaluation import MulticlassClassificationEvaluator
      3 from pyspark.ml import Pipeline
----> 4 from sparkdl import DeepImageFeaturizer
      5
      6 featurizer = DeepImageFeaturizer(inputCol="image", outputCol="features", modelName="InceptionV3")

/conda/envs/py36/lib/python3.6/site-packages/sparkdl/__init__.py in <module>
     15
     16 from sparkdl.image.imageIO import imageSchema, imageType, readImages
---> 17 from sparkdl.transformers.keras_image import KerasImageFileTransformer
     18 from sparkdl.transformers.named_image import DeepImagePredictor, DeepImageFeaturizer
     19 from sparkdl.transformers.tf_image import TFImageTransformer

/conda/envs/py36/lib/python3.6/site-packages/sparkdl/transformers/keras_image.py in <module>
     20 from pyspark.ml.param import Params, TypeConverters
     21
---> 22 import sparkdl.graph.utils as tfx
     23 from sparkdl.transformers.keras_utils import KSessionWrap
     24 from sparkdl.param import (

AttributeError: module 'sparkdl' has no attribute 'graph'
```
vivek-bombatkar commented 4 years ago

@CarlosVaquero Did you ever manage to solve this?

spark-water commented 4 years ago

I ran into errors like this a lot with the latest release. It's quite tedious to install one package at a time to keep eliminating the exceptions, so I took the easy way out and used an older release instead. For me, 1.3.0-spark2.4-s_2.11 works best.
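For reference, release names like that are spark-packages.org coordinates rather than PyPI versions, so they are typically attached when Spark is launched. A rough sketch (assuming Spark 2.4 with Scala 2.11 and network access to the spark-packages.org repository; `my_script.py` is only a placeholder name):

```
# Start an interactive PySpark shell with the older sparkdl release on the classpath
pyspark --packages databricks:spark-deep-learning:1.3.0-spark2.4-s_2.11

# Or submit a script with the same package attached
spark-submit --packages databricks:spark-deep-learning:1.3.0-spark2.4-s_2.11 my_script.py
```

The package bundles sparkdl's Python code, so `from sparkdl import DeepImageFeaturizer` should then resolve without a separate pip install.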

Sriharikrishna06 commented 4 years ago

@spark-water How did you install that specific version of sparkdl? I tried `pip install sparkdl==1.3.0-spark2.4-s_2.11` and it didn't work. Please help.