Open christophelebrun opened 4 years ago
Use spark.jars.packages instead of spark.jars. Also, I had no success using a local package (in your case, one you compiled and put in an S3 bucket) because its parent dependencies are missing. You should pull it from the Databricks spark-packages site. I know this has limitations, but so far I have not been able to find another solution.
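On EMR Notebooks, one way to apply this is with the %%configure magic in the first cell, before the Spark session starts. A minimal sketch, assuming the databricks:spark-deep-learning coordinates on the spark-packages repository (the version shown is illustrative, pick the one matching your Spark/Scala versions) and that the cluster can actually reach that repository, which it cannot without web access, hence the limitation mentioned above:

```
%%configure -f
{
  "conf": {
    "spark.jars.packages": "databricks:spark-deep-learning:1.5.0-spark2.4-s_2.11",
    "spark.jars.repositories": "https://repos.spark-packages.org"
  }
}
```

With this in place, the dependency (and its transitive parents) is resolved at session startup, and the Python side of the package becomes importable in the notebook.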
Hello,
I am running a Jupyter notebook on an EMR instance, without access to the web. I have downloaded the .jar file of sparkdl to an S3 bucket.
I tried:
This cell ran without error.
But I got an error with:
from sparkdl import DeepImageFeaturizer
ModuleNotFoundError: No module named 'sparkdl'
Any idea how to fix that?