Open dmvieira opened 7 years ago
Try with --Spark.url=http://0.0.0.0:4040
instead of localhost... It worked for me :)
It's not that simple: if you are running multiple notebooks, you have multiple Spark ports. I don't know how this plugin handles that.
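As a side note on the multiple-port problem: each additional SparkContext that can't bind the default UI port 4040 falls back to the next one (4041, 4042, ...), so with several notebooks the UI may be on any of those. A rough sketch for probing the usual candidates (the 0.0.0.0 host and the port range are assumptions; adjust for your setup):

```shell
# Spark's UI defaults to port 4040 and retries successive ports when it
# is taken, so list the first few candidates and probe each one.
for port in $(seq 4040 4044); do
  echo "candidate Spark UI: http://0.0.0.0:${port}"
  # Uncomment to actually probe Spark's REST endpoint on each port:
  # curl -sf "http://0.0.0.0:${port}/api/v1/applications" && break
done
```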
New to this project. Just writing to confirm that I can reproduce -- still seems to be a live issue. Investigating solutions...
I got this to work. It turns out that if you start the Docker image and then install the plugin, it's too late: server extensions are only loaded at Jupyter server startup. You can create your own Dockerfile, though:
```dockerfile
FROM jupyter/all-spark-notebook
RUN pip install jupyter_spark
RUN jupyter serverextension enable --py --user jupyter_spark
RUN jupyter nbextension install --py --user jupyter_spark
RUN jupyter nbextension enable --py --user jupyter_spark
RUN jupyter nbextension enable --py --user widgetsnbextension
```
Then build it:
```shell
docker build --rm -t jupyter/spark-extension .
```
Then run it using any of the methods documented here.
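For example, a typical invocation for images in the jupyter/docker-stacks family might look like the following (the port mappings are my assumption, not something this thread prescribes; 8888 is the notebook server and 4040 exposes the Spark UI):

```shell
# Run the image built above; -p publishes the container ports so the
# plugin (and your browser) can reach the notebook and the Spark UI.
docker run -it --rm -p 8888:8888 -p 4040:4040 jupyter/spark-extension
```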
So, I don't think there's anything fundamentally broken here with respect to Docker, but we should probably document this explicitly somewhere. @dmvieira : If you're still interested in this issue, would you mind confirming that the solution above works for you?
The issue about multiple notebooks with multiple spark ports is sort of a separate issue, and is relevant whether running in Docker or not. If you don't mind, I'm going to create a separate issue for that and keep this just to "can run in Docker".
Thx! I got it. I'll change the issue title.
Using jupyter docker image it doesn't work