Open soottikkal opened 6 years ago
Is this with Spark? If so, it is because of the FIFO (first in, first out) scheduling of the standalone Spark cluster. The first notebook will get all of the Spark cluster resources, and the second notebook will be blocked until the first notebook releases them (i.e., until you stop the first notebook).
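One possible workaround (a sketch, not something we currently configure) would be to cap how many cores and how much memory each notebook's Spark application may claim, so a second notebook can still get executors. In a standalone cluster this can be set in `spark-defaults.conf`; the values below are illustrative, not recommendations:

```
# spark-defaults.conf — illustrative values, tune for the actual cluster
spark.cores.max        4
spark.executor.memory  2g
```

With `spark.cores.max` left unset, a standalone cluster grants an application every available core, which is exactly why the second notebook hangs.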
What is the use case for the issue you are experiencing? Maybe a solution would be to let the user choose between starting a PySpark notebook and a regular Python notebook (the latter won't block, since it doesn't request resources from the Spark cluster).
Yes, this was with the Jupyter+Spark app. During the OSC workshop, some students clicked on both the read-only tutorial file and its copy, which resulted in two Jupyter sessions. I had to shut down the read-only session to proceed with the tutorials.
Ah, unfortunately I am unaware of a solution for that. Some sort of warning saying that all the Spark cluster resources are consumed by another notebook would be nice, but I am not sure how to do that off the top of my head.
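One way such a warning could be built (a sketch, assuming a standalone Spark master whose web UI serves status JSON, commonly at port 8080 under `/json/`; the hostname, port, and function names here are assumptions, not part of the app):

```python
import json
from urllib.request import urlopen


def active_app_names(status):
    """Return the names of applications currently holding cluster resources.

    `status` is the dict served by the standalone master's JSON endpoint;
    an empty list means a new notebook should get resources immediately.
    """
    return [app.get("name", "?") for app in status.get("activeapps", [])]


def warn_if_busy(master_url="http://localhost:8080/json/"):
    # Hypothetical helper: fetch the master status before launching a
    # notebook and return the apps that would block it, if any.
    with urlopen(master_url) as resp:
        status = json.load(resp)
    return active_app_names(status)
```

A launcher could call something like `warn_if_busy()` before starting a session and show "Spark cluster resources are held by: …" when the list is non-empty.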
For now I will keep this issue open so that @ericfranz or one of the students can prioritize this problem and attempt a fix.
Jupyter hangs if more than one ipynb file is open.