tonycpsu opened this issue 7 years ago
No, not currently. But it can be made to work with some changes.
The data collection happens in Scala, using the SparkListener
interface. This data is forwarded to the IPython kernel extension over TCP sockets, and then forwarded on to the browser frontend using the Jupyter comm API provided by the IPython kernel.
So to make it work with Scala notebooks, a Scala kernel extension needs to be created that does the same thing as the IPython kernel extension.
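For reference, the kernel-side relay described above (JSON events in over TCP, Jupyter comm messages out) can be sketched roughly as below. This is a minimal illustration, not the extension's actual code: the message format, the port handling, and names like `to_comm_payload` and the `msgtype` field are assumptions, and the final `comm.send` step is only indicated in a comment since it needs a live Jupyter kernel.

```python
import json
import socket
import threading

def to_comm_payload(raw_line: bytes) -> dict:
    """Wrap one newline-delimited JSON event from the Scala
    SparkListener into a dict suitable for a Jupyter comm message.
    (Field names here are illustrative, not the real protocol.)"""
    event = json.loads(raw_line.decode("utf-8"))
    return {"msgtype": "fromscala", "msg": event}

def serve_once(listener: socket.socket, out: list) -> None:
    """Accept one connection from the Scala side, read one line,
    and convert it to a comm payload."""
    conn, _ = listener.accept()
    with conn:
        line = conn.makefile("rb").readline()
        out.append(to_comm_payload(line))
        # A real kernel extension would now forward it to the
        # frontend, e.g. comm.send(payload), via the comm API.

# Demo: simulate the Scala listener sending one event over TCP.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))  # 0 = let the OS pick a free port
listener.listen(1)
port = listener.getsockname()[1]

received: list = []
t = threading.Thread(target=serve_once, args=(listener, received))
t.start()

with socket.create_connection(("127.0.0.1", port)) as c:
    c.sendall(b'{"evt": "jobStart", "jobId": 1}\n')

t.join()
print(received[0])
```

A Scala kernel extension would replace only the comm-forwarding half of this; the TCP side of the protocol spoken by the Scala SparkListener stays the same.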
(This project is part of GSoC 2017, and I am in the process of making final changes for submission next week. So in the next couple of days I will be adding more detailed documentation of the code, how the extension works, and some more use cases.)
The original use case was monitoring the parallelization of certain Python libraries with Spark, so currently only PySpark can be used. I don't have time at the moment, but I think support for Scala kernels would make this extension complete!
OK, thanks much for the detailed explanation. I don't have the cycles to take on the Scala kernel extension bit myself right now, but will follow this project with interest, and perhaps take a stab at it in the future.
Any updates on the support for Scala?
Scala support would be a great feature to have.
I am in discussion with folks in the nteract community about improving this project: support for nteract, better integration with the Jupyter protocols, and other things. Once things take shape, I will work on Scala support.
At the moment I am under some time constraints; I will probably start working on it by mid-October.
@krishnan-r Good day, really cool extension. Are there any updates on extending it to a Scala kernel? And which kernel do you mean by "Scala"?
I currently work with the Apache Toree kernel and am actively looking for a way to see Spark UI features directly inside a notebook cell. It looks like this project may help me solve that problem.
This looks neat, but the test notebook only shows example usage with PySpark. Does this work at all with Scala notebooks?