Hey all,
I'm trying to connect a JupyterLab notebook to submit Spark jobs to a Kerberized Hortonworks setup running Hive/Spark/Livy.
I had been trying to get sparkmagic to work with JupyterLab, but it appears to be incompatible: it chokes whenever I attempt to create an endpoint, even though the same steps work fine in Jupyter Notebook.
Does JupyterLab have a way to connect to a remote Livy server (or something similar) so we can run jobs on a remote cluster?
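For context, what I ultimately want from the notebook is just a session against Livy's REST API (a `POST` to `/sessions`). Here's a minimal sketch of the request I'd be making; the host, port, and `proxyUser` are placeholders for my setup, and a real call against the Kerberized cluster would also need SPNEGO auth (e.g. via the requests-kerberos package), which I've left out:

```python
import json

# Placeholder Livy endpoint; substitute your cluster's host and port.
LIVY_URL = "https://livy.example.com:8998"

def build_session_request(kind="pyspark", proxy_user=None):
    """Build the URL and JSON body for creating a Livy session
    (POST /sessions per the Livy REST API)."""
    payload = {"kind": kind}
    if proxy_user is not None:
        # Livy impersonates this user when running the job.
        payload["proxyUser"] = proxy_user
    return f"{LIVY_URL}/sessions", json.dumps(payload)

url, body = build_session_request(proxy_user="alice")
print(url)   # https://livy.example.com:8998/sessions
print(body)  # {"kind": "pyspark", "proxyUser": "alice"}
```

This is exactly what sparkmagic does under the hood in Jupyter Notebook, so I'm hoping there's an equivalent path from JupyterLab.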
Thank you!