jupyter-incubator / sparkmagic

Jupyter magics and kernels for working with remote Spark clusters

Change Magic Contract #8

Closed. alope107 closed this issue 9 years ago.

alope107 commented 9 years ago

Change the way the magic is used so that it is run once to specify the Livy connection, and all subsequent cells are run against the remote cluster. This clears up the confusion between the local and remote namespaces without being as heavyweight a solution as a new kernel.
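To illustrate the proposed contract, a notebook session might look roughly like this (the `%spark add` line magic, its flags, and the URL below are hypothetical sketches of the idea, not a final syntax):

```text
# Cell 1: run once to register the Livy endpoint (hypothetical syntax)
%spark add --url http://livy-server:8998 --language python

# Cell 2 onward: plain code, executed against the remote cluster
df = spark.read.json("/data/events")
df.count()
```

The key point is that only the first cell mentions the magic; everything after it is sent to the remote namespace without per-cell boilerplate.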

aggFTW commented 9 years ago

@alope107 I'd like to capture some of the widget brainstorming we had the other day.

We could create a drop-down menu that lists the endpoints and their languages, and you would add/remove endpoints from that menu. That way, we prevent users from having their passwords in clear text. Furthermore, we could add a button that inserts text into a cell for a given endpoint and language (default vs. SQL), so people don't have to remember the syntax to type into the cell.

What do you think?

drorata commented 5 years ago

I know this is an old issue, but is there still a way to avoid the `%%spark` at the beginning of each cell?

itamarst commented 5 years ago

There's a PySpark kernel these days; can you use that?

drorata commented 5 years ago

I haven't tried it, and it doesn't seem to be mentioned in the docs. How do the PySpark kernel and sparkmagic play together?

itamarst commented 5 years ago

See the PySpark notebook in the examples.

drorata commented 5 years ago

I don't have the PySpark kernel available. Perhaps it's because I installed sparkmagic in a conda environment and the kernel was not picked up by Jupyter?
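One thing worth checking (assuming a standard pip install of sparkmagic into the active conda environment) is whether the wrapper kernelspecs were ever registered with Jupyter; the sparkmagic README describes installing them manually from the package location, roughly:

```shell
# Find where sparkmagic is installed (look at the "Location:" field)
pip show sparkmagic

# From that location, register the wrapper kernels with Jupyter
jupyter-kernelspec install sparkmagic/kernels/pysparkkernel
jupyter-kernelspec install sparkmagic/kernels/sparkkernel

# Verify the kernels are now listed
jupyter kernelspec list
```

Note that the kernels are registered per Jupyter installation, so this needs to be run with the same environment active that you launch Jupyter from.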

Moreover, I don't understand where, in the example, I am supposed to provide the Livy URL for PySpark...
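For what it's worth, with the wrapper kernels the Livy URL typically comes from sparkmagic's configuration file rather than from the notebook itself. A minimal sketch of `~/.sparkmagic/config.json` (the field names follow the repo's `example_config.json`; the URL below is a placeholder for your own Livy endpoint):

```json
{
  "kernel_python_credentials": {
    "username": "",
    "password": "",
    "url": "http://livy-server:8998",
    "auth": "None"
  }
}
```

With that in place, starting a notebook with the PySpark kernel should connect to the configured endpoint without any magic in the cells.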