jupyter-incubator / sparkmagic

Jupyter magics and kernels for working with remote Spark clusters

Pandas DF from %%local to %%spark #412

Open tomaszdudek7 opened 7 years ago

tomaszdudek7 commented 7 years ago

Enable sending a Pandas DF from a %%local context to the %%spark one; from the user's perspective, the functionality should work much like %%spark -o does in the other direction.

tomaszdudek7 commented 7 years ago

I am working on it. Progress can be seen at https://github.com/jupyter-incubator/sparkmagic/pull/413. Any comments/suggestions appreciated.

@aggFTW

sivankumar86 commented 5 years ago

As a workaround, I used the code below to pass code from local to spark:

```python
%%local
from IPython import get_ipython

ipython = get_ipython()

# Code to execute on the remote cluster, e.g.:
# 'num = sc.parallelize([1, 2, 3, 4])'
input_str = input()

ipython.run_cell_magic(magic_name='spark', line='', cell=input_str)
```
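Building on this workaround, a Pandas DataFrame could be shipped the same way by serializing it into the code string that the magic executes remotely. The sketch below assumes the remote session exposes a `spark` SparkSession (as Livy does for PySpark sessions); `df_to_spark_cell` and `remote_df` are hypothetical names, not part of sparkmagic:

```python
import pandas as pd


def df_to_spark_cell(df: pd.DataFrame, name: str) -> str:
    """Build a code string that recreates `df` on the remote Spark session.

    The DataFrame is embedded as a list of record dicts; `spark` refers to
    the SparkSession available inside the remote %%spark context.
    """
    records = df.to_dict(orient="records")
    return f"{name} = spark.createDataFrame({records!r})"


# Example: a small local DataFrame to ship to the cluster.
local_df = pd.DataFrame({"a": [1, 2], "b": [3, 4]})
cell = df_to_spark_cell(local_df, "remote_df")

# In a %%local cell, the generated code would then be sent with:
# get_ipython().run_cell_magic("spark", "", cell)
```

This only embeds the data as a literal in the generated cell, so it is practical for small frames; large DataFrames would need a shared store (e.g. files on HDFS or S3) rather than inline serialization.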