Closed — @mcanini closed this issue 3 years ago.
Hi there, I am encountering the same issue with version 0.12.6. Here is the stack trace:
```
[I 11:42:21.946 NotebookApp] KernelRestarter: restarting kernel (4/5), new random ports
Traceback (most recent call last):
  File "/Users/yhuang/anaconda3/envs/jupyter_sparkmagic_py3.6/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/Users/yhuang/anaconda3/envs/jupyter_sparkmagic_py3.6/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/Users/yhuang/anaconda3/envs/jupyter_sparkmagic_py3.6/lib/python3.6/site-packages/sparkmagic/kernels/sparkkernel/sparkkernel.py", line 28, in <module>
    IPKernelApp.launch_instance(kernel_class=SparkKernel)
  File "/Users/yhuang/anaconda3/envs/jupyter_sparkmagic_py3.6/lib/python3.6/site-packages/traitlets/config/application.py", line 657, in launch_instance
    app.initialize(argv)
  File "<decorator-gen-139>", line 2, in initialize
  File "/Users/yhuang/anaconda3/envs/jupyter_sparkmagic_py3.6/lib/python3.6/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/Users/yhuang/anaconda3/envs/jupyter_sparkmagic_py3.6/lib/python3.6/site-packages/ipykernel/kernelapp.py", line 484, in initialize
    self.init_kernel()
  File "/Users/yhuang/anaconda3/envs/jupyter_sparkmagic_py3.6/lib/python3.6/site-packages/ipykernel/kernelapp.py", line 389, in init_kernel
    user_ns=self.user_ns,
  File "/Users/yhuang/anaconda3/envs/jupyter_sparkmagic_py3.6/lib/python3.6/site-packages/traitlets/config/configurable.py", line 412, in instance
    inst = cls(*args, **kwargs)
  File "/Users/yhuang/anaconda3/envs/jupyter_sparkmagic_py3.6/lib/python3.6/site-packages/sparkmagic/kernels/sparkkernel/sparkkernel.py", line 23, in __init__
    language_info, session_language, **kwargs)
  File "/Users/yhuang/anaconda3/envs/jupyter_sparkmagic_py3.6/lib/python3.6/site-packages/sparkmagic/kernels/wrapperkernel/sparkkernelbase.py", line 42, in __init__
    self._load_magics_extension()
  File "/Users/yhuang/anaconda3/envs/jupyter_sparkmagic_py3.6/lib/python3.6/site-packages/sparkmagic/kernels/wrapperkernel/sparkkernelbase.py", line 71, in _load_magics_extension
    log_if_error="Failed to load the Spark kernels magics library.")
  File "/Users/yhuang/anaconda3/envs/jupyter_sparkmagic_py3.6/lib/python3.6/site-packages/sparkmagic/kernels/wrapperkernel/sparkkernelbase.py", line 102, in _execute_cell
    if shutdown_if_error and reply_content[u"status"] == u"error":
TypeError: '_asyncio.Future' object is not subscriptable
```
Facing the same issue:
```
Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/opt/conda/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/opt/conda/lib/python3.6/site-packages/sparkmagic/kernels/pyspark3kernel/pyspark3kernel.py", line 28, in <module>
    IPKernelApp.launch_instance(kernel_class=PySpark3Kernel)
  File "/opt/conda/lib/python3.6/site-packages/traitlets/config/application.py", line 657, in launch_instance
    app.initialize(argv)
  File "<decorator-gen-139>", line 2, in initialize
  File "/opt/conda/lib/python3.6/site-packages/traitlets/config/application.py", line 87, in catch_config_error
    return method(app, *args, **kwargs)
  File "/opt/conda/lib/python3.6/site-packages/ipykernel/kernelapp.py", line 484, in initialize
    self.init_kernel()
  File "/opt/conda/lib/python3.6/site-packages/ipykernel/kernelapp.py", line 389, in init_kernel
    user_ns=self.user_ns,
  File "/opt/conda/lib/python3.6/site-packages/traitlets/config/configurable.py", line 412, in instance
    inst = cls(*args, **kwargs)
  File "/opt/conda/lib/python3.6/site-packages/sparkmagic/kernels/pyspark3kernel/pyspark3kernel.py", line 23, in __init__
    language_info, session_language, **kwargs)
  File "/opt/conda/lib/python3.6/site-packages/sparkmagic/kernels/wrapperkernel/sparkkernelbase.py", line 42, in __init__
    self._load_magics_extension()
  File "/opt/conda/lib/python3.6/site-packages/sparkmagic/kernels/wrapperkernel/sparkkernelbase.py", line 71, in _load_magics_extension
    log_if_error="Failed to load the Spark kernels magics library.")
  File "/opt/conda/lib/python3.6/site-packages/sparkmagic/kernels/wrapperkernel/sparkkernelbase.py", line 102, in _execute_cell
    if shutdown_if_error and reply_content[u"status"] == u"error":
TypeError: '_asyncio.Future' object is not subscriptable
[W 2019-01-02 11:15:00.501 SingleUserNotebookApp restarter:100] KernelRestarter: restart failed
[W 2019-01-02 11:15:00.502 SingleUserNotebookApp kernelmanager:127] Kernel 3479cb8a-f50b-4340-bcdc-fa647f440dbd died, removing from map.
```
I have the same issue on Linux too. Everything works fine with Livy when I test the commands directly with curl.
I'm facing the same issue on Mac, with jupyter and sparkmagic installed via conda, sparkmagic version 0.12.6.
I'm unable to execute even a simple println().
I think the issue is an incompatibility with the latest ipykernel installed via conda. Any version of ipykernel above 4.10 will have this issue with the latest sparkmagic. See the ipykernel changelog for reference: https://github.com/ipython/ipykernel/blob/master/docs/changelog.rst
As a workaround until the issue is fixed, downgrading ipykernel works for me:

```
pip install ipykernel==4.9.0
```
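To illustrate the diagnosis above: in newer ipykernel the execution path became asynchronous, so code that expects a plain reply dict instead receives an awaitable (a coroutine or Future), and subscripting it raises the `TypeError` in the tracebacks. The sketch below is illustrative only (the function names are hypothetical, not sparkmagic's actual code); it shows how a caller could cope with both the old synchronous API and the new awaitable-returning one:

```python
import asyncio
import inspect

def do_execute_sync(code, silent):
    # ipykernel 4.x style: the reply dict is returned directly
    return {"status": "ok"}

async def do_execute_async(code, silent):
    # ipykernel 5+/6 style: calling this yields a coroutine object,
    # not the dict itself
    return {"status": "ok"}

def resolve_reply(reply):
    # Accept either a plain dict or an awaitable that produces one
    if inspect.isawaitable(reply):
        loop = asyncio.new_event_loop()
        try:
            return loop.run_until_complete(reply)
        finally:
            loop.close()
    return reply

print(resolve_reply(do_execute_sync("1+1", False))["status"])   # ok
print(resolve_reply(do_execute_async("1+1", False))["status"])  # ok
```

This is essentially what the compatibility PRs mentioned later in this thread have to do: detect the awaitable and drive it to completion before reading `reply_content["status"]`.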
Thanks @cpranav, you are right: using ipykernel 4.9.0 fixes the issue.
I wonder if this is going to be addressed properly in sparkmagic?
@Ohtar10 PR #508 addresses this
Would appreciate some testing of https://github.com/jupyter-incubator/sparkmagic/pull/542, which is based on @jaipreet-s's patch but adds support for Python 2.
I'm seeing this issue on Python 3.6 with sparkmagic version 0.16 and ipykernel version 5.1.4. Any clue why? It seems like it should be solved by now.
Downgrading my ipykernel does indeed fix it.
> I think the issue is due to incompatibility with latest ipykernel which is installed via Conda. Any version of ipykernel above 4.10 should have issue with latest sparkmagic installed. Change log for ipykernel for reference: https://github.com/ipython/ipykernel/blob/master/docs/changelog.rst
> As a workaround till the issue is fixed, I tried downgrading ipykernel which works.
> `pip install ipykernel==4.9.0`
I lost sleep on this; this helped a lot :) Now using the config below:

```
papermill==2.3.3
jupyter==1.0.0
sparkmagic==0.19.0
# need to stick to this pin due to sparkmagic incompatibility:
# https://github.com/jupyter-incubator/sparkmagic/issues/492
ipykernel==5.1.4
```
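For quick reference, the sparkmagic/ipykernel combinations reported to work in this thread can be collected in a small lookup. Note that these pairs come from the comments above, not from any official compatibility matrix, so treat them as anecdotal:

```python
# sparkmagic version -> ipykernel pin reported to work in this thread
WORKING_PINS = {
    "0.12.6": "4.9.0",  # early reports in this thread
    "0.16": "4.9.0",    # "downgrading my ipykernel does indeed fix it"
    "0.19.0": "5.1.4",  # papermill config above; 5.5.4 also reported on py3.9
}

def suggested_ipykernel(sparkmagic_version):
    """Return the ipykernel pin reported to work, or None if unknown."""
    return WORKING_PINS.get(sparkmagic_version)

print(suggested_ipykernel("0.19.0"))  # 5.1.4
```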
On Python 3.9.6, downgrading ipykernel from 6.0.3 to 5.5.4 solved this error for me:
```
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/local/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/usr/local/lib/python3.9/site-packages/sparkmagic/kernels/pysparkkernel/pysparkkernel.py", line 34, in <module>
    IPKernelApp.launch_instance(kernel_class=PySparkKernel)
  File "/usr/local/lib/python3.9/site-packages/traitlets/config/application.py", line 844, in launch_instance
    app.initialize(argv)
  File "/usr/local/lib/python3.9/site-packages/traitlets/config/application.py", line 87, in inner
    return method(app, *args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/ipykernel/kernelapp.py", line 637, in initialize
    self.init_kernel()
  File "/usr/local/lib/python3.9/site-packages/ipykernel/kernelapp.py", line 489, in init_kernel
    kernel = kernel_factory(parent=self, session=self.session,
  File "/usr/local/lib/python3.9/site-packages/traitlets/config/configurable.py", line 537, in instance
    inst = cls(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/sparkmagic/kernels/pysparkkernel/pysparkkernel.py", line 26, in __init__
    super(PySparkKernel,
  File "/usr/local/lib/python3.9/site-packages/sparkmagic/kernels/wrapperkernel/sparkkernelbase.py", line 47, in __init__
    self._load_magics_extension()
  File "/usr/local/lib/python3.9/site-packages/sparkmagic/kernels/wrapperkernel/sparkkernelbase.py", line 75, in _load_magics_extension
    self._execute_cell(register_magics_code, True, False, shutdown_if_error=True,
  File "/usr/local/lib/python3.9/site-packages/sparkmagic/kernels/wrapperkernel/sparkkernelbase.py", line 107, in _execute_cell
    if shutdown_if_error and reply_content[u"status"] == u"error":
TypeError: 'coroutine' object is not subscriptable
```
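The error message above can be reproduced in isolation, which makes the failure mode easier to see: subscripting a coroutine object (what `do_execute` effectively returns on newer ipykernel) raises exactly this `TypeError`. The function name below is made up for illustration:

```python
# Minimal reproduction of the TypeError above: on ipykernel 6.x the kernel's
# execute path hands back a coroutine (a Future on 5.x), and subscripting it
# the way sparkmagic's _execute_cell did blows up.
async def fake_do_execute():
    return {"status": "ok"}

reply = fake_do_execute()  # a coroutine object, not a dict
try:
    reply["status"]
except TypeError as exc:
    print(exc)  # 'coroutine' object is not subscriptable
reply.close()  # silence the "coroutine was never awaited" warning
```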
Here's my pip freeze output:
anyio==3.3.0
argon2-cffi==20.1.0
async-generator==1.10
attrs==21.2.0
autovizwidget==0.19.0
Babel==2.9.1
backcall==0.2.0
backports.entry-points-selectable==1.1.0
bleach==3.3.1
certifi==2021.5.30
cffi==1.14.6
charset-normalizer==2.0.3
colorama==0.4.4
cryptography==3.4.7
debugpy==1.4.1
decorator==5.0.9
defusedxml==0.7.1
distlib==0.3.2
entrypoints==0.3
filelock==3.0.12
gitdb==4.0.7
GitPython==3.1.19
hdijupyterutils==0.19.0
idna==3.2
ipykernel==6.0.3
ipython==7.25.0
ipython-genutils==0.2.0
ipywidgets==7.6.3
jedi==0.18.0
Jinja2==3.0.1
json5==0.9.6
jsonschema==3.2.0
jupyter==1.0.0
jupyter-client==6.2.0
jupyter-console==6.4.0
jupyter-core==4.7.1
jupyter-server==1.10.1
jupyter-server-mathjax==0.2.3
jupyterlab==3.1.0
jupyterlab-git==0.31.0
jupyterlab-pygments==0.1.2
jupyterlab-server==2.6.1
jupyterlab-widgets==1.0.0
MarkupSafe==2.0.1
matplotlib-inline==0.1.2
mistune==0.8.4
mock==4.0.3
nbclassic==0.3.1
nbclient==0.5.3
nbconvert==6.1.0
nbdime==3.1.0
nbformat==5.1.3
nest-asyncio==1.5.1
nose==1.3.7
notebook==6.4.0
numpy==1.21.1
packaging==21.0
pandas==1.3.1
pandocfilters==1.4.3
parso==0.8.2
pexpect==4.8.0
pickleshare==0.7.5
platformdirs==2.0.2
plotly==5.1.0
prometheus-client==0.11.0
prompt-toolkit==3.0.19
ptyprocess==0.7.0
pycparser==2.20
Pygments==2.9.0
pykerberos==1.2.1
pyparsing==2.4.7
pyrsistent==0.18.0
python-dateutil==2.8.2
pytz==2021.1
pyzmq==22.1.0
qtconsole==5.1.1
QtPy==1.9.0
requests==2.26.0
requests-kerberos==0.12.0
requests-unixsocket==0.2.0
Send2Trash==1.7.1
six==1.16.0
smmap==4.0.0
sniffio==1.2.0
sparkmagic==0.19.0
tenacity==8.0.1
terminado==0.10.1
testpath==0.5.0
tornado==6.1
traitlets==5.0.5
typing-extensions==3.10.0.0
urllib3==1.26.6
virtualenv==20.6.0
wcwidth==0.2.5
webencodings==0.5.1
websocket-client==1.1.0
widgetsnbextension==3.5.1
I ran into this issue when trying to install sparkmagic in the base-notebook image from https://github.com/jupyter/docker-stacks and had to use a very old version of the image to make it work.
Is anyone looking at addressing the root cause of the error? Pinning ipykernel is not a long-term solution.
Hi @sergiimk thanks for letting us know and sorry to hear you are running into this issue. The long-term solution is this PR; however, it's moving slowly because the changes are significant. Please feel free to comment on the PR to get the reviewer's attention. Hopefully, we can get it merged soon!
I am having the following issue with version 0.12.6 of sparkmagic and pyspark kernels.