Hello, I'm getting the following error as soon as I use the browser to connect to JupyterLab; on the browser side I lose the PySpark kernel. I use it in a Kubernetes environment with the following pods:
Error: `json.decoder.JSONDecodeError: Expecting ':' delimiter: line 5 column 64`

Complete error:
```
[D 2023-04-23 19:14:37.265 ServerApp] 200 GET /jupyterlab/api/contents?content=1&1682277274920 (c9d4dc2ad1c74e129a7c34169647ddfa@127.0.0.1) 1.00ms
Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/opt/conda/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/opt/conda/lib/python3.10/site-packages/sparkmagic/kernels/pysparkkernel/pysparkkernel.py", line 37, in <module>
    IPKernelApp.launch_instance(kernel_class=PySparkKernel)
  File "/opt/conda/lib/python3.10/site-packages/traitlets/config/application.py", line 1040, in launch_instance
    app.initialize(argv)
  File "/opt/conda/lib/python3.10/site-packages/traitlets/config/application.py", line 113, in inner
    return method(app, *args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/ipykernel/kernelapp.py", line 692, in initialize
    self.init_kernel()
  File "/opt/conda/lib/python3.10/site-packages/ipykernel/kernelapp.py", line 540, in init_kernel
    kernel = kernel_factory(
  File "/opt/conda/lib/python3.10/site-packages/traitlets/config/configurable.py", line 551, in instance
    inst = cls(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/sparkmagic/kernels/pysparkkernel/pysparkkernel.py", line 23, in __init__
    super(PySparkKernel, self).__init__(
  File "/opt/conda/lib/python3.10/site-packages/sparkmagic/kernels/wrapperkernel/sparkkernelbase.py", line 99, in __init__
    self.logger = SparkLog("{}_jupyter_kernel".format(self.session_language))
  File "/opt/conda/lib/python3.10/site-packages/sparkmagic/utils/sparklogger.py", line 11, in __init__
    MAGICS_LOGGER_NAME, conf.logging_config(), class_name
  File "/opt/conda/lib/python3.10/site-packages/hdijupyterutils/configuration.py", line 18, in wrapped_f
    _initialize(overrides, path, fsrw_class)
  File "/opt/conda/lib/python3.10/site-packages/hdijupyterutils/configuration.py", line 52, in _initialize
    new_overrides = _load(path, fsrw_class)
  File "/opt/conda/lib/python3.10/site-packages/hdijupyterutils/configuration.py", line 70, in _load
    overrides = json.loads(line)
  File "/opt/conda/lib/python3.10/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/opt/conda/lib/python3.10/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/opt/conda/lib/python3.10/json/decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Expecting ':' delimiter: line 5 column 64 (char 139)
[I 2023-04-23 19:14:39.639 ServerApp] AsyncIOLoopKernelRestarter: restarting kernel (1/5), new random ports
[D 2023-04-23 19:14:39.650 ServerApp] Starting kernel: ['/opt/conda/bin/python3.10', '-m', 'sparkmagic.kernels.pysparkkernel.pysparkkernel', '-f', '/home/jovyan/.local/share/jupyter/runtime/kernel-499e6f69-66d7-407f-8da5-b8a39dc47f86.json']
[D 2023-04-23 19:14:39.650 ServerApp] Connecting to: tcp://127.0.0.1:60439
```
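The failing frame is `hdijupyterutils` running plain `json.loads` on my sparkmagic config. As a sanity check (this is just a local sketch, not part of the deployment), feeding `json.loads` a string shaped like the `sparkmagic` ConfigMap below — where the Lighter URL entry has no key in front of it — reproduces the same error; the exact line/column depends on how the file is laid out on disk:

```python
import json

# A fragment shaped like my sparkmagic config (see the ConfigMap below).
# Note the URL entry has no key in front of it.
config = """{
  "kernel_python_credentials" : {
    "username": "",
    "password": "",
    "http://lighter.test-jh.svc.cluster.local:8080/lighter/api",
    "auth": "None"
  }
}"""

try:
    json.loads(config)
except json.JSONDecodeError as e:
    # With this exact layout the error matches the traceback:
    # Expecting ':' delimiter: line 5 column 64 (char 139)
    print(e)
```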
The Jupyter YAML is below. I made a few changes just for debugging the error, like running as su (what is the root password, anyway?).
```yaml
apiVersion: v1
kind: ServiceAccount
metadata:
  name: jupyterlab
  namespace: test-jh
---
kind: Role
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: jupyterlab
  namespace: test-jh
rules:
  - apiGroups: [""]
    resources: ["pods", "services", "configmaps", "pods/log"]
    verbs: ["*"]
---
kind: RoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: jupyterlab
  namespace: test-jh
subjects:
  - kind: ServiceAccount
    name: jupyterlab
    namespace: test-jh
roleRef:
  kind: Role
  name: jupyterlab
  apiGroup: rbac.authorization.k8s.io
---
apiVersion: v1
kind: Service
metadata:
  name: jupyterlab
  namespace: test-jh
  labels:
    run: jupyterlab
spec:
  ports:
    - port: 8888
      protocol: TCP
  selector:
    run: jupyterlab
---
apiVersion: apps/v1
kind: Deployment
metadata:
  namespace: test-jh
  name: jupyterlab
spec:
  selector:
    matchLabels:
      run: jupyterlab
  replicas: 1
  strategy:
    rollingUpdate:
      maxUnavailable: 0
      maxSurge: 1
  template:
    metadata:
      labels:
        run: jupyterlab
    spec:
      affinity:
        nodeAffinity:
          requiredDuringSchedulingIgnoredDuringExecution:
            nodeSelectorTerms:
              - matchExpressions:
                  - key: service
                    operator: In
                    values:
                      - gke-shared
                  - key: cloud.google.com/gke-spot
                    operator: Exists # DoesNotExist
      containers:
        - image: jupyter/base-notebook:lab-3.5.2
          name: jupyterlab
          resources:
            requests:
              cpu: "0.25"
              memory: "2048Mi"
          ports:
          securityContext:
            allowPrivilegeEscalation: true
            readOnlyRootFilesystem: false
            runAsNonRoot: true
          volumeMounts:
      volumes:
        - name: quickstart
          configMap:
            name: quickstart
---
apiVersion: v1
kind: ConfigMap
metadata:
  name: sparkmagic
  namespace: test-jh
data:
  sparkmagic: |
    {
      "kernel_python_credentials" : {
        "username": "",
        "password": "",
        "http://lighter.test-jh.svc.cluster.local:8080/lighter/api",
        "auth": "None"
      },
      "livy_session_startup_timeout_seconds": 600,
      "custom_headers": {
        "X-Compatibility-Mode": "sparkmagic"
      }
    }
---
apiVersion: v1
kind: ConfigMap
metadata:
  name: quickstart
  namespace: test-jh
data:
  quickstart: |
    {
      "cells": [
        {
          "cell_type": "code",
          "execution_count": null,
          "id": "c72ef05c-13ee-4199-a002-b2be815ce419",
          "metadata": {},
          "outputs": [],
          "source": [
            "%%configure -f\n",
            "{\n",
            " \"name\": \"Test Lighter\",\n",
            " \"conf\":{\n",
            " \"spark.kubernetes.container.image\": \"apache/spark-py:v3.3.2\"\n",
            " }\n",
            "}"
          ]
        },
        {
          "cell_type": "code",
          "execution_count": null,
          "id": "94d68783-5c86-440e-84da-aa8be288cdd5",
          "metadata": {},
          "outputs": [],
          "source": [
            "df = spark.createDataFrame(\n",
            " [\n",
            " (1, \"foo\"),\n",
            " (2, \"bar\"),\n",
            " ],\n",
            " [\"id\", \"label\"]\n",
            ")\n",
            "df.collect()"
          ]
        }
      ],
      "metadata": {
        "kernelspec": {
          "display_name": "PySpark",
          "language": "python",
          "name": "pysparkkernel"
        },
        "language_info": {
          "codemirror_mode": {
            "name": "python",
            "version": 3
          },
          "file_extension": ".py",
          "mimetype": "text/x-python",
          "name": "pyspark",
          "pygments_lexer": "python3"
        }
      },
      "nbformat": 4,
      "nbformat_minor": 5
    }
```
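In case it helps anyone hitting the same thing: the JSON embedded in these ConfigMaps can be sanity-checked with the standard `json` module before applying the manifests. A minimal helper (`check_json` is just a name I made up, not part of any library):

```python
import json

def check_json(text: str) -> str:
    """Return "OK" if text parses as JSON, otherwise the decoder's complaint."""
    try:
        json.loads(text)
        return "OK"
    except json.JSONDecodeError as e:
        return f"{e.msg}: line {e.lineno} column {e.colno}"

# Example: a fragment with a value that has no key in front of it,
# like the URL entry in my sparkmagic config.
print(check_json('{"username": "", "http://example/api", "auth": "None"}'))
```

Running it over the `data` values of the two ConfigMaps (e.g. the output of `kubectl get configmap sparkmagic -o jsonpath='{.data.sparkmagic}'`) would flag a bad line before the kernel trips over it.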