Open yugam1 opened 7 years ago
Do you have logs from the NotebookApp? It sounds like something may be going wrong with how Apache Spark is started up.
[I 10:38:38.759 NotebookApp] Accepting one-time-token-authenticated connection from ::1
[I 10:39:37.282 NotebookApp] Creating new notebook in
[I 10:39:41.453 NotebookApp] Kernel started: e634dfd9-9c9e-4024-94df-519fbfc874b5
[I 10:39:46.866 NotebookApp] Adapting to protocol v5.1 for kernel e634dfd9-9c9e-4024-94df-519fbfc874b5
[MetaKernelApp] ERROR | Exception in message handler:
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\ipykernel\kernelbase.py", line 235, in dispatch_shell
    handler(stream, idents, msg)
  File "C:\ProgramData\Anaconda3\lib\site-packages\ipykernel\kernelbase.py", line 399, in execute_request
    user_expressions, allow_stdin)
  File "C:\ProgramData\Anaconda3\lib\site-packages\metakernel\_metakernel.py", line 357, in do_execute
    retval = self.do_execute_direct(code)
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_kernel.py", line 141, in do_execute_direct
    res = self._scalamagic.eval(code.strip(), raw=False)
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_magic.py", line 155, in eval
    intp = self._get_scala_interpreter()
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_magic.py", line 46, in _get_scala_interpreter
    self._interp = get_scala_interpreter()
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_interpreter.py", line 562, in get_scala_interpreter
    scala_intp = initialize_scala_interpreter()
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_interpreter.py", line 163, in initialize_scala_interpreter
    spark_session, spark_jvm_helpers, spark_jvm_proc = init_spark()
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_interpreter.py", line 78, in init_spark
    import pyspark.java_gateway
  File "C:\ProgramData\spark-2.1.0-bin-hadoop2.7\python\pyspark\__init__.py", line 44, in <module>
    from .result import TestResult
  File "C:\ProgramData\Anaconda3\lib\unittest\result.py", line 7, in <module>
So this is probably due to an issue with pyspark 2.1.0 under Python 3.6. Try running it with Python 3.5.
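For context: pyspark 2.1.0 patches `collections.namedtuple` at import time in a way that broke when Python 3.6 changed `namedtuple`'s signature (fixed on the Spark side in 2.1.1), so the failure happens before the kernel finishes starting. A minimal sketch of a fail-fast check, assuming you want a readable message instead of the import-time traceback (the helper name `pyspark_210_compatible` is mine, not part of pyspark or spylon-kernel):

```python
import sys

def pyspark_210_compatible(major, minor):
    """Hypothetical helper (not a real pyspark API): pyspark 2.1.0's
    namedtuple patching broke on Python 3.6, so treat Python 3.5 as
    the newest interpreter known to work."""
    return (major, minor) <= (3, 5)

# Warn (rather than crash) before pyspark's import-time failure does.
if not pyspark_210_compatible(sys.version_info.major, sys.version_info.minor):
    print("pyspark 2.1.0 does not support Python "
          f"{sys.version_info.major}.{sys.version_info.minor}; "
          "use a Python 3.5 environment or upgrade Spark")
```

One low-risk way to get a compatible interpreter without touching the base Anaconda install is a dedicated conda environment, e.g. `conda create -n py35 python=3.5`, then installing spylon-kernel inside it.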
For any code I run on this kernel, nothing happens: only "Intitializing Scala interpreter ..." is written as output and the code never completes.