vericast / spylon-kernel

Jupyter kernel for Scala and Spark

No code runs #39

Open yugam1 opened 7 years ago

yugam1 commented 7 years ago

For any code that I run on this kernel, nothing happens: `Intitializing Scala interpreter ...` is the only output and the cell never completes.

mariusvniekerk commented 7 years ago

Do you have logs from the NotebookApp? It sounds like something is going wrong with how Apache Spark is being started up.

yugam1 commented 7 years ago

```
[I 10:38:38.759 NotebookApp] Accepting one-time-token-authenticated connection from ::1
[I 10:39:37.282 NotebookApp] Creating new notebook in
[I 10:39:41.453 NotebookApp] Kernel started: e634dfd9-9c9e-4024-94df-519fbfc874b5
[I 10:39:46.866 NotebookApp] Adapting to protocol v5.1 for kernel e634dfd9-9c9e-4024-94df-519fbfc874b5
[MetaKernelApp] ERROR | Exception in message handler:
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\lib\site-packages\ipykernel\kernelbase.py", line 235, in dispatch_shell
    handler(stream, idents, msg)
  File "C:\ProgramData\Anaconda3\lib\site-packages\ipykernel\kernelbase.py", line 399, in execute_request
    user_expressions, allow_stdin)
  File "C:\ProgramData\Anaconda3\lib\site-packages\metakernel\_metakernel.py", line 357, in do_execute
    retval = self.do_execute_direct(code)
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_kernel.py", line 141, in do_execute_direct
    res = self._scalamagic.eval(code.strip(), raw=False)
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_magic.py", line 155, in eval
    intp = self._get_scala_interpreter()
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_magic.py", line 46, in _get_scala_interpreter
    self._interp = get_scala_interpreter()
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_interpreter.py", line 562, in get_scala_interpreter
    scala_intp = initialize_scala_interpreter()
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_interpreter.py", line 163, in initialize_scala_interpreter
    spark_session, spark_jvm_helpers, spark_jvm_proc = init_spark()
  File "C:\ProgramData\Anaconda3\lib\site-packages\spylon_kernel\scala_interpreter.py", line 78, in init_spark
    import pyspark.java_gateway
  File "C:\ProgramData\spark-2.1.0-bin-hadoop2.7\python\pyspark\__init__.py", line 44, in <module>
    from pyspark.context import SparkContext
  File "C:\ProgramData\spark-2.1.0-bin-hadoop2.7\python\pyspark\context.py", line 40, in <module>
    from pyspark.rdd import RDD, _load_from_socket, ignore_unicode_prefix
  File "C:\ProgramData\spark-2.1.0-bin-hadoop2.7\python\pyspark\rdd.py", line 47, in <module>
    from pyspark.statcounter import StatCounter
  File "C:\ProgramData\spark-2.1.0-bin-hadoop2.7\python\pyspark\statcounter.py", line 24, in <module>
    from numpy import maximum, minimum, sqrt
  File "C:\ProgramData\Anaconda3\lib\site-packages\numpy\__init__.py", line 142, in <module>
    from . import add_newdocs
  File "C:\ProgramData\Anaconda3\lib\site-packages\numpy\add_newdocs.py", line 13, in <module>
    from numpy.lib import add_newdoc
  File "C:\ProgramData\Anaconda3\lib\site-packages\numpy\lib\__init__.py", line 8, in <module>
    from .type_check import *
  File "C:\ProgramData\Anaconda3\lib\site-packages\numpy\lib\type_check.py", line 11, in <module>
    import numpy.core.numeric as _nx
  File "C:\ProgramData\Anaconda3\lib\site-packages\numpy\core\__init__.py", line 72, in <module>
    from numpy.testing.nosetester import _numpy_tester
  File "C:\ProgramData\Anaconda3\lib\site-packages\numpy\testing\__init__.py", line 10, in <module>
    from unittest import TestCase
  File "C:\ProgramData\Anaconda3\lib\unittest\__init__.py", line 58, in <module>
    from .result import TestResult
  File "C:\ProgramData\Anaconda3\lib\unittest\result.py", line 7, in <module>
    from . import util
  File "C:\ProgramData\Anaconda3\lib\unittest\util.py", line 119, in <module>
    _Mismatch = namedtuple('Mismatch', 'actual expected value')
  File "C:\ProgramData\spark-2.1.0-bin-hadoop2.7\python\pyspark\serializers.py", line 393, in namedtuple
    cls = _old_namedtuple(*args, **kwargs)
TypeError: namedtuple() missing 3 required keyword-only arguments: 'verbose', 'rename', and 'module'
[MetaKernelApp] ERROR | No such comm target registered: jupyter.widget.version
[MetaKernelApp] ERROR | No such comm target registered: jupyter.widget.version
```

mariusvniekerk commented 7 years ago

So this is probably due to an issue with PySpark 2.1.0 under Python 3.6. Try running it with Python 3.5.
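
For context: Python 3.6 made `namedtuple`'s `verbose`, `rename`, and `module` parameters keyword-only, and Spark 2.1.0's `pyspark/serializers.py` copies `namedtuple` in a way that drops those keyword-only defaults before monkey-patching it (tracked upstream as SPARK-19019 and fixed in later Spark releases). Below is a minimal sketch of the clash; it imitates that copy rather than importing PySpark itself, so it is an illustration, not spylon-kernel or Spark code:

```python
import collections
import types

def _copy_func(f):
    # Roughly how Spark 2.1.0's pyspark/serializers.py copies namedtuple
    # before monkey-patching it. Note that __kwdefaults__ is NOT carried
    # over to the new function object.
    return types.FunctionType(f.__code__, f.__globals__, f.__name__,
                              f.__defaults__, f.__closure__)

_old_namedtuple = _copy_func(collections.namedtuple)

# On Python 3.5 this works, because verbose/rename were ordinary positional
# parameters. On Python 3.6 they are keyword-only with defaults stored in
# __kwdefaults__, which the copy above lost, so this raises:
#   TypeError: namedtuple() missing 3 required keyword-only arguments:
#   'verbose', 'rename', and 'module'
Mismatch = _old_namedtuple('Mismatch', 'actual expected value')
```

So the practical options are an environment with Python 3.5, or a Spark release that includes the SPARK-19019 fix.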