cloudera / clusterdock

Apache License 2.0

Getting an error while trying to stop Spark Streaming #43

Open saiprakashreddy916 opened 4 years ago

saiprakashreddy916 commented 4 years ago

```
sai@sai-HP-15-Notebook-PC:~$ spark-shell
20/11/05 21:59:13 WARN Utils: Your hostname, sai-HP-15-Notebook-PC resolves to a loopback address: 127.0.1.1; using 192.168.43.70 instead (on interface wlo1)
20/11/05 21:59:13 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/11/05 21:59:15 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://192.168.43.70:4040
Spark context available as 'sc' (master = local[*], app id = local-1604593775632).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.0.0-preview2
      /_/

Using Scala version 2.12.10 (OpenJDK 64-Bit Server VM, Java 1.8.0_242)
Type in expressions to have them evaluated.
Type :help for more information.
```

```
scala> ssc.stop()
<console>:24: error: not found: value ssc
       ssc.stop()
```
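The error itself is expected: spark-shell only predefines `sc` (a `SparkContext`) and `spark` (a `SparkSession`); it never creates a `StreamingContext` named `ssc`, so there is nothing to stop. A sketch of what the session would need before `ssc.stop()` can work, assuming the Spark Streaming jars are on the shell's classpath (the import and constructor below are the standard Spark Streaming API, but the batch interval of 10 seconds is an arbitrary illustration):

```scala
import org.apache.spark.streaming.{Seconds, StreamingContext}

// `sc` is the SparkContext that spark-shell already provides.
// Create the StreamingContext that the shell does NOT predefine;
// 10 seconds is just an example batch interval.
val ssc = new StreamingContext(sc, Seconds(10))

// ... define DStreams and call ssc.start() here ...

// Now `ssc` exists and can be stopped. Passing stopSparkContext = false
// keeps `sc` alive so the rest of the shell session keeps working;
// stopGracefully = true lets in-flight batches finish first.
ssc.stop(stopSparkContext = false, stopGracefully = true)
```

In short: `ssc` is not a built-in shell value the way `sc` and `spark` are, so `not found: value ssc` simply means no `StreamingContext` was ever created in this session.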