While I was going through hello_world.ipynb, I noticed this error:

```
ValueError: Cannot run multiple SparkContexts at once
```

It is a pretty common error that occurs because the notebook automatically initializes the SparkContext. I had to use `sc.stop()` to stop the earlier context and create a new one. @birdsarah Should I maybe add a cell just after this code snippet?

```python
import findspark

findspark.init('/opt/spark')  # Adjust for the location where you installed Spark

from pyspark import SparkContext
from pyspark.sql import SparkSession

sc = SparkContext(appName="Overscripted")
spark = SparkSession(sc)
```

```python
# If you are already running a context, run this cell and rerun the cell above
sc.stop()
```
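For context, the error comes from Spark enforcing one active context per process. A minimal pure-Python sketch of that guard (using a hypothetical `SingletonContext` class for illustration, not pyspark's actual implementation) shows why calling `sc.stop()` first clears the way for a new context:

```python
class SingletonContext:
    """Mimics SparkContext's one-active-context rule (illustration only)."""
    _active = None

    def __init__(self):
        # A second live context triggers the same kind of ValueError
        # that pyspark raises.
        if SingletonContext._active is not None:
            raise ValueError("Cannot run multiple SparkContexts at once")
        SingletonContext._active = self

    def stop(self):
        # Clearing the registry is what allows a fresh context afterwards.
        SingletonContext._active = None


sc = SingletonContext()
try:
    SingletonContext()  # second context while one is live -> ValueError
except ValueError as e:
    print(e)
sc.stop()               # after stop(), creating a new context succeeds
sc = SingletonContext()
```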