I made a helper for creating a SparkContext. I usually work in an IPython shell instead of a PySpark shell, so I have to manually create the context. We'll need such a function once we're launching real Spark jobs anyway.
The idea is that we can just do something like this from any Python shell:
>>> import elizabeth
>>> ctx = elizabeth.context()
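
For reference, the helper body could be as simple as the sketch below. The parameter names and their defaults here are illustrative, not necessarily what elizabeth.context() actually accepts:

from pyspark import SparkConf, SparkContext

def context(app_name='elizabeth', master='local[*]'):
    """Create a SparkContext for use from a plain Python or IPython shell.

    Sketch only: the app_name/master parameters and defaults are assumptions.
    """
    # Build a SparkConf and hand it to SparkContext, the same setup
    # the pyspark shell normally performs for you on startup.
    conf = SparkConf().setAppName(app_name).setMaster(master)
    return SparkContext(conf=conf)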