Closed richwhitjr closed 7 years ago
Calling the implicit classes from PySpark is doable but difficult. Instead, this pulls the main logic out into two new classes and has the implicit classes extend them.
Tested in a local Spark standalone with both pyspark and spark-shell.
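A minimal sketch of the pattern described above, with hypothetical class names (the actual classes in this PR are not shown here): the logic lives in a plain class that JVM gateway callers like PySpark can instantiate directly, while the implicit class simply extends it to preserve the enrichment syntax in Scala.

```scala
// Plain class holding the logic. Non-Scala callers (e.g. PySpark
// through the JVM gateway) can construct this directly.
// "RichSeqMethods" and "doubled" are illustrative names only.
class RichSeqMethods(val xs: Seq[Int]) {
  def doubled: Seq[Int] = xs.map(_ * 2)
}

object Implicits {
  // The implicit class adds no logic of its own; it only wires the
  // enrichment syntax (someSeq.doubled) to the plain class above.
  implicit class RichSeq(xs: Seq[Int]) extends RichSeqMethods(xs)
}
```

From spark-shell or any Scala code, `import Implicits._` gives the enriched syntax; from PySpark, one would instantiate the plain class via the gateway instead.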
Makes sense, a little cleaner than an implicit class that just extends the base class.