Open amrishan opened 3 years ago
If they are not class methods, the setup method would be invoked for every test, and a new SparkSession would be created for each of those tests.
```python
import logging
import unittest

from pyspark.sql import SparkSession


class PySparkTest(unittest.TestCase):
    @classmethod
    def suppress_py4j_logging(cls):
        logger = logging.getLogger('py4j')
        logger.setLevel(logging.WARN)

    @classmethod
    def create_testing_pyspark_session(cls):
        return SparkSession \
            .builder \
            .master('local[*]') \
            .appName("my-local-testing-pyspark-context") \
            .getOrCreate()

    @classmethod
    def setUpClass(cls):
        cls.suppress_py4j_logging()
        cls.spark = cls.create_testing_pyspark_session()
        cls.test_data_path = "<PATH>"
        cls.df = cls.spark.read.options(header='true', inferSchema='true') \
            .csv(cls.test_data_path)
        cls.df_expected = transform_data(cls.df, cls.spark)

    @classmethod
    def tearDownClass(cls):
        cls.spark.stop()
```
Well spotted - thank you 👍