Is it possible to pass the --driver-class-path option to PySpark? It is used for referring to jars (Scala code).
Hi,
It is not possible to specify this parameter directly at the moment, but you can provide any parameter supported by SparkConf as spark_options
in the config. See https://github.com/malexer/pytest-spark#overriding-default-parameters-of-the-spark_session-fixture
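For example, a minimal pytest.ini sketch following the format from the linked README (the option values below are only illustrations):

```ini
[pytest]
spark_options =
    spark.app.name: my-pytest-spark-tests
    spark.executor.instances: 1
```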
Unfortunately, this approach will not work for --driver-class-path (its config equivalent is spark.driver.extraClassPath): according to the docs, in client mode that setting must not be set through SparkConf, because the driver JVM has already started by that point: https://spark.apache.org/docs/latest/configuration.html#runtime-environment
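A common workaround (general PySpark, not specific to pytest-spark; the jar path below is a placeholder) is to set PYSPARK_SUBMIT_ARGS before the first SparkContext is created, e.g. at the top of conftest.py:

```python
# conftest.py -- a minimal sketch; the jar path is a placeholder.
# PYSPARK_SUBMIT_ARGS is read when PySpark launches the driver JVM,
# so it must be set before any SparkContext/SparkSession is created.
# PySpark requires the value to end with the "pyspark-shell" token.
import os

os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--driver-class-path /path/to/your.jar pyspark-shell"
)
```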
Also check the spark.jars.packages parameter; it may solve your issue.
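For instance, it can be set through the same spark_options mechanism (the Maven coordinates below are hypothetical, shown only to illustrate the format); Spark then resolves the package and adds it to both the driver and executor classpaths:

```ini
[pytest]
spark_options =
    spark.jars.packages: com.example:my-lib_2.12:1.0.0
```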
Thank you @malexer 😇🙏🏿