Closed: vfrank66 closed this issue 5 years ago.
@vfrank66
In your first case you are specifying spark_home=spark/, which is not reliable; it's better to provide the full path. Note: we expect that directory to contain a full Spark installation, not the pyspark package installed via pip.
But the core issue is that you already have pyspark installed via pip/pipenv, so pyspark is importable. pytest-spark only uses spark_home to locate your Spark directory and make pyspark importable in your tests. So there is no need to define spark_home at all: not in pytest.ini, not as the --spark_home param to pytest, and not as the SPARK_HOME env variable. That should fix your issue.
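To make that lookup order concrete, here is a rough sketch of how a Spark home could be resolved. This is illustrative pseudologic with my own function name, not pytest-spark's actual implementation: an explicit setting wins, then the SPARK_HOME env variable, and a pip-installed pyspark needs no Spark home at all.

```python
import importlib.util
import os

def resolve_spark_home(configured=None):
    """Sketch of a lookup order (NOT pytest-spark's real code):
    explicit config (pytest.ini / --spark_home) wins, then the
    SPARK_HOME env variable, then a pip-installed pyspark."""
    if configured:
        # A relative value like "spark/" depends on the working
        # directory, which is why a full path is more reliable.
        return os.path.abspath(configured)
    if os.environ.get("SPARK_HOME"):
        return os.environ["SPARK_HOME"]
    if importlib.util.find_spec("pyspark") is not None:
        # pyspark came from pip/pipenv: already importable,
        # nothing to configure.
        return None
    raise RuntimeError("no Spark home configured and pyspark is not installed")
```

In the pip/pipenv case the function falls through to the `find_spec` branch, which is why setting spark_home on top of a pip-installed pyspark is redundant.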
Your second case worked because, according to your output:
plugins: cov-2.7.1, mock-1.10.4
the pytest-spark plugin was not discovered for some reason. Are you sure it was indeed installed into the correct environment?
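One quick way to check whether both packages landed in the environment pytest actually runs from is to probe for the modules directly (assuming the import names `pytest_spark` and `pyspark`, which is how the pip packages expose themselves):

```python
import importlib.util

def installed(module_name: str) -> bool:
    """True if the module can be imported from the current environment."""
    return importlib.util.find_spec(module_name) is not None

# "pytest_spark" printing MISSING here would explain why it does not
# show up in the "plugins:" line of pytest's startup header.
for name in ("pytest_spark", "pyspark"):
    print(name, "OK" if installed(name) else "MISSING")
```

Run this with the same interpreter pytest uses (e.g. `pipenv run python`); a mismatch between environments is the usual cause of a plugin silently not loading.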
Anyway thanks for the interesting case, I will try to cover it in the readme.
Updated the readme in a5ce0a906c67bf535d8d6a5dfd295bcd814507a4.
When pipenv is installed through pip, the hooks fire findspark.py before the configuration is read. When pipenv is installed via brew install pipenv, the hooks fire in the right order. This might not be the right place for this issue, except that if I remove this package and set spark_home through pytest.ini under env, it does work.
pytest.ini:
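The original pytest.ini contents were not preserved in this thread, but a workaround of the kind described, setting SPARK_HOME under `env` (which requires the pytest-env plugin; the path below is a placeholder), would look something like:

```ini
[pytest]
; requires the pytest-env plugin; /path/to/spark is a placeholder
env =
    SPARK_HOME=/path/to/spark
```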
Error:
Works:
pipenv uninstall pytest-spark
Also works, through brew and with pytest-spark:
Produces: