mikarubi opened 2 months ago
yeah, I've been struggling with that choice, particularly in the case of pyspark, which might work with multiple versions, but I believe it should match the version of Spark installed on the system.
I agree we should not have it pinned like that, so we should probably remove it from the requirements and include instructions to install it separately. I don't think we can automatically detect the Spark version at the pip install step and dynamically choose the pyspark version... but I'll look into it
actually we might be able to do it within setup.py
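something along these lines might work (a rough, untested sketch: it assumes `spark-submit` is on the PATH, scrapes the version out of the banner that Spark prints to stderr, and the package name is just a placeholder):

```python
# Untested sketch: detect the system Spark version at install time and
# pin pyspark to match; fall back to an unpinned pyspark otherwise.
import subprocess
from setuptools import setup

def _detect_spark_version():
    try:
        out = subprocess.run(
            ["spark-submit", "--version"],
            capture_output=True, text=True, check=True,
        )
        # spark-submit prints its banner (including "version X.Y.Z") to stderr;
        # grab the first version-like token, which should be Spark's own version
        for token in out.stderr.split():
            if token[0].isdigit() and token.count(".") == 2:
                return token
    except (OSError, subprocess.CalledProcessError):
        pass
    return None

_spark = _detect_spark_version()
_pyspark_req = f"pyspark=={_spark}" if _spark else "pyspark"

setup(
    name="our-package",  # placeholder name
    install_requires=[_pyspark_req],
)
```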
I believe that pyspark comes bundled with its own Spark, so we don't need to worry about system installations: https://stackoverflow.com/a/51729469/5427308
oh, that's interesting! it would definitely make installation simpler. However, we might get version conflicts if multiple Spark installations are on the path; I ran into this problem myself.
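for reference, here's the kind of quick check I mean (a sketch; the `SPARK_HOME` behavior is how I understand pyspark's launcher to work, so worth double-checking):

```python
# Diagnostic sketch: check whether a system Spark (via SPARK_HOME)
# could shadow the pip-installed pyspark.
import os
import pyspark

spark_home = os.environ.get("SPARK_HOME")
print("pyspark version:", pyspark.__version__)
print("pyspark location:", os.path.dirname(pyspark.__file__))
if spark_home:
    print("SPARK_HOME is set to:", spark_home)
    print("-> launcher scripts like spark-submit may use this installation "
          "instead of the pip-installed one")
else:
    print("SPARK_HOME not set; the bundled Spark should be used")
```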
apparently just pip-installing pyspark is sufficient; the tests are passing like this now! we might still hit the issue of a pre-installed Spark in the system path, though
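for the record, this is essentially the smoke test that now passes (standard pyspark API, local mode only, no system Spark involved):

```python
# Minimal smoke test: start a local SparkSession from the
# pip-installed pyspark alone.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .master("local[1]")
         .appName("smoke-test")
         .getOrCreate())
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "label"])
assert df.count() == 2
spark.stop()
```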
great! hopefully this will be robust and we won't run into problems. since we got rid of Spark as a dependency, it makes sense to do the same for ANTs, since it has a simple Python wrapper: https://pypi.org/project/antspyx/
it should be straightforward to convert the corresponding ANTs calls into Python. i can give it a shot, and the advantage would be a package that can be fully installed via pip without needing to install other software.
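to illustrate the kind of conversion I have in mind (file names are illustrative, not our actual pipeline): an `antsRegistration` CLI call becomes a call to the antspyx Python API:

```python
# Sketch: replace a shelled-out antsRegistration call with antspyx.
import ants

fixed = ants.image_read("fixed.nii.gz")    # placeholder paths
moving = ants.image_read("moving.nii.gz")

# roughly equivalent to running the antsRegistration binary with a SyN transform
reg = ants.registration(fixed=fixed, moving=moving, type_of_transform="SyN")

# apply the forward transforms to warp the moving image into fixed space
warped = ants.apply_transforms(
    fixed=fixed, moving=moving, transformlist=reg["fwdtransforms"]
)
ants.image_write(warped, "moving_warped.nii.gz")
```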
yes, great idea with the ANTs Python package as well!
The `==` in requirements should probably be relaxed to `>=` in most cases, since `==` will be very rigid/inflexible outside of a container or virtual environment. However, we should check that `>=` will not break anything, at least for a default current conda configuration. Alternatively, we could enforce the requirements only when generating containers/virtual environments.
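for example, something like this (version numbers are illustrative, not tested against our code):

```
# relaxed lower bounds for normal pip installs
pyspark>=3.0
antspyx>=0.3

# exact pins kept in a separate lock file used only when
# building containers/virtual environments, e.g.:
#   pyspark==3.3.1
#   antspyx==0.3.8
```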