**Closed** · halldc closed this issue 6 years ago
> Should I be making these split packages `noarch`?

Not sure what you mean here.

> They don't actually install any code themselves - just list the dependencies they need.

My understanding is that you need a `noarch` version of `airflow=1.10`, if that is possible, no?
Airflow itself can't be `noarch` because its requirements depend on the Python version. I was talking about turning the extra packages (e.g. `airflow-with-celery`) into `noarch` packages. This can be done, but it doesn't solve the issue mentioned here (extras don't work with py >27). Instead of using an exact pinning (which includes the build variant), I should be pinning using `max_pin`. This stops the extra package from depending on the Python version. Watch out for an incoming PR. 🎉
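A minimal sketch of what that change could look like in the feedstock's `meta.yaml` (hypothetical snippet — the output names and the celery bound are assumptions, not the actual recipe):

```yaml
outputs:
  - name: airflow
    # ... the real package, built per Python version ...

  - name: airflow-with-celery
    requirements:
      run:
        # Pin to the same version series via max_pin rather than an exact
        # pin: an exact pin captures the build string, which encodes the
        # Python variant (e.g. py27), forcing the extra onto one Python.
        - {{ pin_subpackage('airflow', max_pin='x.x.x') }}
        # assumed extra dependency for the celery executor
        - celery
```

With `max_pin='x.x.x'`, the metapackage depends on `airflow >=1.10.0,<1.10.1` (version-level only), so the solver is free to pick the `airflow` build matching whatever Python the user requested.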
Installing vanilla airflow with py36 works just fine:

```
$ conda create -n test python=3.6.6 airflow=1.10
```
But installing airflow with extras fails:
It looks like `airflow-with-celery=1.10` has been pinned to the py27 version of `airflow=1.10`, but this should be independent of Python version.

@ocefpaf @marcelotrevisani @sodre Have you experienced this issue before? Should I be making these split packages `noarch`? They don't actually install any code themselves - just list the dependencies they need.

Details about `conda` and system (`conda info`):