test matrix
tox allows you to define a test matrix (python versions, environments, test commands, etc.). The test matrix can be used to test packages (end user tests) and for 'current environment' (developer) tests. Currently pyctdev uses tox.ini to get the test matrix, and (hackily) uses features such as conda build's recipe append and clobber to test conda packages in a matrix, thus simulating tox-like behavior with conda build (a minimal tox.ini sketch follows this list).
[x] pyctdev support (uses tox for ecosystem=python, reads tox.ini to launch conda build jobs for ecosystem=conda)
[ ] conda issue
more details: pyviz-dev/pyct#53
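A sketch of the kind of tox.ini matrix being described; the environment names, factors, and commands are illustrative, not pyctdev's actual configuration:

```ini
# tox.ini -- illustrative matrix: three python versions x two test flavors
[tox]
envlist = {py36,py37,py38}-{unit,examples}

[testenv]
deps =
    pytest
    examples: nbval
commands =
    unit: pytest tests
    examples: pytest --nbval examples
```

Each generated environment (e.g. py37-unit) gets its own interpreter and runs only the commands whose factor matches.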
namespaces
A python package author probably wants to declare dependencies using pypi names, but those don't necessarily match conda names (e.g. the pypi package tables is pytables on conda). Meanwhile, package names on defaults and conda-forge are themselves inconsistent with each other.
[x] pyctdev support (partial, ...)
[ ] conda issue
more details: pyviz-dev/pyctdev#27
abstract vs concrete dependencies
As a library developer, you want to be able to declare "abstract" dependencies once, and not have to maintain multiple lists (e.g. one for pip, one for conda, ...). However, it's also very common for "applications", or even for higher-level libraries, to want to supply an environment to users that's (almost) guaranteed to work, which might involve pinning versions and even channels (a sketch of the two follows this list).
[x] pyctdev support (allows the abstract dependencies to be expressed in setup.py/setup.cfg, plus a list of pins that can optionally be applied when building a package or generating an environment file)
[ ] conda issue
more details: pyviz-dev/pyctdev#43
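A sketch of the distinction, assuming a setuptools-based project; names and versions are placeholders. Abstract dependencies name what's required, pinning as little as possible:

```ini
# setup.cfg -- abstract dependencies, maintained once by the library author
[options]
install_requires =
    numpy
    pandas >=0.24
```

A concrete environment for end users then pins exact versions (and possibly the channel); this file is illustrative, not something any particular tool generates:

```yaml
# environment.yml -- concrete, known-good environment for users of an application
name: myapp
channels:
  - defaults
dependencies:
  - python=3.7
  - numpy=1.16.4
  - pandas=0.24.2
```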
developer install
With pip, a developer can do e.g. pip install -e . to get the latest dependencies as declared in setup.py/setup.cfg, plus the package itself 'develop installed'. With conda, people seem to do a variety of things, e.g.:
conda install pkg && conda remove --force pkg && python setup.py develop (getting the last release's dependencies, then possibly having some installed/upgraded by pip)
conda install x y z && pip install -e . (manually reading x y z from somewhere, e.g. meta.yaml)
conda build conda.recipe && conda activate (path to the build env) && pip install -e .
and so on.
If using pip install -e or similar, you need to specify --no-deps to avoid pip or easy_install trying to install dependencies itself. (Also, if the package has entry points, they may fail to work, because dependencies are checked when you try to run them; pkg_resources.DistributionNotFound is a common exception to see after pip install -e of a package whose dependencies were installed by conda.)
[x] pyctdev support (supports develop_install, which for pip does pip install -e .[tests], and for conda does conda install (required, tests) followed by pip install -e . --no-deps)
[ ] conda issue
more details:
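For concreteness, a sketch of the conda flavor of developer install described above; the environment name and dependencies are placeholders:

```
# fresh environment for development (names are placeholders)
conda create -n mypkg-dev python=3.7
conda activate mypkg-dev

# install runtime and test dependencies with conda...
conda install numpy pandas pytest

# ...then develop-install the package itself, telling pip not to touch dependencies
pip install -e . --no-deps
```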
extras
In setup.py/setup.cfg, you can declare things like extras_require['recommended']=[a,b,c] or extras_require['geo'] or whatever. These are (or at least were...) shown as options on pypi, to aid discovery. For conda, you have to make multiple packages, and there aren't even any conventions for naming them (e.g. package + package-core? what about suggests, recommends, etc.?).
[x] pyctdev support (declare map of package names to extras_require groups in setup.cfg; pyctdev uses to generate conda recipes and build the packages)
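What extras look like on the pip side, declared once in setup.cfg (group and package names are illustrative); on the conda side, each group would have to become a separate package:

```ini
# setup.cfg -- optional dependency groups, installable as e.g. pip install mypkg[recommended]
[options.extras_require]
recommended =
    matplotlib
    bokeh
geo =
    cartopy
tests =
    pytest
```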
binary compatibility
Binary incompatibility between defaults and conda-forge is often a cause of problems (and often not obvious at first), and mixing channels by mistake is easy even when trying not to. Binary compatibility can even be a problem within a single channel.
Even when I as a package author know about incompatible packages, there's nothing I can do to stop users from getting them, except requiring the user to run an exact set of "install channel::package==version=build" commands in a fresh environment (sketched below), or asking them to have a pins file in place on their system.
[ ] pyctdev support
[ ] conda issue
more details: pyviz-dev/pyct#33
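A sketch of the kind of exact-spec install being described; the package names, versions, and build strings are placeholders, not real pins:

```
# fresh environment, then fully pinned installs: channel, version, and build string
conda create -n pinned-env
conda activate pinned-env
conda install "defaults::somepkg=1.2.3=py37_0" "defaults::otherpkg=4.5.6=py37_0"
```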
Be able to create environment from remote file
From a pyviz issue...
Also: we currently produce only a highly pinned conda package. That's ok, because pyviz isn't a library that's designed to be integrated into someone else's package (that's what the libraries underlying pyviz are for). But a highly pinned conda package can be difficult to install, except in a dedicated environment. At which point, a "conda env create (some env file on anaconda.org or maybe github)" command might make more sense...?
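What the difference might look like; the URL is a placeholder, and the single-command form is the desired behavior, not something conda env create currently supports:

```
# today: fetch the environment file, then create the environment from it
curl -LO https://example.org/path/to/environment.yml
conda env create -f environment.yml

# desired: one command pointing straight at the remote file
# conda env create -f https://example.org/path/to/environment.yml
```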
inspecting dependencies
Would be very useful to see the dependency graph of installed package(s) or environment, and of package(s) on anaconda.org.
[ ] pyctdev support (generates a text or svg dependency graph; installed environment: yes; installed package: not yet; package on anaconda.org: not yet)
[ ] conda issue
more details:
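Not pyctdev's mechanism, but one way to approximate this today for an installed environment is pipdeptree (it only sees pip-visible metadata, so conda-specific details such as channels and build strings are missing):

```
# show the dependency tree of everything installed in the current environment
pip install pipdeptree
pipdeptree

# or restrict to a single package and its dependencies (name is a placeholder)
pipdeptree -p somepackage
```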
misc conda inconsistencies
Support for channel pins:
conda build: no (correct/expected)
conda install: yes
conda create: yes
conda env create: no (except via a separate user pins file?)
Support for selectors:
conda build: yes
conda install: no (correct/expected)
conda create: no
conda env create: no
Support for optional constraints:
conda build: yes
conda install: no (except via a separate user pins file?)
conda create: no (except via a separate user pins file?)
conda env create: no (except via a separate user pins file?)
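For reference, what 'selectors' and 'channel pins' look like where they are supported (package names are illustrative):

```
# meta.yaml (conda build): a selector restricting a dependency to one platform
requirements:
  run:
    - somepkg
    - pywin32  # [win]

# conda install / conda create: a channel pin given on the command line
#   conda install "conda-forge::somepkg"
```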
Also:
conda skeleton...environment markers didn't seem to be supported.
"single command install"/support for remote files? E.g. with conda install, you can do a 'one command' installation (e.g. conda install -c pyviz pyviz), whereas conda (env) create requires that someone fetches the environment file first (i.e. you can't conda env create (environment file on anaconda.org or maybe github) ).
To do:
[ ] conda issue(s?)
other duplicated packaging metadata
Apart from multiple lists of dependencies, many packages have a description that says one thing on pypi, something different on anaconda.org defaults, and yet again something different on their own channel and/or conda-forge. Can't python package authors just declare it once and have that apply everywhere? conda build can read from setup.py/setup.cfg, except for the package name; conda-forge requires its own entirely separate data (?). Maybe that's ok and there's nothing to do here (e.g. think of all the near-duplicated packaging metadata in all the linux distros).