asolis opened 2 years ago
It is possible to have all dependencies in one file (pyproject.toml) and all tools to read from that file, but it requires a separate tool to do so. That said, #37 also introduces new tools (though more mature ones, from what I can gather).
Good question. The only path I could find so far was to sync from the conda environment. Unfortunately, the conda environment file format doesn't allow macros, pre-processing, or any kind of include/import definition. From all the examples I have found, the place to declare these dependencies is `setup.cfg`, not the `.toml`. Should they be in sync too? I believe the conda environment is the one that should be maintained and manually edited by the developer; at least that would be the natural way to do it. The idea is to capture that definition and include it in the package requirements. That way, if the package is published or shared, the conda environment is not needed to satisfy dependencies: the pip installation process takes care of it.
Freezing conda's pip-installed packages seems to be the first option, given that we actually need pip packages listed as requirements.
Setuptools can actually read deps from pyproject.toml as part of its PEP 621 support: https://setuptools.pypa.io/en/latest/userguide/pyproject_config.html. Longer term (i.e. longer than we care about now) this may replace setup.cfg entirely. The outlier is conda, so as things stand we still need some bridge between it and the rest of the python packaging infrastructure.
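For concreteness, here is a minimal sketch of what that PEP 621 metadata can look like in `pyproject.toml` (the package names and version bound are illustrative, not taken from this project):

```toml
[project]
name = "sample"
requires-python = ">=3.9"
dependencies = [
    "fastapi",
    "typer",
]
```

Setuptools reads this `dependencies` list directly, so nothing needs to be duplicated in `setup.cfg`'s `install_requires`.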
Does it make sense to maybe think about this in the other direction? There seems to be support for generating a conda lock file or an `environment.yml` file from `pyproject.toml`.

If we're pushing people towards creating a python package, but not a conda package, does it make sense to direct them towards the `pyproject.toml` as the source of truth and augment that with environment files?
That was the thought. I also hadn't heard of beni, thanks for sharing that. I've had some success playing around with `conda-lock`, but I can understand if there's some hesitancy to recommend either library because they're considered too immature at this time.
I tried running `beni` against the `pyproject.toml` from #46 and it seems to work pretty well. It will mean adding it as a dependency and a command in the `Makefile`. Unfortunately, it doesn't seem to be intended as something that is run from other python code, making it difficult to integrate as a cookiecutter plugin.
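A hedged sketch of what that `Makefile` rule could look like (the exact beni invocation is assumed from its documentation, so treat the command line as illustrative):

```make
# Regenerate environment.yml whenever pyproject.toml changes.
# Assumes `pip install beni` has been run in the dev environment.
environment.yml: pyproject.toml
	beni pyproject.toml > environment.yml
```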
Sample project output:

```yaml
channels:
- conda-forge
dependencies:
- pip:
  - flit
- python>=3.9
- pip
- fastapi
- typer
- pytest
- requests
- sphinx
- sphinx-intl
- myst-parser
- sphinx-autoapi
name: sample
```
The `src_pyproject` from the conda incubator seems to have been removed (link is now broken). I found what looks like a successor in conda project. To me this is the long-term solution, but unfortunately it isn't actually released yet.
After discussion with @asolis, the overall goal is to support conda environments on a fairly equal footing with Python's `venv`. This can be either through `environment.yml` or through conda packages (a `meta.yml` file).

Setting up/deploying in a container should be the test of acceptable solutions, be that `conda install <package>` or `conda env create -f <environment.yml>`.
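That acceptance test could be sketched as a `Dockerfile` along these lines (the base image and paths are my assumptions, not decisions from this thread):

```dockerfile
FROM continuumio/miniconda3

# Route 1: recreate the environment from the exported definition.
COPY environment.yml /tmp/environment.yml
RUN conda env create -f /tmp/environment.yml

# Route 2 (alternative): install a published conda package instead:
# RUN conda install -y <package>
```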
I started to look into this issue a bit, but I have found a couple of blockers. I think it would be nice to continue the initial brainstorm with @goatsweater in an issue, so we could track ideas and possible solution(s). @ToucheSir, it would be nice to get your feedback too.
Context: I believe we are using a great development tool for managing virtual environments and packages (i.e., conda). Unfortunately, the `install_requires` configuration option in `setup.cfg` doesn't understand conda packages, only pip packages. Once our package is published or a wheel is created, if `install_requires` hasn't been declared, dependencies are not managed when installing our package: the code's dependencies are not resolved at install time. The environment definition is part of the source code and repository, but it is neither distributed nor handled by the pip package management process.

Current Scenario: After developing our package and defining our virtual environment and dependencies with conda, a manual step is needed to generate and validate the install requirements for our package, translate them to pip packages (in the event that they were conda packages), and include them in the `setup.cfg` file.

Cons: Manually maintaining a mirror of the dependencies is a burden that could be simplified, and hopefully automated; even a suggested list would be a good step in the right direction, given that we have already defined our dev environment.
Ideas: We would like to have the `install_requires` configuration automatically generated, or at least assisted, from our conda environment definitions. We could use our default environment.yml file to recreate an initial list of pip packages to include as part of the setup.cfg configuration. Packages such as conda-minify sound like a great alternative for computing the minimum set of dependencies in a conda environment.

Maybe a Makefile rule could extract the current active environment (or the content of the environment.yml file) and compute a list of pip packages to include inside the setup.cfg file. This could then be reviewed and published if correct.
Blockers: