tdejager opened 6 months ago
I find option 1 a bit verbose if you need to add it to every environment. If we find a good way to incorporate the host dependencies with `pixi build`, I would be in favor of that.

My use cases (use conda packages for everything, only use `uv` for doing the editable install of `.`) could look as follows:
Building a library:
```toml
[host-dependencies] # or [pypi-build-environment]
python = "*"
hatchling = "*"

[pypi-dependencies]
polarify = { path = ".", editable = true, ignore-dependencies = true, build-isolation = false }

[environments]
default = ["test"]
pl014 = ["pl014", "py39", "test"]
pl015 = ["pl015", "py39", "test"]
# ...
```
Here, `ignore-dependencies` results in the python interpreter not needing to be installed at solve time (the dependencies of `.` are not added to the lockfile anyway).
For building the wheel, we use a separate host environment which contains only `python` and `hatchling`. `hatchling` is then also no longer contained in the `default`, `pl014`, ... environments (different from, but imo better than, how it works now).
Building an application:
```toml
[host-dependencies] # or [pypi-build-environment]
python = "*"
hatchling = "*"

[feature.dev.pypi-dependencies] # or [pypi-dependencies]
polarify = { path = ".", editable = true, ignore-dependencies = true, build-isolation = false }

[feature.prod.pypi-dependencies]
polarify = { path = ".", editable = false, ignore-dependencies = true, build-isolation = false }

[environments]
default = { features = ["dev"], solve-group = "prod" }
prod = { features = ["prod"], solve-group = "prod" }
```
Here too, `hatchling` is in neither the `prod` nor the `default` environment, and the wheel is built inside the dedicated host environment.
The goal is to make solves faster for users and possible on more systems, while existing projects keep working as they do today. We'll do this by allowing a user to define a pypi build environment.
pixi.toml
```toml
# Add a table to specify the build dependencies used in the pypi solve.
[pypi-build-dependencies]
python = "*"

# You can define it per feature
[feature.cpp.pypi-build-dependencies]
compilers = "*"

# You can define it per target
[target.linux.pypi-build-dependencies]
python = { build_number = "1" }

[feature.py39.dependencies]
python = "3.9"
boltons = "*"

[pypi-dependencies]
pytest = "*"

[environments]
default = ["default"]
cpp = ["default", "cpp"]
py39 = { no-default-features = true, features = ["py39"] }
```
pyproject.toml
```toml
[project]
name = "project"
requires-python = ">=3.10"
dependencies = ["pytest", "numpy"]

[tool.pixi.project]
channels = ["conda-forge"]
platforms = ["linux-64"]

[tool.pixi.dependencies]
pytorch = "*"

[tool.pixi.pypi-build-dependencies]
# Automatically inherit the python dependency version from the default env,
# which is the minimal version across all platforms solved for. This could
# break if different versions of python are used on different platforms; we
# could get around that by installing a build env per platform.
python = "*"
# Specifically install this version of compilers.
compilers = "1.2.3"
# Inherit from the default env if its version matches the matchspec,
# otherwise install using the matchspec.
cmake = ">=3.23"
```
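The inherit-or-install rule described in the comments above can be sketched as follows. This is a simplified illustration, not pixi's actual solver logic: the `satisfies` helper handles only the spec shapes used in the example, and the default-env pins are made up.

```python
# Simplified sketch (not pixi's real implementation) of the rule from the
# comments above: reuse the version pinned in the default environment when
# it satisfies the build-dependency matchspec, otherwise solve the
# matchspec separately.

def satisfies(version: str, spec: str) -> bool:
    """Check a version against a tiny subset of matchspec syntax.

    Handles only "*", ">=X.Y" and exact "X.Y.Z" specs; real matchspecs
    are much richer.
    """
    if spec == "*":
        return True

    def as_tuple(v: str):
        return tuple(int(part) for part in v.split("."))

    if spec.startswith(">="):
        return as_tuple(version) >= as_tuple(spec[2:])
    return version == spec


def build_env_source(name: str, spec: str, default_env: dict) -> str:
    """Decide where a pypi-build-dependency comes from."""
    pinned = default_env.get(name)
    if pinned is not None and satisfies(pinned, spec):
        return f"inherit {pinned} from default env"
    return f"solve {spec} separately"


# Hypothetical default-env pins, for illustration only.
default_env = {"python": "3.12.4", "cmake": "3.22.1"}
print(build_env_source("python", "*", default_env))        # inherit 3.12.4 from default env
print(build_env_source("cmake", ">=3.23", default_env))    # solve >=3.23 separately
print(build_env_source("compilers", "1.2.3", default_env)) # solve 1.2.3 separately
```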
The code logic will consist of the following steps:
1. If `pypi-build-dependencies` is defined, try to use that.
2. Install the `pypi-build-dependencies` env after locking.
3. If you can't install the `pypi-build-dependencies` on the current platform, this is something we can't solve yet. We're thinking about allowing the users to specify all required metadata or allowing for an "unsolved" environment in your lockfile.
An addition to the edge case would be that we took that idea from uv: https://docs.astral.sh/uv/concepts/resolution/#dependency-metadata. Because of conda, though, we would need it less.
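For context, uv's dependency-metadata override (described in the linked docs) lets you pre-declare an sdist's metadata so it never has to be built during resolution. The table schema follows those docs; the package name, version, and dependencies below are placeholders:

```toml
# Schema from the linked uv docs; values here are illustrative only.
[[tool.uv.dependency-metadata]]
name = "some-sdist-package"
version = "1.0.0"
requires-dist = ["numpy", "typing-extensions"]
```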
Problem description
The current implementation uses the conda environments as the base environments for building python package source distributions: the conda environment provides the python interpreter to the uv solve that pixi triggers when solving pypi-dependencies. Even for commands that don't need to install an environment, if that environment contains pypi-dependencies, its conda prefix still has to be installed, because we do not know beforehand whether there are any source dists that need to be built.
This results in multiple downsides:
- The conda prefix needs to be installed even for a `--no-install` or a `lock` (that does no install) command when an environment contains pypi-dependencies: https://github.com/prefix-dev/pixi/issues/1131

Proposal
Give the user the ability to create a custom build environment for the pypi-dependencies which contains only a minimal set of requirements that can be installed on all systems, for example only python, skipping all other dependencies. This also allows for more options, like installing specific dependencies for building a package, and in conjunction with https://github.com/prefix-dev/pixi/issues/1124 it can avoid having to use pypi-dependencies for building a source dist altogether.
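A sketch of the minimal case under the `[pypi-build-dependencies]` table proposed above, where the build env contains only an interpreter and is therefore solvable and installable on every platform:

```toml
# Minimal pypi build environment: only python, nothing platform-specific.
[pypi-build-dependencies]
python = "*"
```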
The following project can only be solved on a linux-64 machine:
Specification in manifest
Option 1: Add build environment to environment
Pros:
Cons:
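A hypothetical sketch of what option 1 could look like; the `pypi-build-environment` key on the environment entry is purely illustrative, nothing here is decided:

```toml
# Hypothetical syntax: each environment points at a named build environment.
[feature.build.dependencies]
python = "*"
hatchling = "*"

[environments]
build = ["build"]
default = { features = ["test"], pypi-build-environment = "build" }
```

The verbosity concern is visible here: every environment with pypi-dependencies would need its own `pypi-build-environment` entry.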
Option 2: A pypi-build-environment table
Pros:
- If you define a `[pypi-build-environment]` table, it is automatically inherited in all other environments.

Cons:
Option 3: Use host-dependencies
This would traverse all environments and figure out the unique and re-usable build environments per environment.
Pros:
Cons:
- `host-dependencies` are really meant for `pixi build`, although you will probably want those dependencies in that case.

How is this going to be backwards compatible?
The current behavior doesn't change, so a pypi build environment will be an opt-in feature; for simple use-cases the current behavior does work.
Alternative solutions to the described problems
There are some alternatives:
- Only resolve against wheels: wheels are already preferred by `uv` and so are eagerly selected. We do think that `wheel-only` is a good idea nonetheless that we would want to implement.