rahuliyer95 opened 1 month ago
I think we're somewhat unlikely to support this... The built wheel needs to use the declared project metadata, not the resolved application versions. It might also violate the spec in some sense.
What problem are you trying to solve? What are you looking to do with the wheel?
> What problem are you trying to solve?
I'll try to explain without sharing too many internal details. Our setup for ETL jobs requires us to install wheels from our internal PyPI installations (because of various limitations on support for alternatives like Docker images). When the ETL job starts, the first thing it does is

```sh
pip install --index-url <internal-pypi-index-url> <package-name>==<package-version>
# with the above example:
# pip install --index-url <internal-pypi-index-url> playground==1.0.0
```

With the wheel built as it is today, pip resolves the dependency versions at install time, so it might install a different patch version than the one we tested with. To avoid this problem I was hoping we could build the wheel from the pinned versions. Our existing setup uses Poetry, and we use the poetry-freeze-wheel plugin to solve this problem.
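To make the drift concrete, here is a small sketch, assuming the playground wheel from the pyproject.toml at the bottom of this issue is installed; it only uses the standard library, and the versions in the comments are hypothetical:

```python
# Because the wheel's metadata carries the loose constraints from
# pyproject.toml, what pip actually installs can change between runs.
from importlib.metadata import requires, version

# The constraints the installed wheel declares, e.g. "numpy~=1.26":
print(requires("playground"))

# The concrete version pip picked this run; it may be a newer patch
# release than the one the ETL job was tested against.
print(version("numpy"))  # e.g. "1.26.2" today, "1.26.4" next month
```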
> I think we're somewhat unlikely to support this... The built wheel needs to use the declared project metadata, not the resolved application versions. It might also violate the spec in some sense.
I can totally understand the complexity of this very non-standard use case. Unfortunately, I am not sure how many others would want behavior like this (exposed through some CLI option, maybe). I was trying to migrate from Poetry to uv, and this was the last blocker on the list.

@charliermarsh Please let me know if this explains the use case and if any other details are needed from me. Thanks for all the amazing work you do!
I'd love to try to get some sort of --locked install concept into the standards, perhaps after we manage to standardize on a lock format.
This issue is also a blocker for me switching from Poetry.

Locked dependencies in wheels make it possible to leave Docker and the like out completely and to ship apps that users can install with a single pipx command.

As I understand it, Poetry plugins do this by building the wheel and then modifying it afterwards, so it doesn't seem too hard to implement; it's basically just copying the data you already have in uv.lock, right?
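For what it's worth, here is a rough sketch of what that post-build rewrite could look like. This is not uv's or Poetry's actual implementation: the function names are made up, extras and environment markers are not handled, and a real tool would also need to update the METADATA entry in the wheel's RECORD file (hash and size), which is omitted here.

```python
# Sketch of the "freeze" approach: build the wheel normally, then rewrite
# its METADATA so every Requires-Dist line becomes an exact pin from uv.lock.
import re
import tomllib  # Python 3.11+
import zipfile
from pathlib import Path


def norm(name: str) -> str:
    """Normalize a package name the way lock files do."""
    return re.sub(r"[-_.]+", "-", name).lower()


def pins_from_uv_lock(lock_path: str) -> dict[str, str]:
    """uv.lock is TOML with [[package]] tables carrying name and version."""
    lock = tomllib.loads(Path(lock_path).read_text())
    return {norm(p["name"]): p["version"] for p in lock.get("package", [])}


def freeze_wheel(wheel_path: str, pins: dict[str, str]) -> None:
    """Rewrite Requires-Dist lines in the wheel's METADATA to exact pins."""
    src = Path(wheel_path)
    tmp = src.with_name(src.name + ".tmp")
    with zipfile.ZipFile(src) as zin, zipfile.ZipFile(tmp, "w") as zout:
        for item in zin.infolist():
            data = zin.read(item.filename)
            if item.filename.endswith(".dist-info/METADATA"):
                out = []
                for line in data.decode().splitlines():
                    # NOTE: ignores extras like "mpire[dill]" and markers.
                    m = re.match(r"Requires-Dist: ([A-Za-z0-9._-]+)", line)
                    if m and norm(m.group(1)) in pins:
                        line = f"Requires-Dist: {m.group(1)}=={pins[norm(m.group(1))]}"
                    out.append(line)
                data = ("\n".join(out) + "\n").encode()
            zout.writestr(item, data)
    tmp.replace(src)


# freeze_wheel("dist/playground-1.0.0-py3-none-any.whl",
#              pins_from_uv_lock("uv.lock"))
```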
Let me share an experience to illustrate why using pinned dependencies during the build process is crucial.
I maintain a package (CLI tool) called My Package, which lists aiodocker as a dependency in its pyproject.toml. The aiodocker package, in turn, specifies aiohttp with a version constraint of ^3.8 in its pyproject.toml. Since my package doesn’t directly use aiohttp, it’s not listed as a direct dependency.
In our GitHub Actions (GHA) workflows, my package is installed frequently from a private artifact. Each installation pulls the latest compatible version of aiohttp because it’s a transitive dependency of aiodocker and satisfies the ^3.8 version constraint.
This week, a new version of aiohttp (3.11) introduced a breaking change that caused aiodocker to break. As a result, my package—despite no changes on my end—broke because of this upstream issue.
Here’s the related issue for reference: aio-libs/aiodocker#918.
Currently I use Poetry, and they don't have a solution for that: https://github.com/python-poetry/poetry/issues/2778
@idan-rahamim-lendbuzz poetry does have third party plugins like https://github.com/cloud-custodian/poetry-plugin-freeze, but I'm not aware of a UV equivalent yet
I can't use such plugins due to security reasons.
I believe this option would be valuable for ensuring that users get the exact same set of project dependencies that was used during development, just to make sure the experience is smooth.
As of now, my team uses setup.py in our old workflow, with install_requires populated from a frozen requirements.txt. The idea behind it is that we force our package to be installed with the set of dependencies that we know for sure is working.
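For reference, a minimal sketch of that old workflow; the file name, project name, and version are placeholders:

```python
# setup.py -- populate install_requires from a frozen requirements.txt so
# the package always installs with the exact versions that were tested.
from pathlib import Path

from setuptools import find_packages, setup

# requirements.txt is expected to hold exact pins, e.g. "pandas==1.5.3".
requirements = [
    line.strip()
    for line in Path("requirements.txt").read_text().splitlines()
    if line.strip() and not line.startswith("#")
]

setup(
    name="my-package",
    version="1.0.0",
    packages=find_packages(),
    install_requires=requirements,
)
```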
An example of a bad scenario was illustrated by @rahuliyer95, and it is exactly what I'm talking about: sometimes things just break with newer versions, and those breakages can be hard to track down.

I believe this is not a best practice in general, but in a corporate setting with certain workflows it is gold.
(I believe this question has probably been asked before, but for some reason I am not able to find the previous issue, so please feel free to direct me to that issue if you are able to find it.)

I am trying to build a wheel for my project, for which I simply ran uv build. Inspecting the wheel, I noticed that it took the dependencies from pyproject.toml. Is there a way to use the pinned dependencies from the uv.lock file itself?
pyproject.toml:

```toml
[project]
name = "playground"
version = "1.0.0"
description = "Playground"
authors = [{ name = "Rahul Iyer", email = "me@rahuliyer.me" }]
requires-python = ">=3.10"
readme = "README.md"
dependencies = [
    "aiofiles~=24.1",
    "fsspec~=2024.0",
    "matplotlib~=3.7",
    "mpire[dill]~=2.8",
    "numpy~=1.26",
    "pandas==1.5.3",
    "pendulum~=3.0",
    "pyarrow~=16.0",
    "pyyaml~=6.0",
    "s3fs~=2024.0",
    "tqdm~=4.0",
    "universal-pathlib~=0.2",
    "uvloop~=0.19",
    "yarl~=1.8",
]

[tool.uv]
dev-dependencies = [
    "mypy~=1.11",
    "pandas-stubs~=1.5.3",
    "ptpython~=3.0",
    "pytest~=7.2",
    "ruff~=0.4",
    "types-aiofiles~=23.2",
    "types-certifi~=2021.10",
    "types-pyyaml~=6.0",
    "types-tqdm~=4.0",
]

[tool.ruff]
exclude = [".venv"]
line-length = 100
target-version = "py310"

[tool.ruff.format]
docstring-code-format = true
```
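For context, here is one way to inspect the dependency metadata that ends up in the built wheel; the wheel and dist-info paths below assume the playground 1.0.0 project defined above, so adjust as needed:

```python
# Print the Requires-Dist lines baked into the built wheel's METADATA.
import zipfile

with zipfile.ZipFile("dist/playground-1.0.0-py3-none-any.whl") as whl:
    metadata = whl.read("playground-1.0.0.dist-info/METADATA").decode()

for line in metadata.splitlines():
    if line.startswith("Requires-Dist:"):
        print(line)  # e.g. "Requires-Dist: numpy~=1.26", not a pinned "=="
```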