Closed: uckelman-sf closed this issue 3 months ago.
It looks like one could change all the `==` to `>=` in `pyproject.toml`, do `pip install -e .[build]`, and then `pip freeze > requirements.txt` to generate a `requirements.txt` pinning the actual versions installed.
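A sketch of those steps, assuming the exact pins in `pyproject.toml` have the form `pkg==X.Y.Z` (the blanket `sed` is illustrative only and may be too blunt for a real file):

```shell
sed -i 's/==/>=/g' pyproject.toml   # relax every exact pin to a minimum bound
pip install -e .[build]             # resolve and install with the relaxed bounds
pip freeze > requirements.txt       # record the exact versions that were installed
```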
I'm not entirely sure yet how this interacts with building a wheel.
That sounds like a good place to start, and if the versioning is still too strict, we can relax to `>=x.0.0`.
👍
If I build a wheel with the relaxed constraints in the `pyproject.toml`, I can see that reflected in the `flare_capa-7.0.1.dist-info/METADATA` file inside the `.whl`. So... that's good.
However, the `requirements.txt` where I pinned exact versions isn't in the wheel at all. Given that, anybody installing the wheel won't get the dependencies from the `requirements.txt`. I think to get those, you'd need to do `pip install -r requirements.txt` after installing the wheel, which means you'd have to have `requirements.txt`, and you won't unless you've cloned the repo...
This is why my original suggestion was to have a separate project for the command-line tool, as the `pyproject.toml` for that could specify exact versions, and those would show up correctly in the wheel for it.
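For illustration, such a CLI-only project's `pyproject.toml` might look like this (the project name and the pinned versions are made up, not proposed values):

```toml
[project]
name = "flare-capa-cli"        # hypothetical wrapper distribution
version = "7.0.1"
dependencies = [
    "flare-capa==7.0.1",       # exact pins end up in the wheel's METADATA
    "pydantic==2.6.0",
]
```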
> Given that, anybody installing the wheel won't get the dependencies from the `requirements.txt`. I think to get those, you'd need to do `pip install -r requirements.txt` after installing the wheel, which means you'd have to have `requirements.txt`
In the workflow I imagine, the user does `pip install -r requirements.txt` before `pip install flare-capa`, so that when pip resolves the second time, it finds all the dependencies already present and it's basically a no-op (well, it installs capa but nothing else).
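That workflow, sketched as shell commands (this assumes you have a checkout of the repo to obtain `requirements.txt`):

```shell
pip install -r requirements.txt   # install the exact pinned versions first
pip install flare-capa            # resolver finds deps already satisfied; installs only capa
```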
Agree that you'd need source code for this. And I think this is expected behavior.
I don't think users will need the pinned environment often; I imagine it's only for the PyInstaller build and the tests in CI, or for anyone who wants to reproduce those. Do you envision other workflows that prefer the pinned environment? I suppose active dev. What else?
I won't ever want the pinned environment myself. I had been assuming that you want people installing the wheel in order to use it as a command-line tool to have the dependencies pinned. But if you don't care about that, we've got a solution already.
I'll make a PR for that for you today or tomorrow.
I just pushed #2079 as a first attempt. I'm hoping that I got the `pip install -r requirements.txt` in the correct place for building the binary packages.
To be clear about how I'm thinking about the pinned versions: we want some way to assert that the environment in which all tests pass (in CI) is the same one that we distribute in the standalone exe (which is the primary distribution method). We also want to make it easy for devs to recreate this environment to triage test failures or other bug reports. `requirements.txt` enables this because source is available for each of these flows.
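For example, a dev recreating the pinned environment to triage a failure might do something like the following (the repository URL is an assumption):

```shell
git clone https://github.com/mandiant/capa.git
cd capa
pip install -r requirements.txt   # the exact versions used in CI and the standalone exe
pip install -e .                  # capa itself, on top of the pinned deps
```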
I don't expect most capa library users to need the pinned environment. We think most dependencies should generally work following semver and we can bump min versions when there's a particular need.
thanks @uckelman-sf for the initial PR!
Description
flare-floss 3.1.0 requires pydantic 2.6.0. No released version of flare-capa permits using pydantic 2.6.0; therefore, the two current releases cannot be used together.
Steps to Reproduce
Minimal `pyproject.toml`:
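The original code block was not captured here; a minimal `pyproject.toml` reproducing the conflict might look like this (the project name and version are placeholders):

```toml
[project]
name = "repro"
version = "0.0.1"
dependencies = [
    "flare-capa",
    "flare-floss==3.1.0",   # requires pydantic 2.6.0
]
```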
Then:
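The command was likewise not captured; presumably something like:

```shell
pip install .
```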
Expected behavior:
flare-capa and flare-floss are installed.
Actual behavior:
pip's dependency resolver reports a version conflict over pydantic and fails to install the two packages together.
Versions
Every version of capa.
Additional Information
The immediate problem could be solved by a release of flare-capa which permits pydantic 2.6.0. However, problems of this sort could be avoided by setting looser constraints for dependencies.