Open nkrishnan opened 2 years ago
I think with some of the new features that are coming with the updatable lockfiles that should be doable.
The best solution for performance at the moment is to always ensure that you are running with

```
conda-lock --mamba
```

This will become the default behavior shortly (if mamba is detected on PATH).
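A minimal sketch of the PATH-detection logic described above, assuming a simple "prefer mamba, fall back to conda" rule (the `find_solver` helper is hypothetical, not conda-lock's actual API):

```python
import shutil


def find_solver() -> str:
    """Prefer mamba when it is on PATH; otherwise fall back to conda.

    Hypothetical helper illustrating the detection logic; conda-lock's
    real implementation may differ in detail.
    """
    return "mamba" if shutil.which("mamba") else "conda"
```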
For mamba, I've been experimenting with generating a transaction from an initial solve, using that transaction to create "fake installed" packages, and then continuing to modify the transaction here: https://github.com/mamba-org/mamba/pull/1516
It's not done yet, but the goal is also to support a use case where you make an environment larger and then subtract packages you don't actually need (to ensure that the whole env stays consistent).
Maybe this could be useful in this use case as well, though.
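The "fake installed" idea above can be sketched abstractly: treat the result of an initial solve as already-installed packages, then compute only the additions and removals needed when the spec changes. This is a conceptual illustration only; the names here are invented and the real mamba transaction API is different:

```python
def delta_transaction(
    installed: dict[str, str], wanted: dict[str, str]
) -> tuple[dict[str, str], dict[str, str]]:
    """Compute an incremental transaction as (to_install, to_remove).

    `installed` models the "fake installed" packages from an initial
    solve; `wanted` models the solution for the modified spec. Both map
    package name -> version. Illustrative sketch, not mamba's API.
    """
    # Install anything missing or present at a different version.
    to_install = {n: v for n, v in wanted.items() if installed.get(n) != v}
    # Remove anything no longer wanted (the "subtract packages" case).
    to_remove = {n: v for n, v in installed.items() if n not in wanted}
    return to_install, to_remove
```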
The update mechanism presently used basically builds environments incrementally and additively.
It makes a fake environment (similar to the fake virtual packages) with dependency packages added in and pins defined (overridden versions are allowed). This obviously doesn't quite have the same kind of solve as the trimming approach, but it does allow for the Base, Base+X, Base+Y kinds of environments, where X and Y are mutually exclusive.
A common use case for conda-lock is modifying an existing package dependency spec and regenerating the lock files. For this use case, is it possible to add support for an "incremental" mode that uses the existing lock file as the answer of first resort, short-circuiting a more expensive search? Users could optionally invoke it via

```
conda-lock lock --bootstrap-lock-file <lockfile>
```

or similar.