akrherz closed this issue 5 months ago
Indeed, $ mamba install snappy=1.1.10
and things are happy again.
@akrherz I think you've correctly identified the problem as the recent release of snappy 1.2. Looks like fixes for the incompatibility are in progress, eg https://github.com/conda-forge/snappy-feedstock/pull/36, https://github.com/conda-forge/conda-forge-repodata-patches-feedstock/pull/699, https://github.com/conda-forge/conda-forge-repodata-patches-feedstock/pull/700
When did you last update your environment? https://github.com/conda-forge/conda-forge-repodata-patches-feedstock/pull/700 should have covered everything (including libarrow 15), so it would surprise me if installing snappy 1.2 were still possible.
> When did you last update your environment?
I tried it just now and it still tried to update snappy to 1.2. I usually get these things wrong, but my assumption is that, since I have pyarrow already installed without the metadata patch and since I am running mamba update --all, it won't see the repodata patch that pinned back pyarrow for snappy. I know that mamba update --all is not recommended for this very reason :) I am fine with closing this. Thanks all for the help!
I just tried installing pyarrow
into a fresh environment, and it installs snappy 1.1.10. I then tried mamba update --all
and the snappy version didn't change. Can you try reproducing your issue with a fresh environment?
That's what I get too, but that gets the new pyarrow metadata that my current install does not have.
> but that gets the new pyarrow metadata that my current install does not have.
I do not understand what you're saying here. It sounds like it's fixed for a fresh environment (i.e. this issue is fixed as much as we can fix it). Normally though, even for your current install, mamba update --all should install the correct snappy.
(assuming it uses up-to-date repodata, i.e. you're not sitting behind some stale proxy)
Shrug, running now
$ conda clean --all
$ mamba update --all
...
pkgs/main/linux-64 (check zst) Checked 0.1s
pkgs/main/noarch (check zst) Checked 0.0s
pkgs/r/linux-64 (check zst) Checked 0.0s
pkgs/r/noarch (check zst) Checked 0.0s
pkgs/main/noarch 705.1kB @ 4.4MB/s 0.2s
pkgs/r/linux-64 1.6MB @ 9.4MB/s 0.2s
pkgs/r/noarch 2.1MB @ 7.7MB/s 0.1s
pkgs/main/linux-64 5.9MB @ 22.2MB/s 0.3s
conda-forge/noarch 14.2MB @ 25.6MB/s 0.6s
conda-forge/linux-64 33.6MB @ 12.8MB/s 2.6s
...
- snappy 1.1.10 hdb0a2a9_1 conda-forge Cached
+ snappy 1.2.0 hdb0a2a9_1 conda-forge 42kB
...
If I manually edit conda-meta/libarrow-15.0.2-hb86450c_1_cpu.json and change the snappy dependency to "snappy >=1.1.10,<1.2.0", then mamba update --all won't offer a snappy update. I think this is normal behavior.
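That manual edit can be sketched in Python. This is a hedged illustration, not a supported conda operation: `pin_snappy` is a hypothetical helper, the default constraint is the one quoted above, and editing conda-meta records by hand is at your own risk.

```python
import json

def pin_snappy(meta_path: str, constraint: str = "snappy >=1.1.10,<1.2.0") -> None:
    """Rewrite the snappy entry in an installed package's conda-meta JSON record.
    Hypothetical helper, equivalent to the manual edit described above."""
    with open(meta_path) as f:
        meta = json.load(f)
    # Replace whichever dependency spec names snappy with the pinned constraint.
    meta["depends"] = [
        constraint if dep.split()[0] == "snappy" else dep
        for dep in meta["depends"]
    ]
    with open(meta_path, "w") as f:
        json.dump(meta, f, indent=2)

# Example (path taken from the comment above):
# pin_snappy("conda-meta/libarrow-15.0.2-hb86450c_1_cpu.json")
```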
OK, for reasons I'm not 100% certain about, your conda/mamba seems to prefer the local metadata over the up-to-date one from the repodata. I honestly don't know how this happens, but I just tried mamba install libarrow=15.0.2=hb86450c_1_cpu snappy=1.2
on linux-64, and it gives me a conflict (as it should).
In short, I think this issue can be closed.
I agree, thank you!
Could you please try running ...?
CONDA_LOCAL_REPODATA_TTL=0 conda update --all
@jakirkham Sure, I backed out my manual metadata change, and here is the result:
$ CONDA_LOCAL_REPODATA_TTL=0 conda update --all
Retrieving notices: ...working... done
Channels:
- conda-forge
- defaults
Platform: linux-64
Collecting package metadata (repodata.json): done
Solving environment: done
## Package Plan ##
environment location: /opt/miniconda3/envs/prod
The following packages will be downloaded:
package | build
---------------------------|-----------------
blosc-1.21.5 | hc2324a3_1 48 KB conda-forge
freetype-py-2.3.0 | pyhd8ed1ab_0 58 KB conda-forge
libopenvino-2024.0.0 | h2da1b83_5 4.9 MB conda-forge
libopenvino-auto-batch-plugin-2024.0.0| hb045406_5 108 KB conda-forge
libopenvino-auto-plugin-2024.0.0| hb045406_5 224 KB conda-forge
libopenvino-hetero-plugin-2024.0.0| h5c03a75_5 176 KB conda-forge
libopenvino-intel-cpu-plugin-2024.0.0| h2da1b83_5 10.1 MB conda-forge
libopenvino-intel-gpu-plugin-2024.0.0| h2da1b83_5 8.0 MB conda-forge
libopenvino-ir-frontend-2024.0.0| h5c03a75_5 196 KB conda-forge
libopenvino-onnx-frontend-2024.0.0| h07e8aee_5 1.5 MB conda-forge
libopenvino-paddle-frontend-2024.0.0| h07e8aee_5 679 KB conda-forge
libopenvino-pytorch-frontend-2024.0.0| he02047a_5 1.0 MB conda-forge
libopenvino-tensorflow-frontend-2024.0.0| h39126c6_5 1.2 MB conda-forge
libopenvino-tensorflow-lite-frontend-2024.0.0| he02047a_5 466 KB conda-forge
orc-2.0.0 | h17fec99_1 1005 KB conda-forge
reportlab-4.1.0 | py311h459d7ec_0 2.6 MB conda-forge
snappy-1.2.0 | hdb0a2a9_1 41 KB conda-forge
tweepy-4.8.0 | pyhd8ed1ab_0 59 KB conda-forge
uriparser-0.9.7 | h59595ed_1 47 KB conda-forge
------------------------------------------------------------
Total: 32.4 MB
The following NEW packages will be INSTALLED:
chardet conda-forge/linux-64::chardet-5.2.0-py311h38be061_1
The following packages will be UPDATED:
blosc 1.21.5-h0f2a231_0 --> 1.21.5-hc2324a3_1
libopenvino 2024.0.0-h2e90f83_4 --> 2024.0.0-h2da1b83_5
libopenvino-auto-~ 2024.0.0-hd5fc58b_4 --> 2024.0.0-hb045406_5
libopenvino-auto-~ 2024.0.0-hd5fc58b_4 --> 2024.0.0-hb045406_5
libopenvino-heter~ 2024.0.0-h3ecfda7_4 --> 2024.0.0-h5c03a75_5
libopenvino-intel~ 2024.0.0-h2e90f83_4 --> 2024.0.0-h2da1b83_5
libopenvino-intel~ 2024.0.0-h2e90f83_4 --> 2024.0.0-h2da1b83_5
libopenvino-ir-fr~ 2024.0.0-h3ecfda7_4 --> 2024.0.0-h5c03a75_5
libopenvino-onnx-~ 2024.0.0-h757c851_4 --> 2024.0.0-h07e8aee_5
libopenvino-paddl~ 2024.0.0-h757c851_4 --> 2024.0.0-h07e8aee_5
libopenvino-pytor~ 2024.0.0-h59595ed_4 --> 2024.0.0-he02047a_5
libopenvino-tenso~ 2024.0.0-hca94c1a_4 --> 2024.0.0-h39126c6_5
libopenvino-tenso~ 2024.0.0-h59595ed_4 --> 2024.0.0-he02047a_5
orc 2.0.0-h1e5e2c1_0 --> 2.0.0-h17fec99_1
reportlab 4.0.0-py311h459d7ec_0 --> 4.1.0-py311h459d7ec_0
requests-oauthlib 1.4.0-pyhd8ed1ab_0 --> 2.0.0-pyhd8ed1ab_0
snappy 1.1.10-hdb0a2a9_1 --> 1.2.0-hdb0a2a9_1
The following packages will be DOWNGRADED:
freetype-py 2.4.0-pyhd8ed1ab_0 --> 2.3.0-pyhd8ed1ab_0
tweepy 4.14.0-pyhd8ed1ab_0 --> 4.8.0-pyhd8ed1ab_0
uriparser 0.9.7-hcb278e6_1 --> 0.9.7-h59595ed_1
Proceed ([y]/n)? n
CondaSystemExit: Exiting.
My naive understanding of all this is that conda repodata patches do not impact already installed packages, so their metadata is not updated unless the package gets updated.
If so, it should be possible to do
conda remove --force snappy
conda clean -tipy
conda install snappy
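The understanding above can be shown with a toy sketch. This is a simplified model using plain dicts, not real conda internals: a repodata patch edits the channel's repodata.json, while the conda-meta record written at install time keeps the old constraint until the package is reinstalled.

```python
# Toy model (assumption: plain dicts standing in for conda's data structures).
# The installed record still carries the pre-patch constraint that allows snappy 1.2.
installed_record = {"name": "libarrow", "depends": ["snappy >=1.1.10,<2.0a0"]}

# The repodata patch has already pinned snappy back in the channel's metadata.
patched_repodata = {"libarrow": {"depends": ["snappy >=1.1.10,<1.2.0a0"]}}

def reinstall(name: str) -> dict:
    """Reinstalling writes a fresh conda-meta record from the (patched) repodata."""
    return {"name": name, "depends": list(patched_repodata[name]["depends"])}

# Only after the force remove/reinstall does the local record pick up the pin.
installed_record = reinstall("libarrow")
```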
@jakirkham Thanks for your interest and time in this. I don't wish to waste you folks' valuable time on a somewhat unsupported situation of running conda update --all. Anyway, here's what I got running your request:
$ conda remove --force snappy
## Package Plan ##
environment location: /opt/miniconda3/envs/prod
removed specs:
- snappy
The following packages will be REMOVED:
snappy-1.1.10-hdb0a2a9_1
Proceed ([y]/n)? y
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
$ conda clean -tipy
There are no unused tarball(s) to remove.
Will remove 1 index cache(s).
$ conda install snappy
Channels:
- conda-forge
- defaults
Platform: linux-64
Collecting package metadata (repodata.json): done
Solving environment: done
## Package Plan ##
environment location: /opt/miniconda3/envs/prod
added / updated specs:
- snappy
The following packages will be downloaded:
package | build
---------------------------|-----------------
snappy-1.2.0 | hdb0a2a9_1 41 KB conda-forge
------------------------------------------------------------
Total: 41 KB
The following NEW packages will be INSTALLED:
snappy conda-forge/linux-64::snappy-1.2.0-hdb0a2a9_1
Proceed ([y]/n)? n
I don't understand why this is expected to work, as the snappy metadata is not at issue here; what matters is the metadata of the packages I currently have installed that depend on snappy <2.0.0.
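The point above can be checked directly: scan the environment's conda-meta records for installed packages whose recorded snappy constraint still permits 1.2. This is a hypothetical diagnostic sketch, not a conda API; the `records_allowing` name and the `"<1.2"` substring check are assumptions for illustration.

```python
import glob
import json

def records_allowing(prefix: str, dep_name: str = "snappy") -> list[str]:
    """List installed packages whose conda-meta records depend on dep_name
    without an upper bound below 1.2 (hypothetical diagnostic helper)."""
    hits = []
    for path in glob.glob(f"{prefix}/conda-meta/*.json"):
        with open(path) as f:
            meta = json.load(f)
        for dep in meta.get("depends", []):
            # A crude check: any snappy spec that does not pin below 1.2
            # would let the solver offer snappy 1.2.
            if dep.split()[0] == dep_name and "<1.2" not in dep:
                hits.append(meta["name"])
    return hits

# Example (environment path from the logs above):
# records_allowing("/opt/miniconda3/envs/prod")
```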
This line is concerning
> There are no unused tarball(s) to remove.
Is this package still being used in other environments?
If so, I would repeat the removal procedure in all of them, before cleaning the cache and reinstalling (in all of them).
> There are no unused tarball(s) to remove.
I think this is just a red herring, as I routinely run conda clean --all -y after every update to save space. So the tarball was already gone.
FWIW, and from my understanding, this works to keep my environment from auto-updating snappy: force-reinstall a package that depends on snappy, rather than snappy itself.
$ mamba update --all
...
- snappy 1.1.10 hdb0a2a9_1 conda-forge Cached
+ snappy 1.2.0 hdb0a2a9_1 conda-forge 42kB
...
(I declined to apply this update)
$ mamba remove --force libarrow
$ mamba install libarrow
+ libarrow 15.0.2 h176673d_2_cpu conda-forge 8MB
$ mamba update --all
(snappy not shown for updating)
For completeness, #1366 rebuilt libarrow 15.0.2 with snappy 1.2.0, so with the repodata patch and build 3 now available, there is no issue.
Solution to issue cannot be found in the documentation.
Issue
I attempt to mamba update --all my conda-forge environment on linux-64 each day. After doing so this morning, I ended up with the following error. My ignorant guess is that this update broke things.
Installed packages
Environment info