mamba-org / mamba

The Fast Cross-Platform Package Manager
https://mamba.readthedocs.io
BSD 3-Clause "New" or "Revised" License

Unable to pin pytorch with cuda. #2194

Closed · chebee7i closed this 1 year ago

chebee7i commented 1 year ago

- Troubleshooting docs
- Search tried in issue tracker: pytorch cuda
- Latest version of Mamba
- Tried in Conda? Reproducible with Conda using the experimental (libmamba) solver.

Describe your issue

I am having issues specifying a version of pytorch (1.13) while also ensuring that I get the CUDA build. See the pasted info in the other form fields below. The essential error is:

 Encountered problems while solving:
      - nothing provides cuda 11.6.* needed by pytorch-cuda-11.6-h867d48c_0

when specifying pytorch::pytorch=1.13.*=*cuda*.

How can I pin pytorch to 1.13 and force cuda?
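For reference, the same request as a one-line command (equivalent to the env.yml shown further down; the environment name and the shell quoting are mine):

```console
$ micromamba create -n testenv -c pytorch -c conda-forge \
    "python>=3.8,<3.9" "pytorch::pytorch=1.13.*=*cuda*"
```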

The packages look to be available:

$ micromamba search -c conda-forge cudatoolkit  | grep 11.6
 cudatoolkit 11.6.0   hecad31d_11 conda-forge/linux-64
 cudatoolkit 11.6.0   habf752d_10 conda-forge/linux-64
 cudatoolkit 11.6.0   habf752d_9  conda-forge/linux-64
 cudatoolkit 11.6.0   hecad31d_10 conda-forge/linux-64

$ micromamba search -c pytorch pytorch | grep 3.8 | grep 1.13
 pytorch 1.13.1     py3.8_cpu_0                          pytorch/linux-64
 pytorch 1.13.1     py3.8_cuda11.6_cudnn8.3.2_0          pytorch/linux-64
 pytorch 1.13.1     py3.8_cuda11.7_cudnn8.5.0_0          pytorch/linux-64
 pytorch 1.13.0     py3.8_cuda11.7_cudnn8.5.0_0          pytorch/linux-64
 pytorch 1.13.0     py3.8_cpu_0                          pytorch/linux-64
 pytorch 1.13.0     py3.8_cuda11.6_cudnn8.3.2_0          pytorch/linux-64

When I try a similar install using conda, like so:

CONDA_OVERRIDE_CUDA=11.6 conda env create -f env.yml --experimental-solver=libmamba

then I get a similar error. Note that I am building on a host that does not have a GPU; this is for later installation from a lock file on a host that does have a GPU.
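(Aside: CONDA_OVERRIDE_CUDA is what lets the solver assume a CUDA driver on this GPU-less build host. A quick, hypothetical sanity check, not part of the original session, is to confirm the override shows up as a virtual package:)

```console
# With the override exported, micromamba should list a __cuda=11.6 entry under
# "virtual packages" alongside __glibc/__linux/__unix, even without a GPU present.
CONDA_OVERRIDE_CUDA=11.6 micromamba info
```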


When I do not pin the version and use pytorch::pytorch=*=*cuda* instead, the install succeeds, but it obviously isn't guaranteed to give the package version that I want; in this case, it gives 1.12.1.

$ cat env.yml
name: testenv
channels:
  - pytorch
  - conda-forge
dependencies:
  - python >=3.8,<3.9
  - pytorch::pytorch=*=*cuda*
  # - pytorch::pytorch=1.13.*=*cuda*

$ micromamba create -f env.yml -v
          __  ______ ___  ____ _____ ___  / /_  ____ _
         / / / / __ `__ \/ __ `/ __ `__ \/ __ \/ __ `/
        / /_/ / / / / / / /_/ / / / / / / /_/ / /_/ /
       / .___/_/ /_/ /_/\__,_/_/ /_/ /_/_.___/\__,_/
      /_/

info     libmamba Parsing MatchSpec python >=3.8,<3.9
info     libmamba Parsing MatchSpec pytorch::pytorch=*=*cuda*
info     libmamba Parsing MatchSpec cudatoolkit=11.6
info     libmamba Searching index cache file for repo 'https://conda.anaconda.org/pytorch/linux-64/repodata.json'
info     libmamba Found cache at '/home/username/micromamba/pkgs/cache/ee0ed9e9.json'
pytorch/linux-64                                            Using cache
info     libmamba Searching index cache file for repo 'https://conda.anaconda.org/pytorch/noarch/repodata.json'
info     libmamba Found cache at '/home/username/micromamba/pkgs/cache/edb1952f.json'
pytorch/noarch                                              Using cache
info     libmamba Searching index cache file for repo 'https://conda.anaconda.org/conda-forge/linux-64/repodata.json'
info     libmamba Found cache at '/home/username/micromamba/pkgs/cache/497deca9.json'
conda-forge/linux-64                                        Using cache
info     libmamba Searching index cache file for repo 'https://conda.anaconda.org/conda-forge/noarch/repodata.json'
info     libmamba Found cache at '/home/username/micromamba/pkgs/cache/09cdf8bf.json'
conda-forge/noarch                                          Using cache
info     libmamba All targets to download are cached
info     libmamba Reading cache files '/home/username/micromamba/pkgs/cache/ee0ed9e9.*' for repo index 'https://conda.anaconda.org/pytorch/linux-64'
info     libmamba Reading cache files '/home/username/micromamba/pkgs/cache/edb1952f.*' for repo index 'https://conda.anaconda.org/pytorch/noarch'
info     libmamba Reading cache files '/home/username/micromamba/pkgs/cache/497deca9.*' for repo index 'https://conda.anaconda.org/conda-forge/linux-64'
info     libmamba Reading cache files '/home/username/micromamba/pkgs/cache/09cdf8bf.*' for repo index 'https://conda.anaconda.org/conda-forge/noarch'
info     libmamba Adding package record to repo __archspec
info     libmamba Adding package record to repo __glibc
info     libmamba Adding package record to repo __linux
info     libmamba Adding package record to repo __unix
info     libmamba Parsing MatchSpec python >=3.8,<3.9
info     libmamba Parsing MatchSpec python >=3.8,<3.9
info     libmamba Adding job: python >=3.8,<3.9
info     libmamba Parsing MatchSpec pytorch::pytorch=*=*cuda*
info     libmamba Parsing MatchSpec pytorch::pytorch=*=*cuda*
info     libmamba Parsing MatchSpec cudatoolkit=11.6
info     libmamba Parsing MatchSpec cudatoolkit=11.6
info     libmamba Adding job: cudatoolkit =11.6*
info     libmamba Problem count: 0
info     libmamba Found python version in packages to be installed 3.8.15

Transaction

  Prefix: /home/username/micromamba/envs/testenv

  Updating specs:

   - python[version='>=3.8,<3.9']
   - pytorch::pytorch=*[build=*cuda*]
   - cudatoolkit=11.6

  Package                Version  Build                        Channel                   Size
───────────────────────────────────────────────────────────────────────────────────────────────
  Install:
───────────────────────────────────────────────────────────────────────────────────────────────

  + _libgcc_mutex            0.1  conda_forge                  conda-forge/linux-64       3kB
  + _openmp_mutex            4.5  2_kmp_llvm                   conda-forge/linux-64       6kB
  + blas                   2.116  mkl                          conda-forge/linux-64      13kB
  + blas-devel             3.9.0  16_linux64_mkl               conda-forge/linux-64      13kB
  + bzip2                  1.0.8  h7f98852_4                   conda-forge/linux-64     496kB
  + ca-certificates    2022.12.7  ha878542_0                   conda-forge/linux-64     146kB
  + cudatoolkit           11.6.0  hecad31d_11                  conda-forge/linux-64     568MB
  + icu                     70.1  h27087fc_0                   conda-forge/linux-64      14MB
  + ld_impl_linux-64        2.39  hcc3a1bd_1                   conda-forge/linux-64     691kB
  + libblas                3.9.0  16_linux64_mkl               conda-forge/linux-64      13kB
  + libcblas               3.9.0  16_linux64_mkl               conda-forge/linux-64      13kB
  + libffi                 3.4.2  h7f98852_5                   conda-forge/linux-64      58kB
  + libgcc-ng             12.2.0  h65d4601_19                  conda-forge/linux-64     954kB
  + libgfortran-ng        12.2.0  h69a702a_19                  conda-forge/linux-64      23kB
  + libgfortran5          12.2.0  h337968e_19                  conda-forge/linux-64       2MB
  + libhwloc               2.8.0  h32351e8_1                   conda-forge/linux-64       3MB
  + libiconv                1.17  h166bdaf_0                   conda-forge/linux-64       1MB
  + liblapack              3.9.0  16_linux64_mkl               conda-forge/linux-64      13kB
  + liblapacke             3.9.0  16_linux64_mkl               conda-forge/linux-64      13kB
  + libnsl                 2.0.0  h7f98852_0                   conda-forge/linux-64      31kB
  + libsqlite             3.40.0  h753d276_0                   conda-forge/linux-64     810kB
  + libstdcxx-ng          12.2.0  h46fd767_19                  conda-forge/linux-64       4MB
  + libuuid               2.32.1  h7f98852_1000                conda-forge/linux-64      28kB
  + libxml2               2.10.3  h7463322_0                   conda-forge/linux-64     773kB
  + libzlib               1.2.13  h166bdaf_4                   conda-forge/linux-64      66kB
  + llvm-openmp           15.0.6  he0ac6c6_0                   conda-forge/linux-64       3MB
  + mkl                 2022.1.0  h84fe81f_915                 conda-forge/linux-64     209MB
  + mkl-devel           2022.1.0  ha770c72_916                 conda-forge/linux-64      26kB
  + mkl-include         2022.1.0  h84fe81f_915                 conda-forge/linux-64     763kB
  + ncurses                  6.3  h27087fc_1                   conda-forge/linux-64       1MB
  + openssl                3.0.7  h0b41bf4_1                   conda-forge/linux-64       3MB
  + pip                   22.3.1  pyhd8ed1ab_0                 conda-forge/noarch         2MB
  + python                3.8.15  h4a9ceb5_0_cpython           conda-forge/linux-64      21MB
  + pytorch               1.12.1  py3.8_cuda11.6_cudnn8.3.2_0  pytorch/linux-64           1GB
  + pytorch-mutex            1.0  cuda                         pytorch/noarch             3kB
  + readline               8.1.2  h0f457ee_0                   conda-forge/linux-64     298kB
  + setuptools            65.6.3  pyhd8ed1ab_0                 conda-forge/noarch       634kB
  + tbb                 2021.7.0  h924138e_1                   conda-forge/linux-64       2MB
  + tk                    8.6.12  h27826a3_0                   conda-forge/linux-64       3MB
  + typing_extensions      4.4.0  pyha770c72_0                 conda-forge/noarch        30kB
  + wheel                 0.38.4  pyhd8ed1ab_0                 conda-forge/noarch        33kB
  + xz                     5.2.6  h166bdaf_0                   conda-forge/linux-64     418kB

  Summary:

  Install: 42 packages
  Total download: 2GB

───────────────────────────────────────────────────────────────────────────────────────────────

Confirm changes: [Y/n]

mamba info / micromamba info

                                           __
          __  ______ ___  ____ _____ ___  / /_  ____ _
         / / / / __ `__ \/ __ `/ __ `__ \/ __ \/ __ `/
        / /_/ / / / / / / /_/ / / / / / / /_/ / /_/ /
       / .___/_/ /_/ /_/\__,_/_/ /_/ /_/_.___/\__,_/
      /_/

            environment : None (not found)
           env location : -
      user config files : /home/username/.mambarc
 populated config files :
       libmamba version : 1.1.0
     micromamba version : 1.1.0
           curl version : libcurl/7.86.0 OpenSSL/1.1.1s zlib/1.2.13 libssh2/1.10.0 nghttp2/1.47.0
     libarchive version : libarchive 3.6.1 zlib/1.2.13 bz2lib/1.0.8 libzstd/1.5.2
       virtual packages : __unix=0=0
                          __linux=4.18.0=0
                          __glibc=2.28=0
                          __archspec=1=x86_64
               channels :
       base environment : /home/username/micromamba
               platform : linux-64

Logs

                                           __
          __  ______ ___  ____ _____ ___  / /_  ____ _
         / / / / __ `__ \/ __ `/ __ `__ \/ __ \/ __ `/
        / /_/ / / / / / / /_/ / / / / / / /_/ / /_/ /
       / .___/_/ /_/ /_/\__,_/_/ /_/ /_/_.___/\__,_/
      /_/

info     libmamba Parsing MatchSpec python >=3.8,<3.9
info     libmamba Parsing MatchSpec pytorch::pytorch=1.13.*=*cuda*
info     libmamba Parsing MatchSpec cudatoolkit=11.6
info     libmamba Searching index cache file for repo 'https://conda.anaconda.org/pytorch/linux-64/repodata.json'
info     libmamba Found cache at '/home/username/micromamba/pkgs/cache/ee0ed9e9.json'
pytorch/linux-64                                            Using cache
info     libmamba Searching index cache file for repo 'https://conda.anaconda.org/pytorch/noarch/repodata.json'
info     libmamba Found cache at '/home/username/micromamba/pkgs/cache/edb1952f.json'
pytorch/noarch                                              Using cache
info     libmamba Searching index cache file for repo 'https://conda.anaconda.org/conda-forge/linux-64/repodata.json'
info     libmamba Found cache at '/home/username/micromamba/pkgs/cache/497deca9.json'
conda-forge/linux-64                                        Using cache
info     libmamba Searching index cache file for repo 'https://conda.anaconda.org/conda-forge/noarch/repodata.json'
info     libmamba Found cache at '/home/username/micromamba/pkgs/cache/09cdf8bf.json'
conda-forge/noarch                                          Using cache
info     libmamba All targets to download are cached
info     libmamba Reading cache files '/home/username/micromamba/pkgs/cache/ee0ed9e9.*' for repo index 'https://conda.anaconda.org/pytorch/linux-64'
info     libmamba Reading cache files '/home/username/micromamba/pkgs/cache/edb1952f.*' for repo index 'https://conda.anaconda.org/pytorch/noarch'
info     libmamba Reading cache files '/home/username/micromamba/pkgs/cache/497deca9.*' for repo index 'https://conda.anaconda.org/conda-forge/linux-64'
info     libmamba Reading cache files '/home/username/micromamba/pkgs/cache/09cdf8bf.*' for repo index 'https://conda.anaconda.org/conda-forge/noarch'
info     libmamba Adding package record to repo __archspec
info     libmamba Adding package record to repo __glibc
info     libmamba Adding package record to repo __linux
info     libmamba Adding package record to repo __unix
info     libmamba Parsing MatchSpec python >=3.8,<3.9
info     libmamba Parsing MatchSpec python >=3.8,<3.9
info     libmamba Adding job: python >=3.8,<3.9
info     libmamba Parsing MatchSpec pytorch::pytorch=1.13.*=*cuda*
info     libmamba Parsing MatchSpec pytorch::pytorch=1.13.*=*cuda*
info     libmamba Parsing MatchSpec cudatoolkit=11.6
info     libmamba Parsing MatchSpec cudatoolkit=11.6
info     libmamba Adding job: cudatoolkit =11.6*
info     libmamba Problem count: 1
error    libmamba Could not solve for environment specs
    Encountered problems while solving:
      - nothing provides cuda 11.6.* needed by pytorch-cuda-11.6-h867d48c_0

    The environment can't be solved, aborting the operation

info     libmamba Freeing solver.
info     libmamba Freeing pool.
critical libmamba Could not solve for environment specs

environment.yml

name: testenv
channels:
  - pytorch
  - conda-forge
dependencies:
  - python >=3.8,<3.9
    #- pytorch::pytorch=*=*cuda*
  - pytorch::pytorch=1.13.*=*cuda*

~/.condarc

None
chebee7i commented 1 year ago

I also tried a more specific version for torch: pytorch::pytorch=1.13.0=py3.8_cuda11.6*.

$ micromamba create -f env.yml -vvvv |& grep pytorch
info     libmamba Parsing MatchSpec pytorch::pytorch=1.13.0=py3.8_cuda11.6*
info     libmamba Searching index cache file for repo 'https://conda.anaconda.org/pytorch/linux-64/repodata.json'
pytorch/linux-64                                            Using cache
info     libmamba Searching index cache file for repo 'https://conda.anaconda.org/pytorch/noarch/repodata.json'
pytorch/noarch                                              Using cache
info     libmamba Reading cache files '/home/username/micromamba/pkgs/cache/ee0ed9e9.*' for repo index 'https://conda.anaconda.org/pytorch/linux-64'
info     libmamba Reading cache files '/home/username/micromamba/pkgs/cache/edb1952f.*' for repo index 'https://conda.anaconda.org/pytorch/noarch'
info     libmamba Parsing MatchSpec pytorch::pytorch=1.13.0=py3.8_cuda11.6*
info     libmamba Parsing MatchSpec pytorch::pytorch=1.13.0=py3.8_cuda11.6*
info     libsolv  job: install pytorch-1.13.0-py3.8_cuda11.6_cudnn8.3.2_0
info     libsolv      pytorch-1.13.0-py3.8_cuda11.6_cudnn8.3.2_0 [1078] (w1)
info     libsolv  conflicting pytorch-cuda-11.6-h867d48c_1 (assertion)
info     libsolv  conflicting pytorch-cuda-11.6-h867d48c_0 (assertion)
info     libsolv  installing  pytorch-1.13.0-py3.8_cuda11.6_cudnn8.3.2_0 (assertion)
info     libsolv  propagate decision -2325:    !pytorch-cuda-11.6-h867d48c_1 [2325] Conflict.level1
info     libsolv  propagate decision -2324:    !pytorch-cuda-11.6-h867d48c_0 [2324] Conflict.level1
info     libsolv      !pytorch-1.13.0-py3.8_cuda11.6_cudnn8.3.2_0 [1078] (w1) Install.level1
info     libsolv      pytorch-cuda-11.6-h867d48c_0 [2324] (w2) Conflict.level1
info     libsolv      pytorch-cuda-11.6-h867d48c_1 [2325] Conflict.level1
info     libsolv      pytorch-1.13.0-py3.8_cuda11.6_cudnn8.3.2_0 [1078] (w1) Install.level1
info     libsolv      !pytorch-cuda-11.6-h867d48c_0 [2324] (w1) Conflict.level1
info     libsolv      !pytorch-cuda-11.6-h867d48c_1 [2325] (w1) Conflict.level1
info     libsolv  conflicting pytorch-cuda-11.6-h867d48c_1 (assertion)
info     libsolv  conflicting pytorch-cuda-11.6-h867d48c_0 (assertion)
info     libsolv  propagate decision -2325:    !pytorch-cuda-11.6-h867d48c_1 [2325] Conflict.level1
info     libsolv  propagate decision -2324:    !pytorch-cuda-11.6-h867d48c_0 [2324] Conflict.level1
info     libsolv      !pytorch-1.13.0-py3.8_cuda11.6_cudnn8.3.2_0 [1078] (w1)
info     libsolv      pytorch-cuda-11.6-h867d48c_0 [2324] (w2) Conflict.level1
info     libsolv      pytorch-cuda-11.6-h867d48c_1 [2325] Conflict.level1
info     libsolv      -> decided to conflict pytorch-1.13.0-py3.8_cuda11.6_cudnn8.3.2_0
info     libsolv  propagate decision -1078:    !pytorch-1.13.0-py3.8_cuda11.6_cudnn8.3.2_0 [1078] Conflict.level1
      - nothing provides cuda 11.6.* needed by pytorch-cuda-11.6-h867d48c_0
jonashaag commented 1 year ago

Great report!

Want to try the new error messages? See https://github.com/mamba-org/mamba/issues/2078

chebee7i commented 1 year ago

Want to try the new error messages? See #2078

https://github.com/mamba-org/mamba/issues/2078#issuecomment-1368020953

The relevant bit that is new seems to be:

    =================================== Experimental messages (new) ====================================

critical libmamba Invalid dependency info: <NULL>

Is this saying that pytorch::pytorch=1.13.0=py3.8_cuda11.6* is an invalid package specification? The "dependency info" part makes me think not.

wolfv commented 1 year ago

Does installing cuda 11.6.* work? Is the bug that it should read cuda 11.6?

wolfv commented 1 year ago

Note that the micromamba search only lists cudatoolkit and not cuda.

wolfv commented 1 year ago

I think you need the nvidia channel: https://anaconda.org/nvidia/cuda
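For example, to check that the cuda metapackage is actually published on that channel (same kind of search used earlier in this thread; output omitted):

```console
micromamba search -c nvidia cuda
```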

chebee7i commented 1 year ago

Interesting. That does solve the issue:

```console
CONDA_OVERRIDE_CUDA=11.6 micromamba create -f env.yml
                                           __
          __  ______ ___  ____ _____ ___  / /_  ____ _
         / / / / __ `__ \/ __ `/ __ `__ \/ __ \/ __ `/
        / /_/ / / / / / / /_/ / / / / / / /_/ / /_/ /
       / .___/_/ /_/ /_/\__,_/_/ /_/ /_/_.___/\__,_/
      /_/

pytorch/noarch                                   No change
nvidia/noarch                             2.6kB @   7.5kB/s  0.3s
pytorch/linux-64                                 No change
nvidia/linux-64                          96.3kB @ 239.4kB/s  0.4s
conda-forge/noarch                       10.8MB @   3.7MB/s  2.6s
conda-forge/linux-64                     28.8MB @   4.3MB/s  6.9s

Transaction

  Prefix: /home/username/micromamba/envs/testenv

  Updating specs:

   - python[version='>=3.8,<3.9']
   - pytorch::pytorch==1.13.0[build=py3.8_cuda11.6*]
   - cudatoolkit=11.6

  Package                    Version      Build                        Channel                 Size
────────────────────────────────────────────────────────────────────────────────────────────────────
  Install:
────────────────────────────────────────────────────────────────────────────────────────────────────

  + _libgcc_mutex              0.1          conda_forge                  conda-forge/linux-64  Cached
  + _openmp_mutex              4.5          2_kmp_llvm                   conda-forge/linux-64  6kB
  + blas                       2.116        mkl                          conda-forge/linux-64  13kB
  + blas-devel                 3.9.0        16_linux64_mkl               conda-forge/linux-64  13kB
  + bzip2                      1.0.8        h7f98852_4                   conda-forge/linux-64  Cached
  + ca-certificates            2022.12.7    ha878542_0                   conda-forge/linux-64  Cached
  + cuda                       11.6.1       0                            nvidia/linux-64       1kB
  + cuda-cccl                  11.6.55      hf6102b2_0                   nvidia/linux-64       1MB
  + cuda-command-line-tools    11.6.2       0                            nvidia/linux-64       1kB
  + cuda-compiler              11.6.2       0                            nvidia/linux-64       1kB
  + cuda-cudart                11.6.55      he381448_0                   nvidia/linux-64       198kB
  + cuda-cudart-dev            11.6.55      h42ad0f4_0                   nvidia/linux-64       1MB
  + cuda-cuobjdump             11.6.124     h2eeebcb_0                   nvidia/linux-64       138kB
  + cuda-cupti                 11.6.124     h86345e5_0                   nvidia/linux-64       23MB
  + cuda-cuxxfilt              11.6.124     hecbf4f6_0                   nvidia/linux-64       290kB
  + cuda-driver-dev            11.6.55      0                            nvidia/linux-64       17kB
  + cuda-gdb                   12.0.90      0                            nvidia/linux-64       6MB
  + cuda-libraries             11.6.1       0                            nvidia/linux-64       2kB
  + cuda-libraries-dev         11.6.1       0                            nvidia/linux-64       2kB
  + cuda-memcheck              11.8.86      0                            nvidia/linux-64       172kB
  + cuda-nsight                12.0.78      0                            nvidia/linux-64       119MB
  + cuda-nsight-compute        12.0.0       0                            nvidia/linux-64       1kB
  + cuda-nvcc                  11.6.124     hbba6d2d_0                   nvidia/linux-64       44MB
  + cuda-nvdisasm              12.0.76      0                            nvidia/linux-64       50MB
  + cuda-nvml-dev              11.6.55      haa9ef22_0                   nvidia/linux-64       67kB
  + cuda-nvprof                12.0.90      0                            nvidia/linux-64       5MB
  + cuda-nvprune               11.6.124     he22ec0a_0                   nvidia/linux-64       66kB
  + cuda-nvrtc                 11.6.124     h020bade_0                   nvidia/linux-64       18MB
  + cuda-nvrtc-dev             11.6.124     h249d397_0                   nvidia/linux-64       18MB
  + cuda-nvtx                  11.6.124     h0630a44_0                   nvidia/linux-64       59kB
  + cuda-nvvp                  12.0.90      0                            nvidia/linux-64       120MB
  + cuda-runtime               11.6.1       0                            nvidia/linux-64       1kB
  + cuda-samples               11.6.101     h8efea70_0                   nvidia/linux-64       5kB
  + cuda-sanitizer-api         12.0.90      0                            nvidia/linux-64       17MB
  + cuda-toolkit               11.6.1       0                            nvidia/linux-64       1kB
  + cuda-tools                 11.6.1       0                            nvidia/linux-64       1kB
  + cuda-visual-tools          11.6.1       0                            nvidia/linux-64       1kB
  + cudatoolkit                11.6.0       habf752d_9                   nvidia/linux-64       861MB
  + gds-tools                  1.5.0.59     0                            nvidia/linux-64       43MB
  + icu                        70.1         h27087fc_0                   conda-forge/linux-64  14MB
  + ld_impl_linux-64           2.39         hcc3a1bd_1                   conda-forge/linux-64  Cached
  + libblas                    3.9.0        16_linux64_mkl               conda-forge/linux-64  13kB
  + libcblas                   3.9.0        16_linux64_mkl               conda-forge/linux-64  13kB
  + libcublas                  11.9.2.110   h5e84587_0                   nvidia/linux-64       315MB
  + libcublas-dev              11.9.2.110   h5c901ab_0                   nvidia/linux-64       326MB
  + libcufft                   10.7.1.112   hf425ae0_0                   nvidia/linux-64       98MB
  + libcufft-dev               10.7.1.112   ha5ce4c0_0                   nvidia/linux-64       207MB
  + libcufile                  1.5.0.59     0                            nvidia/linux-64       772kB
  + libcufile-dev              1.5.0.59     0                            nvidia/linux-64       13kB
  + libcurand                  10.3.1.50    0                            nvidia/linux-64       54MB
  + libcurand-dev              10.3.1.50    0                            nvidia/linux-64       460kB
  + libcusolver                11.3.4.124   h33c3c4e_0                   nvidia/linux-64       91MB
  + libcusparse                11.7.2.124   h7538f96_0                   nvidia/linux-64       169MB
  + libcusparse-dev            11.7.2.124   hbbe9722_0                   nvidia/linux-64       345MB
  + libffi                     3.4.2        h7f98852_5                   conda-forge/linux-64  Cached
  + libgcc-ng                  12.2.0       h65d4601_19                  conda-forge/linux-64  Cached
  + libgfortran-ng             12.2.0       h69a702a_19                  conda-forge/linux-64  23kB
  + libgfortran5               12.2.0       h337968e_19                  conda-forge/linux-64  2MB
  + libhwloc                   2.8.0        h32351e8_1                   conda-forge/linux-64  3MB
  + libiconv                   1.17         h166bdaf_0                   conda-forge/linux-64  1MB
  + liblapack                  3.9.0        16_linux64_mkl               conda-forge/linux-64  13kB
  + liblapacke                 3.9.0        16_linux64_mkl               conda-forge/linux-64  13kB
  + libnpp                     11.6.3.124   hd2722f0_0                   nvidia/linux-64       124MB
  + libnpp-dev                 11.6.3.124   h3c42840_0                   nvidia/linux-64       121MB
  + libnsl                     2.0.0        h7f98852_0                   conda-forge/linux-64  Cached
  + libnvjpeg                  11.6.2.124   hd473ad6_0                   nvidia/linux-64       2MB
  + libnvjpeg-dev              11.6.2.124   hb5906b9_0                   nvidia/linux-64       2MB
  + libsqlite                  3.40.0       h753d276_0                   conda-forge/linux-64  Cached
  + libstdcxx-ng               12.2.0       h46fd767_19                  conda-forge/linux-64  Cached
  + libuuid                    2.32.1       h7f98852_1000                conda-forge/linux-64  Cached
  + libxml2                    2.10.3       h7463322_0                   conda-forge/linux-64  773kB
  + libzlib                    1.2.13       h166bdaf_4                   conda-forge/linux-64  Cached
  + llvm-openmp                15.0.6       he0ac6c6_0                   conda-forge/linux-64  3MB
  + mkl                        2022.1.0     h84fe81f_915                 conda-forge/linux-64  209MB
  + mkl-devel                  2022.1.0     ha770c72_916                 conda-forge/linux-64  26kB
  + mkl-include                2022.1.0     h84fe81f_915                 conda-forge/linux-64  763kB
  + ncurses                    6.3          h27087fc_1                   conda-forge/linux-64  Cached
  + nsight-compute             2022.4.0.15  0                            nvidia/linux-64       801MB
  + openssl                    3.0.7        h0b41bf4_1                   conda-forge/linux-64  Cached
  + pip                        22.3.1       pyhd8ed1ab_0                 conda-forge/noarch    Cached
  + python                     3.8.15       h4a9ceb5_0_cpython           conda-forge/linux-64  Cached
  + pytorch                    1.13.0       py3.8_cuda11.6_cudnn8.3.2_0  pytorch/linux-64      1GB
  + pytorch-cuda               11.6         h867d48c_1                   pytorch/noarch        3kB
  + pytorch-mutex              1.0          cuda                         pytorch/noarch        3kB
  + readline                   8.1.2        h0f457ee_0                   conda-forge/linux-64  Cached
  + setuptools                 65.6.3       pyhd8ed1ab_0                 conda-forge/noarch    Cached
  + tbb                        2021.7.0     h924138e_1                   conda-forge/linux-64  2MB
  + tk                         8.6.12       h27826a3_0                   conda-forge/linux-64  Cached
  + typing_extensions          4.4.0        pyha770c72_0                 conda-forge/noarch    30kB
  + wheel                      0.38.4       pyhd8ed1ab_0                 conda-forge/noarch    Cached
  + xz                         5.2.6        h166bdaf_0                   conda-forge/linux-64  Cached

  Summary:

  Install: 91 packages
  Total download: 6GB

────────────────────────────────────────────────────────────────────────────────────────────────────

Confirm changes: [Y/n]
```
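The env.yml used for that run isn't pasted here; presumably it was the pinned spec from before with the nvidia channel added, i.e. roughly:

```console
$ cat env.yml
name: testenv
channels:
  - pytorch
  - nvidia        # assumed: added per the suggestion above
  - conda-forge
dependencies:
  - python >=3.8,<3.9
  - pytorch::pytorch=1.13.0=py3.8_cuda11.6*
```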

However, note that when I install with pytorch::pytorch=*=*cuda*, the successfully built environment has only cudatoolkit and not cuda. Also, I have never needed to add the nvidia channel with conda (or with the successfully built mamba environment).

So it seems like there's a discrepancy here that still needs an explanation. Any thoughts?

wolfv commented 1 year ago

Not really. If the pytorch-cuda package depends on cuda, it depends on cuda. It could be that they did things differently in the past, or that you got the cuda package from somewhere else (e.g. the defaults channel?).

chebee7i commented 1 year ago

I can say confidently that my previous environments did not have the cuda package explicitly installed. So maybe it's just a requirements change.