prefix-dev / pixi

Package management made easy
https://pixi.sh
BSD 3-Clause "New" or "Revised" License

Tracking issue: PyPI dependencies pixi can't manage yet. #771

Open ruben-arts opened 7 months ago

ruben-arts commented 7 months ago

There are a few PyPI packages pixi can't install yet, where pip can.

Please paste your examples in this issue so we have a list of known packages we can track, test and benchmark with along the way.

Information we would like:

  1. What did you run and what was the outcome? e.g. pixi add --pypi packagex. If it doesn't reproduce in an empty environment, please share the pixi.toml that recreates the issue.

  2. What error did pixi return? e.g.

    × RECORD file doesn't match wheel contents: missing hash for mediapipe/version.txt (expected sha256=-fE2KU)
  3. Can pip install the package? Does pip install packagex work?

  4. What platform are you on? e.g. linux-64

  5. Did you find a workaround? If so, please explain, e.g. building it into a conda package, using a custom fork, etc.

Your input would greatly help us improve pixi's experience! Thanks in advance! :heart:
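For reference, a minimal empty project to reproduce against could look like the sketch below; the name, platform, and Python version are just placeholders, and packagex stands for whatever package fails.

[project]
name = "pypi-repro"          # placeholder name
version = "0.1.0"
channels = ["conda-forge"]
platforms = ["linux-64"]     # replace with your platform

[dependencies]
python = "3.11.*"            # placeholder Python version

From there, pixi add --pypi packagex reproduces the report in an otherwise empty environment.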

pablovela5620 commented 7 months ago

Mediapipe

  1. pixi add --pypi mediapipe
  2. × RECORD file doesn't match wheel contents: missing hash for mediapipe/version.txt (expected sha256=-fE2KU)
  3. python -m pip install mediapipe does work
  4. platform osx-arm64
  5. the workaround was just to use pip to install via pixi tasks
[project]
name = "ipdscan"
version = "0.1.0"
description = "Add a short description here"
authors = ["pablovela5620 <pablovela5620@gmail.com>"]
channels = ["conda-forge"]
platforms = ["osx-arm64"]

[tasks]
mp-install = "python -m pip install mediapipe"

[dependencies]
python = "3.11.*"
pip = ">=23.3.2,<23.4"
rerun-sdk = ">=0.12.0,<0.13"
requests = ">=2.31.0,<2.32"
tqdm = ">=4.66.1,<4.67"

[pypi-dependencies]
imutils = "*"
tylerjw commented 7 months ago

I wanted to use this library that is packaged in pypi: https://github.com/spirali/elsie

Here is the error I was seeing when trying to use pixi add:

 WARN rattler_installs_packages::index::package_database: errors while processing source distributions:
  × failed to resolve `pypi-dependencies`, due to underlying error
  ╰─▶ No metadata could be extracted for the following available artifacts:
        - lxml-4.6.5.tar.gz

Error:   × error while processing source distribution 'lxml-4.6.5.tar.gz':
  │  could not build wheel: <string>:67: DeprecationWarning: pkg_resources is deprecated as an API. See https://
  │ setuptools.pypa.io/en/latest/pkg_resources.html
  │ 
  help: Probably an error during processing of source distributions. Please check the error message above.

Posting in the discord channel got it working with this fix:

pixi add python lxml
pixi add --pypi elsie

The reason for this is:

there seems to be an issue with one of the lxml source distributions. This means you have to build the package locally, but that is not working because of the error it is giving you. pixi reports back on erroneous packages, whereas pip probably continues trying other versions.
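In other words, the resulting pixi.toml ends up roughly like this (a sketch; the open version specs are placeholders that pixi pins when you run the add commands): lxml comes from conda-forge, so its broken source distribution never has to be built, and only elsie is resolved from PyPI.

[dependencies]
python = "*"
lxml = "*"      # conda-forge build, avoids the broken sdist

[pypi-dependencies]
elsie = "*"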

liquidcarbon commented 7 months ago

pixi add duckdb / pixi add --pypi duckdb ❌ vs pixi run pip install duckdb ✅ on Windows

PS C:\code> pixi init duckdb-pip
✔ Initialized project in C:\code\duckdb-pip
PS C:\code> cd .\duckdb-pip\
PS C:\code\duckdb-pip> pixi add python=3.11
✔ Added python=3.11
PS C:\code\duckdb-pip> pixi add duckdb     
  × could not determine any available versions for duckdb on win-64. Either the package could not be found or version constraints on other
  │ dependencies result in a conflict.
  ╰─▶ Cannot solve the request because of: No candidates were found for duckdb *.

PS C:\code\duckdb-pip> pixi add --pypi duckdb
  × could not build wheel: warning: no files found matching '*.h' under directory 'duckdb'
  │ warning: no files found matching '*.hpp' under directory 'duckdb'
  │ warning: no files found matching '*.cpp' under directory 'duckdb'
  │ warning: no files found matching '*.h' under directory 'src'
  │ warning: manifest_maker: MANIFEST.in, line 6: 'recursive-include' expects <dir> <pattern1> <pattern2> ...
  │
  │ C:\Users\AKISLU~1\AppData\Local\Temp\.tmpkvrxrc\venv\Lib\site-packages\setuptools\command\build_py.py:207: _Warning: Package 'duckdb-      
  │ stubs.value' is absent from the `packages` configuration.
  │ !!
  │
  │         ********************************************************************************
  │         ############################
  │         # Package would be ignored #
  │         ############################
  │         Python recognizes 'duckdb-stubs.value' as an importable package[^1],
  │         but it is absent from setuptools' `packages` configuration.
  │
  │         This leads to an ambiguous overall configuration. If you want to distribute this
  │         package, please make sure that 'duckdb-stubs.value' is explicitly added
  │         to the `packages` configuration field.
  │
  │         Alternatively, you can also rely on setuptools' discovery methods
  │         (for example by using `find_namespace_packages(...)`/`find_namespace:`
  │         instead of `find_packages(...)`/`find:`).
  │
  │         You can read more about "package discovery" on setuptools documentation page:
  │
  │         - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html
  │
  │         If you don't want 'duckdb-stubs.value' to be distributed and are
  │         already explicitly excluding 'duckdb-stubs.value' via
  │         `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`,
  │         you can try to use `exclude_package_data`, or `include-package-data=False` in
  │         combination with a more fine grained `package-data` configuration.
  │
  │         You can read more about "package data files" on setuptools documentation page:
  │
  │         - https://setuptools.pypa.io/en/latest/userguide/datafiles.html
  │
  │
  │         [^1]: For Python, any directory (with suitable naming) can be imported,
  │               even if it does not contain any `.py` files.
  │               On the other hand, currently there is no concept of package data
  │               directory, all directories are treated like packages.
  │         ********************************************************************************
  │
  │ !!
  │   check.warn(importable)
  │ [... the same "Package would be ignored" warning repeats for 'duckdb-stubs.value.constant' and 'duckdb.experimental' ...]
  │ error: Microsoft Visual C++ 14.0 or greater is required. Get it with "Microsoft C++ Build Tools": https://visualstudio.microsoft.com/      
  │ visual-cpp-build-tools/
  │

PS C:\code\duckdb-pip> pixi add pip          
✔ Added pip
PS C:\code\duckdb-pip> pixi run pip install duckdb
Collecting duckdb
  Downloading duckdb-0.9.2-cp311-cp311-win_amd64.whl.metadata (798 bytes)
Downloading duckdb-0.9.2-cp311-cp311-win_amd64.whl (10.3 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 10.3/10.3 MB 21.8 MB/s eta 0:00:00
Installing collected packages: duckdb
Successfully installed duckdb-0.9.2
liquidcarbon commented 7 months ago
  1. the workaround was just to use pip to install via pixi tasks

@pablovela5620 I am also finding this necessary quite often, probably more often than @ruben-arts would like :)

In this pattern the environment definition is fragmented between pixi.toml tasks and pixi.lock. It could work, especially with pinned pip installs in tasks, but is that the intention?

ruben-arts commented 7 months ago

In this pattern the environment definition is fragmented between pixi.toml tasks and pixi.lock. It could work, especially with pinned pip installs in tasks, but is that the intention?

@liquidcarbon This is indeed not the UX we want, but we simply need to develop more to support all PyPI packages. It's a weird bunch of requirements we have to support to be equivalent to pip. So please keep posting non-working packages!

ruben-arts commented 7 months ago

@liquidcarbon the duckdb package on conda-forge has been continued as python-duckdb, which should be available on Windows. We'll keep the example for PyPI to test more!

tdejager commented 7 months ago

@liquidcarbon I cannot reproduce this, neither on macOS nor on Windows. It's strange that it does not select the .whl for some reason.

tdejager commented 7 months ago

@tylerjw it actually also fails for the same version with pip on Apple silicon; like pip, we error out when we cannot build the first source distribution.

tdejager commented 7 months ago

@pablovela5620 so it seems mediapipe 10.9 is a package with an invalid RECORD file; I manually checked it and it's incorrect.

This is mentioned in the PyPA spec:

During extraction, wheel installers verify all the hashes in RECORD against the file contents. Apart from RECORD and its signatures, installation will fail if any file in the archive is not both mentioned and correctly hashed in RECORD.

Mediapipe has a version.txt that is not mentioned in the RECORD for the 10.9 release.

Which in this case triggers the error. I'm unsure why pip does not do this, but I feel it's good to adhere to the standard here.

In any case, mediapipe 10.8 does seem to work, you could use that instead.

Also see: https://github.com/google/mediapipe/issues/5025
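A sketch of that pin in pixi.toml; the exact PyPI version string (0.10.8 for the release referred to as 10.8 above) is an assumption:

[pypi-dependencies]
mediapipe = "==0.10.8"   # last release without the broken RECORD, per the comment above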

liquidcarbon commented 7 months ago

@liquidcarbon I cannot reproduce this, neither on macOS nor on Windows. It's strange that it does not select the .whl for some reason.

@tdejager Just tried on Win10, original comment was on Win11 same thing:

PS C:\Users\a\Desktop\code\duckdb-pip> pixi add python=3.11
✔ Added python 3.11
PS C:\Users\a\Desktop\code\duckdb-pip> pixi add duckdb
  × could not determine any available versions for duckdb on win-64. Either the package could not be found or version
  │ constraints on other dependencies result in a conflict.
  ╰─▶ Cannot solve the request because of: No candidates were found for duckdb *.

I was on pixi 0.9.1 and upgraded to 0.13.0; same thing.

But pixi add --pypi duckdb works. 🤷‍♂️

ruben-arts commented 7 months ago

The conda package you want is python-duckdb, so instead of pixi add duckdb use pixi add python-duckdb.

https://prefix.dev/channels/conda-forge/packages/python-duckdb
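So the conda route would look roughly like this in pixi.toml (a sketch; the open version spec is a placeholder):

[dependencies]
python = "3.11.*"
python-duckdb = "*"   # conda-forge name for the duckdb Python bindings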

liquidcarbon commented 7 months ago

@ruben-arts noted -- I just wasn't sure which part @tdejager was trying to reproduce

awray3 commented 6 months ago

TensorFlow Metal on Apple silicon, macOS 14.2.1:

pixi init tf-metal && cd tf-metal
pixi add "python>=3.11" "tensorflow>=2.13" pip
pixi add --pypi tensorflow-metal
> × failed to resolve `pypi-dependencies`, due to underlying error
  ╰─▶ The following packages are incompatible
      └─ tensorflow-metal * cannot be installed because there are no viable options:
         └─ tensorflow-metal 0.1.0 | 0.1.1 | 0.1.2 | 0.2.0 | 0.3.0 | 0.4.0 | 0.5.0 | 0.5.1 | 0.6.0 | 0.7.0 | 0.7.1
      | 0.8.0 | 1.0.0 | 1.0.1 | 1.1.0 is excluded because none of the artifacts are compatible with the Python
      interpreter or glibc version and there are no supported sdists

pixi run pip install tensorflow-metal
> Successfully installed tensorflow-metal-1.1.0
baszalmstra commented 6 months ago

Most likely you are missing a system requirement: https://pixi.sh/latest/configuration/#the-system-requirements-table

Most likely macos=12.0
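That is, something along these lines in pixi.toml (the exact version to require is an assumption; pick the lowest macOS the wheels support):

[system-requirements]
macos = "12.0"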

awray3 commented 6 months ago

Nice, that fixed it. But would the generated pixi.toml work on a Linux machine? I guess I would have to mark tensorflow-metal as platform-dependent somehow. My goal is to have a pixi.toml that installs tensorflow-metal on a Mac and tensorflow with GPU support on Linux.

EDIT: Never mind, I found the example demonstrating how to do this. Thank you!
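For reference, a sketch of one way to express the platform split with pixi's per-platform target tables; the package names and open version specs here are assumptions, not the exact example referred to above:

[project]
platforms = ["osx-arm64", "linux-64"]

[dependencies]
# the base tensorflow package stays a shared dependency
tensorflow = ">=2.13"

[target.osx-arm64.pypi-dependencies]
# only resolved and installed for the macOS environment
tensorflow-metal = "*"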

roaldarbol commented 6 months ago

Installing opencv-python-headless in a conda environment works (through python -m pip install opencv-python-headless), but not with pixi. I'm on osx-64.

(base) ➜  idtracker pixi add --pypi opencv-python-headless
  × failed to resolve `pypi-dependencies`, due to underlying error
  ╰─▶ No metadata could be extracted for the following available artifacts:
        - opencv-python-headless-4.9.0.80.tar.gz

Error:   × error while processing source distribution 'opencv-python-headless-4.9.0.80.tar.gz':
  │  could not build wheel: Traceback (most recent call last):
  │   File "/var/folders/rh/y9ws4bgj7x50twypcwtk2ypr0000gp/T/.tmpvjLm8l/build_frontend.py", line 124, in <module>
  │     get_requires_for_build_wheel(backend, work_dir)
  │   File "/var/folders/rh/y9ws4bgj7x50twypcwtk2ypr0000gp/T/.tmpvjLm8l/build_frontend.py", line 58, in get_requires_for_build_wheel
  │     result = f()
  │              ^^^
  │   File "/var/folders/rh/y9ws4bgj7x50twypcwtk2ypr0000gp/T/.tmpvjLm8l/venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 325, in
  │ get_requires_for_build_wheel
  │     return self._get_build_requires(config_settings, requirements=['wheel'])
  │            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  │   File "/var/folders/rh/y9ws4bgj7x50twypcwtk2ypr0000gp/T/.tmpvjLm8l/venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 295, in _get_build_requires
  │     self.run_setup()
  │   File "/var/folders/rh/y9ws4bgj7x50twypcwtk2ypr0000gp/T/.tmpvjLm8l/venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 487, in run_setup
  │     super().run_setup(setup_script=setup_script)
  │   File "/var/folders/rh/y9ws4bgj7x50twypcwtk2ypr0000gp/T/.tmpvjLm8l/venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 311, in run_setup
  │     exec(code, locals())
  │   File "<string>", line 10, in <module>
  │ ModuleNotFoundError: No module named 'skbuild'
  │ 
  help: Probably an error during processing of source distributions. Please check the error message above.
baszalmstra commented 6 months ago

@roaldarbol The error message is absolutely terrible but if you add:

[system-requirements]
macos = "11.0"

It should work.

roaldarbol commented 6 months ago

It does indeed! Thanks!

jacobbieker commented 6 months ago

Trying to install torch-geometric-temporal with pixi add. It fails as it cannot find torch, which had already been installed previously.

  1. pixi add --pypi torch-geometric-temporal
  2. python -m pip install torch-geometric-temporal does work, after the other dependencies are installed with pixi
  3. platform linux-64
  4. the workaround was just to use pip to install via pixi tasks. Configuration:
    
    [project]
    name = "graph_weather"
    version = "0.1.0"
    description = "Add a short description here"
    authors = ["Jacob Bieker <jacob@bieker.tech>"]
    channels = ["pyg", "nvidia", "conda-forge", "pytorch"]
    platforms = ["linux-64"]

[tasks]
tinstall = "python -m pip install torch-geometric-temporal"

[dependencies]
python = "3.11.*"
torchvision = ">=0.16.1,<0.17"
pytorch-cluster = ">=1.6.3,<1.7"
pytorch-scatter = ">=2.1.2,<2.2"
pytorch-cuda = "12.1.*"
xarray = ">=2024.2.0,<2024.3"
pytorch-spline-conv = ">=1.2.2,<1.3"
pytorch = ">=2.1"
pandas = ">=2.2.1,<2.3"
h3-py = ">=3.7.6,<3.8"
numcodecs = ">=0.12.1,<0.13"
scipy = ">=1.12.0,<1.13"
zarr = ">=2.17.0,<2.18"
pyg = ">=2.5.0,<2.6"
tqdm = ">=4.66.2,<4.67"
pytorch-sparse = ">=0.6.18,<0.7"
lightning = ">=2.2.0.post0,<2.2.1"
einops = ">=0.7.0,<0.8"
fsspec = ">=2024.2.0,<2024.3"
datasets = ">=2.18.0,<2.19"
pip = ">=24.0,<25"

[pypi-dependencies]
pytest = "*" # This means any version (this "*" is custom in pixi)
pre-commit = "*"
pysolar = "*"

ruben-arts commented 6 months ago

Trying to install torch-geometric-temporal with pixi add. It fails as it cannot find torch, which had already been installed previously.

It does indeed not work. Testing the repro gives me this error:

  × failed to solve the pypi requirements of 'default' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ├─▶ Failed to download and build: torch-scatter==2.1.2
  ├─▶ Failed to build: torch-scatter==2.1.2
  ╰─▶ Build backend failed to determine extra requires with `build_wheel()`:
      --- stdout:

      --- stderr:
      Traceback (most recent call last):
        File "<string>", line 14, in <module>
        File "/home/rarts/.cache/rattler/cache/uv-cache/.tmpvVNUWW/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 325, in
      get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/home/rarts/.cache/rattler/cache/uv-cache/.tmpvVNUWW/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 295, in
      _get_build_requires
          self.run_setup()
        File "/home/rarts/.cache/rattler/cache/uv-cache/.tmpvVNUWW/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 487, in
      run_setup
          super().run_setup(setup_script=setup_script)
        File "/home/rarts/.cache/rattler/cache/uv-cache/.tmpvVNUWW/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 311, in
      run_setup
          exec(code, locals())
        File "<string>", line 8, in <module>
      ModuleNotFoundError: No module named 'torch'
      ---

This happens in both the uv and rip workflows, although import torch inside pixi run python does work for this pixi.toml.

Thanks for the info @jacobbieker

tdejager commented 6 months ago

Trying to install torch-geometric-temporal with pixi add. It fails as it cannot find torch, which had already been installed previously.

It does indeed not work. Testing the repro gives me this error: ModuleNotFoundError: No module named 'torch' while building torch-scatter==2.1.2 (full output quoted in the comment above).

Thanks for the info @jacobbieker

It would be kind of useful if we could keep the build environments for uv as well, so it's easier to debug these things.
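As an aside, newer pixi releases document a no-build-isolation list under [pypi-options] for exactly this kind of package that needs torch importable while its sdist is being built; whether that option is available in the pixi version used in this thread is an assumption, so check the current docs. A sketch:

[pypi-options]
# build torch-scatter against the torch already in the environment instead of an isolated build env
no-build-isolation = ["torch-scatter"]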

liblaf commented 6 months ago

What did you run and what was the outcome?

pixi add --pypi -vv "trimesh[all]"

What error did pixi return?

Log

```log
 INFO pixi::lock_file::outdated: the pypi dependencies of environment 'default' for platform linux-64 are out of date because the requirement 'trimesh[all]' could not be satisfied (required by '<environment>')
 INFO pixi::lock_file::update: updated conda packages in the 'default' prefix in 5ms 378us 132ns
 INFO resolve_pypi{group=default platform=linux-64}: pixi::lock_file::resolve: the following python packages are assumed to be installed by conda: libexpat 2.6.1, xz 5.2.6, readline 8.2, openssl 3.2.1, ld-impl-linux-64 2.40, libsqlite 3.45.1, libuuid 2.38.1, libffi 3.4.2, ncurses 6.4, bzip2 1.0.8, libzlib 1.2.13, tk 8.6.13, tzdata 2024a0, libxcrypt 4.4.36, libnsl 2.0.1, libgcc-ng 13.2.0, python 3.11.8, ca-certificates 2024.2.2, libgomp 13.2.0
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: root @ 0a0.dev0
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: trimesh[all] @ 4.2.0
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: trimesh @ 4.2.0
 [... add_decision lines for trimesh[test], trimesh[easy], trimesh[recommend] and their dependencies: numpy, pytest-cov, coveralls, mypy, ezdxf, pytest, pymeshlab, pyinstrument, matplotlib, ruff, typeguard, colorlog, mapbox-earcut, chardet, lxml, jsonschema, networkx, svg-path, pycollada, setuptools, shapely, xxhash, rtree, httpx, scipy ...]
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: Start conflict resolution because incompat satisfied: embreex <2.17.7 | >2.17.7, <2.17.7.post1 | >2.17.7.post1, <2.17.7.post2 | >2.17.7.post2, <2.17.7.post3 | >2.17.7.post3, <2.17.7.post4 | >2.17.7.post4 is forbidden
 [... prior cause lines for the remaining embreex post releases ...]
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: backtrack to DecisionLevel(5)
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: Start conflict resolution because incompat satisfied: embreex ==2.17.7.post4 is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex * is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: trimesh[easy] ==4.2.0 is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: backtrack to DecisionLevel(2)
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: not adding trimesh[easy] @ 4.1.8 because of its dependencies
 [... the same "not adding trimesh[easy]" line repeats for every version down to 1.9.13 ...]
 INFO resolve_pypi{group=default platform=linux-64}:process_request{request=Metadata trimesh==1.9.12}:get_or_build_wheel_metadata{dist=trimesh==1.9.12}:build_source_dist_metadata:setup_build{package_id="trimesh==1.9.12" subdirectory=None}:solve: pubgrub::internal::partial_solution: add_decision: root @ 0a0.dev0
 INFO resolve_pypi{group=default platform=linux-64}:process_request{request=Metadata trimesh==1.9.12}:get_or_build_wheel_metadata{dist=trimesh==1.9.12}:build_source_dist_metadata:setup_build{package_id="trimesh==1.9.12" subdirectory=None}:solve: pubgrub::internal::partial_solution: add_decision: setuptools @ 69.1.1
 INFO resolve_pypi{group=default platform=linux-64}:process_request{request=Metadata trimesh==1.9.12}:get_or_build_wheel_metadata{dist=trimesh==1.9.12}:build_source_dist_metadata:setup_build{package_id="trimesh==1.9.12" subdirectory=None}:solve: pubgrub::internal::partial_solution: add_decision: wheel @ 0.43.0
  × failed to solve the pypi requirements of 'default' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ├─▶ Failed to download and build: trimesh==1.9.12
  ├─▶ Failed to build: trimesh==1.9.12
  ╰─▶ Build backend failed to determine extra requires with `build_wheel()` with exit status: 1
      --- stdout:

      --- stderr:
      Traceback (most recent call last):
        File "<string>", line 9, in <module>
      ModuleNotFoundError: No module named 'pypandoc'

      During handling of the above exception, another exception occurred:

      Traceback (most recent call last):
        File "<string>", line 14, in <module>
        File "/home/liblaf/.cache/rattler/cache/uv-cache/.tmpc8Jzlx/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 325, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/home/liblaf/.cache/rattler/cache/uv-cache/.tmpc8Jzlx/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 295, in _get_build_requires
          self.run_setup()
        File "/home/liblaf/.cache/rattler/cache/uv-cache/.tmpc8Jzlx/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 487, in run_setup
          super().run_setup(setup_script=setup_script)
        File "/home/liblaf/.cache/rattler/cache/uv-cache/.tmpc8Jzlx/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 311, in run_setup
          exec(code, locals())
        File "<string>", line 12, in <module>
      FileNotFoundError: [Errno 2] No such file or directory: 'README.md'
      ---
```

Can pip install the package? Does pip install "trimesh[all]" work?

Yes.

What platform are you on?

linux-64

The above log shows that the problem seems to be with embreex, so I tried adding it directly, and got the following error:

$ pixi add --pypi -vv embreex
 INFO pixi::lock_file::outdated: the pypi dependencies of environment 'default' for platform linux-64 are out of date because the requirement 'embreex' could not be satisfied (required by '<environment>')
 INFO pixi::lock_file::update: updated conda packages in the 'default' prefix in 6ms 587us 9ns
 INFO resolve_pypi{group=default platform=linux-64}: pixi::lock_file::resolve: the following python packages are assumed to be installed by conda: libxcrypt 4.4.36, readline 8.2, libgomp 13.2.0, setuptools 69.1.1, xz 5.2.6, libnsl 2.0.1, tzdata 2024a0, libuuid 2.38.1, wheel 0.42.0, libgcc-ng 13.2.0, bzip2 1.0.8, ca-certificates 2024.2.2, libsqlite 3.45.1, ncurses 6.4, pip 24.0, openssl 3.2.1, ld-impl-linux-64 2.40, libzlib 1.2.13, libffi 3.4.2, libexpat 2.6.1, python 3.11.8, tk 8.6.13
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: root @ 0a0.dev0
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: Start conflict resolution because incompat satisfied:
   embreex <2.17.7 | >2.17.7, <2.17.7.post1 | >2.17.7.post1, <2.17.7.post2 | >2.17.7.post2, <2.17.7.post3 | >2.17.7.post3, <2.17.7.post4 | >2.17.7.post4 is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex <2.17.7.post1 | >2.17.7.post1, <2.17.7.post2 | >2.17.7.post2, <2.17.7.post3 | >2.17.7.post3, <2.17.7.post4 | >2.17.7.post4 is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex <2.17.7.post2 | >2.17.7.post2, <2.17.7.post3 | >2.17.7.post3, <2.17.7.post4 | >2.17.7.post4 is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex <2.17.7.post3 | >2.17.7.post3, <2.17.7.post4 | >2.17.7.post4 is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex <2.17.7.post4 | >2.17.7.post4 is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex * is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: root ==0a0.dev0 is forbidden
  × failed to solve the pypi requirements of 'default' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ╰─▶ Because only the following versions of embreex are available:
          embreex==2.17.7
          embreex==2.17.7.post1
          embreex==2.17.7.post2
          embreex==2.17.7.post3
          embreex==2.17.7.post4
      and embreex==2.17.7 is unusable because no wheels are available with a matching platform, we can conclude that embreex<2.17.7.post1 cannot be used.
      And because embreex==2.17.7.post1 is unusable because no wheels are available with a matching platform, we can conclude that embreex<2.17.7.post2 cannot be used.
      And because embreex==2.17.7.post2 is unusable because no wheels are available with a matching platform and embreex==2.17.7.post3 is unusable because no wheels are
      available with a matching platform, we can conclude that embreex<2.17.7.post4 cannot be used.
      And because embreex==2.17.7.post4 is unusable because no wheels are available with a matching platform and you require embreex, we can conclude that the
      requirements are unsatisfiable.
evetion commented 6 months ago

What did you run and what was the outcome?

pixi add --platform osx-arm64 --pypi "meshkernel==4.1"

What error did pixi return?

  × failed to resolve `pypi-dependencies`, due to underlying error
  ╰─▶ The following packages are incompatible
      └─ meshkernel ==4.1 cannot be installed because there are no viable options:
         └─ meshkernel 4.1.0 is excluded because none of the artifacts are compatible with the Python interpreter or glibc version and there are no supported sdists

Can pip install the package?

pixi run pip install "meshkernel==4.1" works on the osx-arm64 platform.

What platform are you on?

This happens on the osx-64 and osx-arm64 platforms (it works on win-64 and linux-64). I suspect some mismatch in the name/metadata of the macOS-specific .whl files.

Did you find a workaround, if so please explain.

Not yet. It would help if the resolver could report more verbosely (-vvv has no effect on it) how it matches the platform/interpreter/glibc version and what is incompatible. Possibly renaming the wheels on the meshkernel side would be enough to fix it.

Encountered in https://github.com/Deltares/Ribasim/pull/1137

wolfv commented 6 months ago

@evetion that error often means that the system-requirements are not set high enough. I can see that wheels are only available for macos = 14.0 (for arm64). We should do a better job at explaining this.

Unfortunately, setting that in my pixi.toml didn't work just now, so we might also still have to do some more debugging of the uv integration! I'll take a look :)

wolfv commented 6 months ago

Sorry, my bad. meshkernel actually does work fine when you add the following:

[system-requirements]
macos = "14.0"

We should make the error more actionable and figure out a better default behavior.

wolfv commented 6 months ago

@liblaf I wonder if for you it's a similar issue and adding glibc 2.28 or higher would fix it.

e.g.

[system-requirements]
glibc = "2.28"
liblaf commented 6 months ago

@liblaf I wonder if for you it's a similar issue and adding glibc 2.28 or higher would fix it.

e.g.

[system-requirements]
glibc = "2.28"

@wolfv thx! pixi add --pypi "trimesh[all]" now works for me with the following config:

[system-requirements]
libc = "2.39"
wolfv commented 6 months ago

Good to hear, thanks @liblaf! We should make this easier for users, and maybe use a more recent glibc / macos version as default.

evetion commented 6 months ago

Sorry, my bad. meshkernel actually does work fine when you add the following:

[system-requirements]
macos = "14.0"

We should make the error more actionable and figure out a better default behavior.

Thanks for the quick fix! This does indeed work. I'm looking forward to further improvements to pixi.

ruben-arts commented 5 months ago

The PyPI support seems to be much better now. I'm closing this issue; feel free to open a new one when you have a related issue.

pablovela5620 commented 5 months ago

The PyPI support seems to be much better now. I'm closing this issue; feel free to open a new one when you have a related issue.

@ruben-arts I'm still having some issues with a few packages, in particular pymeshlab when used with nerfstudio.

Here is a minimal example

[project]
name = "pymeshlab-test"
version = "0.1.0"
description = "Add a short description here"
authors = ["pablovela5620 <pablovela5620@gmail.com>"]
channels = ["conda-forge"]
platforms = ["linux-64"]

[tasks]

[dependencies]
python = "3.11.*"

[pypi-dependencies]
pymeshlab = "==2022.2.post4"

The latest version of pymeshlab is 2023.12.post1, but anything above ==2022.2.post4 fails.

Along with this, when using a slightly more complicated pyproject.toml like the one I have for the recent dn-splatter, even ==2022.2.post2 fails (which is required for the nerfstudio install):

[project]
name = "dn-splatter"
description = "Depth and normal priors for 3D Gaussian splatting and meshing"
version = "0.0.1"

dependencies = [
    "pymeshlab==2022.2.post2"
]

[tool.setuptools.packages.find]
include = ["dn_splatter*"]

[project.entry-points.'nerfstudio.method_configs']
dn_splatter = 'dn_splatter.dn_config:dn_splatter'
#g-nerfacto = 'dn_splatter.eval.eval_configs:gnerfacto'
#g-depthfacto = 'dn_splatter.eval.eval_configs:gdepthfacto'
#g-neusfacto = 'dn_splatter.eval.eval_configs:gneusfacto'

[project.entry-points.'nerfstudio.dataparser_configs']
mushroom = 'dn_splatter:MushroomDataParserSpecification'
replica = 'dn_splatter:ReplicaDataParserSpecification'
nrgbd = 'dn_splatter:NRGBDDataParserSpecification'
gsdf = 'dn_splatter:GSDFStudioDataParserSpecification'
scannetpp = 'dn_splatter:ScanNetppDataParserSpecification'
coolermap = 'dn_splatter:CoolerMapDataParserSpecification'
normal-nerfstudio = 'dn_splatter:NormalNerfstudioSpecification'

[project.scripts]
# export mesh scripts
gs-mesh = "dn_splatter.export_mesh:entrypoint"

[tool.pixi.project]
name = "dn-splatter"
version = "0.1.0"
description = "Depth and normal priors for 3D Gaussian splatting and meshing"
channels = ["nvidia/label/cuda-11.8.0", "nvidia", "conda-forge", "pytorch"]
platforms = ["linux-64"]

[tool.pixi.dependencies]
python = "3.10.*"
cuda = {version = "*", channel="nvidia/label/cuda-11.8.0"}
pytorch-cuda = {version = "11.8.*", channel="pytorch"}
pytorch = {version = ">=2.2.0,<2.3", channel="pytorch"}
torchvision = {version = ">=0.17.0,<0.18", channel="pytorch"}

[tool.pixi.pypi-dependencies]
dn-splatter = { path = ".", editable = true}

error

(default) pablo@pablo-ubuntu:~/0Dev/personal/forked-repos/dn-splatter$ pixi install
  × failed to solve the pypi requirements of 'default' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ╰─▶ Because pymeshlab==2022.2.post2 is unusable because no wheels are available with a matching Python ABI and you require pymeshlab==2022.2.post2, we can
      conclude that the requirements are unsatisfiable.
jleibs commented 2 months ago

I believe opencv-contrib-python is another package that is causing us grief.

My understanding is that the conda py-opencv package actually contains the full contents of opencv-contrib-python; however, when a package depends on opencv-contrib-python, we pull in the PyPI dep instead.

➜  pixi list | grep opencv
libopencv                             4.10.0        qt6_py312h8d92a61_602  29 MiB     conda  libopencv-4.10.0-qt6_py312h8d92a61_602.conda
opencv                                4.10.0        qt6_py312hac6a15e_602  25.8 KiB   conda  opencv-4.10.0-qt6_py312hac6a15e_602.conda
py-opencv                             4.10.0        qt6_py312h071dcc1_602  1.1 MiB    conda  py-opencv-4.10.0-qt6_py312h071dcc1_602.conda

~/test 
➜  pixi add --pypi opencv-contrib-python
✔ Added opencv-contrib-python >=4.10.0.84, <4.10.1
Added these as pypi-dependencies.

~/test 
➜  pixi list | grep opencv
libopencv                             4.10.0        qt6_py312h8d92a61_602  29 MiB     conda  libopencv-4.10.0-qt6_py312h8d92a61_602.conda
opencv                                4.10.0        qt6_py312hac6a15e_602  25.8 KiB   conda  opencv-4.10.0-qt6_py312hac6a15e_602.conda
opencv_contrib_python                 4.10.0.84                            180 MiB    pypi   opencv_contrib_python-4.10.0.84-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.http.whl
py-opencv                             4.10.0        qt6_py312h071dcc1_602  1.1 MiB    conda  py-opencv-4.10.0-qt6_py312h071dcc1_602.conda
jamesscott-insitro commented 2 months ago

@jleibs Just want to add that there's also opencv-python-headless on pypi as well as opencv-python, while those are different builds of py-opencv on conda. So the mapping from conda to pip would be build dependent.

jleibs commented 2 months ago

Just want to add that there's also opencv-python-headless on pypi as well as opencv-python, while those are different builds of py-opencv on conda. So the mapping from conda to pip would be build dependent.

OpenCV's packaging is such a mess. :-( I wish they could have just made these complementary packages with a proper dependency graph.

Like opencv > opencv-contrib > opencv-headless.

tdejager commented 2 months ago

That's an interesting case; we should indeed see if we can make a correct mapping for that one.

baszalmstra commented 1 month ago

I believe opencv-contrib-python is another package that is causing us grief.

The conda package py-opencv depends on libopencv. The libopencv package actually contains two PyPI packages, opencv_python_headless and opencv_python. However, our automated mapping only provides one PyPI name per conda package, so opencv_python_headless is missed.

@nichmor What can we do about that?

jamesscott-insitro commented 1 month ago

I believe opencv-contrib-python is another package that is causing us grief.

The conda package py-opencv depends on libopencv. The libopencv package actually contains two PyPI packages, opencv_python_headless and opencv_python. However, our automated mapping only provides one PyPI name per conda package, so opencv_python_headless is missed.

@nichmor What can we do about that?

There is the conda-pypi-map in the project configuration, though currently I believe that requires remaking the entire conda->pip mapping. Or is there a way to overwrite/add a small number of packages?
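For reference, the mechanism referred to is the conda-pypi-map field in the project table, which points a channel at a custom mapping file; as noted, that file currently has to carry the whole conda-to-PyPI mapping for the channel. The file name below is just an example:

[project]
conda-pypi-map = { "conda-forge" = "local_mapping.json" }   # JSON mapping conda names to PyPI names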

baszalmstra commented 1 month ago

Unfortunately not. But I did find a bug in our mapping logic that I fixed in #1663. With that fix the opencv issue is solved!

nichmor commented 1 month ago

opencv_python_headless

I've opened an issue to track and resolve this problem: https://github.com/prefix-dev/parselmouth/issues/11

henrikarhula commented 3 weeks ago

Hello, I encountered problems with this package: the PyPI package qubovert. The workaround brought a working solution, but it is an undesirable way to do it.

What did you run and what was the outcome?

~\src\qubovert
❯ pixi add qubovert
  × failed to solve the conda requirements of 'default' 'win-64'
  ╰─▶ Cannot solve the request because of: No candidates were found for qubovert *.

~\src\qubovert took 6s
❯ pixi add qubovert --pypi
  × missing python interpreter from environment
  help: Use pixi add python to install the latest python interpreter.

~\src\qubovert took 2s
❯ pixi add python
✔ Added python >=3.12.5,<4

~\src\qubovert took 32s
❯ pixi add qubovert --pypi
  × error updating pypi prefix
  ├─▶ Failed to prepare distributions
  ├─▶ Failed to fetch wheel: qubovert==1.2.5
  ├─▶ Failed to build: qubovert==1.2.5
  ╰─▶ Build backend failed to build wheel through build_wheel() with exit code: 1

error: Microsoft Visual C++ 14.0 or greater is required. Get it with "Microsoft C++ Build Tools": https://visualstudio.microsoft.com/visual-cpp-build-tools/

Can pip install the package? Yes

Does pip install packagex work? Yes

What platform are you on? win11

Did you find a workaround, if so please explain.

I built it into a conda recipe with grayskull and "{{ stdlib("c") }}" as a build requirement; see the merged conda-forge PR.

tdejager commented 3 weeks ago

Hello, I encountered problems with this package: the PyPI package qubovert. (full report, including the Microsoft Visual C++ build error, quoted in the comment above)

Looking at the wheel tags, I think the only Windows wheel you are getting is built for Python 3.9, so you could try that version :) Otherwise you will need MSVC.
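That suggestion would look roughly like this (a sketch, assuming the prebuilt Windows wheel really targets CPython 3.9 as described above):

[dependencies]
python = "3.9.*"        # match the Python version the Windows wheel was built for

[pypi-dependencies]
qubovert = "==1.2.5"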

thewtex commented 3 weeks ago

pixi seems not to have wheel abi3 support.

Example pixi.toml:

[project]
authors = ["Matt McCormick <matt.mccormick@kitware.com>"]
channels = ["conda-forge"]
description = "Add a short description here"
name = "abi3-example"
platforms = ["linux-64"]
version = "0.1.0"

[tasks]

[dependencies]
python = ">=3.12.5,<4"

Then, when trying:

pixi add --pypi itk-filtering

We get:

  × failed to solve the pypi requirements of 'default' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ╰─▶ Because only the following versions of itk-filtering are available:
          itk-filtering<=5.2.1.post1
          itk-filtering==5.3.0
          itk-filtering>=5.4.0
      and itk-filtering<=5.0.1 has no wheels with a matching Python implementation tag, we can conclude that all of:
          itk-filtering<4.13.0
          itk-filtering>5.2.1.post1,<5.3.0
          itk-filtering>5.3.0,<5.4.0
       cannot be used.
      And because all of:
          itk-filtering>=5.1.0,<=5.2.1.post1
          itk-filtering==5.3.0
          itk-filtering>=5.4.0
      have no wheels with a matching Python ABI tag and you require itk-filtering, we can conclude that your requirements are unsatisfiable.

      hint: Pre-releases are available for itk-filtering in the requested range (e.g., 5.4rc4), but pre-releases weren't enabled (try: `--prerelease=allow`)

This happens even though an abi3-tagged wheel that works with Python 3.12 is available on PyPI.

Edit: this was not an issue with pixi 0.27.1, but it is observed with pixi 0.28.1.

CC @ruben-arts

urucoder commented 3 weeks ago

nvidia-cuda-nvrtc-cu11

Pixi can't install nvidia-cuda-nvrtc-cu11==11.8.89; it does work with both pip and poetry. Minimal reproduction steps:

pixi init test
cd test
pixi add python
pixi add --pypi nvidia-cuda-nvrtc-cu11==11.8.89 -vvv

DEBUG pixi_config: Loading config from /etc/pixi/config.toml
DEBUG pixi_config: Failed to load system config: /etc/pixi/config.toml (error: failed to read config from '/etc/pixi/config.toml')
DEBUG pixi_config: Loading config from /home//.config/pixi/config.toml
DEBUG pixi_config: Failed to load global config: /home//.config/pixi/config.toml (error: failed to read config from '/home//.config/pixi/config.toml')
DEBUG pixi_config: Loading config from /home//.pixi/config.toml
DEBUG pixi_config: Failed to load global config: /home//.pixi/config.toml (error: failed to read config from '/home//.pixi/config.toml')
DEBUG pixi_config: Loading config from /home//test box/auth-test/.pixi/config.toml
 INFO pixi_config: Loaded config from: /home//test box/auth-test/.pixi/config.toml
 INFO pixi::environment: verifying prefix location is unchanged, with prefix file: /home//test box/auth-test/.pixi/envs/default/conda-meta/pixi_env_prefix
DEBUG pixi::cli::add: environments affected by the add command: default
 INFO pixi::lock_file::outdated: the dependencies of environment 'default' for platform linux-64 are out of date because missing purls
 INFO pixi::lock_file::resolve::uv_resolution_context: using uv keyring (subprocess) provider
 INFO resolve_conda{group=default platform=linux-64}: pixi::lock_file::update: fetched 1797 records in 938.707901ms
 [... reqwest/hyper/h2 DEBUG output for connections to conda-mapping.prefix.dev and raw.githubusercontent.com elided ...]
 INFO pixi::lock_file::update: resolved conda environment for environment 'default' 'linux-64' in 1s 208ms 400us 27ns
DEBUG pixi::rlimit: Attempted to set RLIMIT_NOFILE to 1024 but was already set to 1048576
 INFO pixi::environment: Creating prefix file at: /home//test box/auth-test/.pixi/envs/default/conda-meta/pixi_env_prefix
 INFO pixi::environment: No update needed for the prefix file.
INFO pixi::environment: Checking if history file exists: /home//test box/auth-test/.pixi/envs/default/conda-meta/history INFO pixi::lock_file::update: updated conda packages in the 'default' prefix in 7ms 680us 503ns INFO resolve_pypi{group=default platform=linux-64}: pixi::lock_file::resolve::pypi: there are no python packages installed by conda DEBUG resolve_pypi{group=default platform=linux-64}: pixi::lock_file::resolve::pypi: [Resolve] Using Python Interpreter: Interpreter { platform: Platform { os: Manylinux { major: 2, minor: 35 }, arch: X86_64 }, markers: MarkerEnvironment { inner: MarkerEnvironmentInner { implementation_name: "cpython", implementation_version: StringVersion { string: "3.12.5", version: "3.12.5" }, os_name: "posix", platform_machine: "x86_64", platform_python_implementation: "CPython", platform_release: "6.5.0-1023-azure", platform_system: "Linux", platform_version: "#24~22.04.1-Ubuntu SMP Wed Jun 12 19:55:26 UTC 2024", python_full_version: StringVersion { string: "3.12.5", version: "3.12.5" }, python_version: StringVersion { string: "3.12", version: "3.12" }, sys_platform: "linux" } }, scheme: Scheme { purelib: "/home//test box/auth-test/.pixi/envs/default/lib/python3.12/site-packages", platlib: "/home//test box/auth-test/.pixi/envs/default/lib/python3.12/site-packages", scripts: "/home//test box/auth-test/.pixi/envs/default/bin", data: "/home//test box/auth-test/.pixi/envs/default", include: "/home//test box/auth-test/.pixi/envs/default/include/python3.12" }, virtualenv: Scheme { purelib: "lib/python3.12/site-packages", platlib: "lib/python3.12/site-packages", scripts: "bin", data: "", include: "include/site/python3.12" }, manylinux_compatible: true, sys_prefix: "/home//test box/auth-test/.pixi/envs/default", sys_base_exec_prefix: "/home//test box/auth-test/.pixi/envs/default", sys_base_prefix: "/home//test box/auth-test/.pixi/envs/default", sys_base_executable: Some("/home//test box/auth-test/.pixi/envs/default/bin/python3.12"), sys_executable: "/home//test box/auth-test/.pixi/envs/default/bin/python3.12", sys_path: ["/home//.cache/rattler/cache/uv-cache/.tmpwhdlVM", "/home//test box/auth-test/.pixi/envs/default/lib/python312.zip", "/home//test box/auth-test/.pixi/envs/default/lib/python3.12", "/home//test box/auth-test/.pixi/envs/default/lib/python3.12/lib-dynload", "/home//test box/auth-test/.pixi/envs/default/lib/python3.12/site-packages"], stdlib: "/home//test box/auth-test/.pixi/envs/default/lib/python3.12", tags: OnceLock(), target: None, prefix: None, pointer_size: _64, gil_disabled: false } DEBUG resolve_pypi{group=default platform=linux-64}: uv_client::base_client: Using request timeout of 30s DEBUG solve: uv_resolver::resolver: Solving with installed Python version: 3.12.5 DEBUG solve: uv_resolver::resolver: Solving with target Python version: 3.12.5 DEBUG solve: uv_resolver::resolver: Adding direct dependency: nvidia-cuda-nvrtc-cu11==11.8.89 INFO solve: pubgrub::internal::partial_solution: add_decision: root @ 0a0.dev0
DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG resolve_pypi{group=default platform=linux-64}:process_request{request=Versions nvidia-cuda-nvrtc-cu11}:simple_api{package=nvidia-cuda-nvrtc-cu11}:get_cacheable: uv_client::cached_client: Found fresh response for: https://pypi.org/simple/nvidia-cuda-nvrtc-cu11/ DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) DEBUG solve:choose_version{package=nvidia-cuda-nvrtc-cu11}: uv_resolver::resolver: Searching for a compatible version of nvidia-cuda-nvrtc-cu11 (==11.8.89) INFO solve: pubgrub::internal::core: Start conflict resolution because incompat satisfied: nvidia-cuda-nvrtc-cu11 ==11.8.89 is forbidden
INFO solve: pubgrub::internal::core: prior cause: root ==0a0.dev0 is forbidden
× failed to solve the pypi requirements of 'default' 'linux-64' ├─▶ failed to resolve pypi dependencies ╰─▶ Because nvidia-cuda-nvrtc-cu11==11.8.89 has no wheels with a matching platform tag and you require nvidia-cuda-nvrtc-cu11==11.8.89, we can conclude that your requirements are unsatisfiable. DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) }

liblaf commented 3 weeks ago

gmsh

  1. What did you run and what was the outcome?
$ pixi -vvv add --pypi gmsh
  2. What error did pixi return?
Log

```log
DEBUG pixi_config: Loading config from /etc/pixi/config.toml
DEBUG pixi_config: Failed to load system config: /etc/pixi/config.toml (error: failed to read config from '/etc/pixi/config.toml')
DEBUG pixi_config: Loading config from /home/liblaf/.config/pixi/config.toml
INFO pixi_config: Loaded config from: /home/liblaf/.config/pixi/config.toml
DEBUG pixi_config: Loading config from /home/liblaf/.pixi/config.toml
DEBUG pixi_config: Failed to load global config: /home/liblaf/.pixi/config.toml (error: failed to read config from '/home/liblaf/.pixi/config.toml')
DEBUG pixi_config: Loading config from /tmp/hello/.pixi/config.toml
DEBUG pixi_config: Failed to load local config: /tmp/hello/.pixi/config.toml (error: failed to read config from '/tmp/hello/.pixi/config.toml')
INFO pixi::environment: verifying prefix location is unchanged, with prefix file: /tmp/hello/.pixi/envs/default/conda-meta/pixi_env_prefix
DEBUG pixi::cli::add: environments affected by the add command: default
INFO pixi::lock_file::outdated: environment 'default' is out of date because it does not exist in the lock-file.
INFO pixi_utils::reqwest: Using mirrors: { … }
INFO pixi::lock_file::resolve::uv_resolution_context: uv keyring provider is disabled
WARN rattler_repodata_gateway::fetch: previous cache state does not contain cache_control header. Assuming out of date...
[… mirror mapping (conda.anaconda.org → mirrors.tuna.tsinghua.edu.cn), proxy (http://127.0.0.1:64393) and h2/hyper connection DEBUG output trimmed …]
DEBUG resolve_conda{group=default platform=linux-64}:get_or_create_subdir{…}:fetch_repo_data{…}: rattler_repodata_gateway::fetch: fetching 'https://conda.anaconda.org/conda-forge/noarch/repodata.json.zst'
DEBUG resolve_conda{group=default platform=linux-64}:get_or_create_subdir{…}:fetch_repo_data{…}: rattler_repodata_gateway::fetch: repodata was unmodified
DEBUG resolve_conda{group=default platform=linux-64}:get_or_create_subdir{…}:fetch_repo_data{…}: rattler_repodata_gateway::fetch: fetching 'https://conda.anaconda.org/conda-forge/linux-64/repodata.json.zst'
DEBUG resolve_conda{group=default platform=linux-64}:get_or_create_subdir{…}:fetch_repo_data{…}: rattler_repodata_gateway::fetch: repodata was unmodified
INFO resolve_conda{group=default platform=linux-64}: pixi::lock_file::update: fetched 1800 records in 695.723379ms
[… conda-mapping.prefix.dev connection DEBUG output trimmed …]
INFO pixi::lock_file::update: resolved conda environment for environment 'default' 'linux-64' in 1s 445ms 554us 945ns
DEBUG pixi::rlimit: Attempted to set RLIMIT_NOFILE to 1024 but was already set to 524288
INFO pixi::environment: Creating prefix file at: /tmp/hello/.pixi/envs/default/conda-meta/pixi_env_prefix
INFO pixi::environment: No update needed for the prefix file.
INFO pixi::environment: Checking if history file exists: /tmp/hello/.pixi/envs/default/conda-meta/history
INFO pixi::lock_file::update: updated conda packages in the 'default' prefix in 3ms 313us 277ns
INFO resolve_pypi{group=default platform=linux-64}: pixi::lock_file::resolve::pypi: there are no python packages installed by conda
DEBUG resolve_pypi{group=default platform=linux-64}: pixi::lock_file::resolve::pypi: [Resolve] Using Python Interpreter: Interpreter { platform: Platform { os: Manylinux { major: 2, minor: 40 }, arch: X86_64 }, markers: MarkerEnvironment { … python_full_version: StringVersion { string: "3.12.5", version: "3.12.5" } … }, … }
DEBUG resolve_pypi{group=default platform=linux-64}: uv_client::base_client: Using request timeout of 30s
DEBUG solve: uv_resolver::resolver: Solving with installed Python version: 3.12.5
DEBUG solve: uv_resolver::resolver: Solving with target Python version: 3.12.5
DEBUG solve: uv_resolver::resolver: Adding direct dependency: hello*
DEBUG solve: uv_resolver::resolver: Adding direct dependency: gmsh*
INFO solve: pubgrub::internal::partial_solution: add_decision: root @ 0a0.dev0
DEBUG solve:choose_version{package=hello}: uv_resolver::resolver: Searching for a compatible version of hello @ file:///tmp/hello (*)
DEBUG resolve_pypi{group=default platform=linux-64}:process_request{request=Versions gmsh}:simple_api{package=gmsh}:get_cacheable: uv_client::cached_client: Found fresh response for: https://pypi.org/simple/gmsh/
DEBUG resolve_pypi{group=default platform=linux-64}:process_request{request=Metadata hello @ file:///tmp/hello}:get_or_build_wheel_metadata{dist=hello @ file:///tmp/hello}: uv_distribution::source: No static `PKG-INFO` available for: hello @ file:///tmp/hello (MissingPkgInfo)
DEBUG resolve_pypi{group=default platform=linux-64}:process_request{request=Metadata hello @ file:///tmp/hello}:get_or_build_wheel_metadata{dist=hello @ file:///tmp/hello}: uv_distribution::source: Found static `pyproject.toml` for: hello @ file:///tmp/hello
INFO solve: pubgrub::internal::partial_solution: add_decision: hello @ 0.1.0
DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (*)
[… repeated 'Searching for a compatible version of gmsh' DEBUG lines for every published release trimmed …]
DEBUG solve: uv_resolver::resolver: No compatible version found for: gmsh
INFO solve: pubgrub::internal::core: Start conflict resolution because incompat satisfied: gmsh … is forbidden
[… repeated 'prior cause: gmsh … is forbidden' lines trimmed …]
INFO solve: pubgrub::internal::core: prior cause: gmsh * is forbidden
INFO solve: pubgrub::internal::core: prior cause: root ==0a0.dev0 is forbidden
  × failed to solve the pypi requirements of 'default' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ╰─▶ Because only the following versions of gmsh are available:
          gmsh==4.9.0 gmsh==4.9.0.post1 gmsh==4.9.1 gmsh==4.9.2 gmsh==4.9.3 gmsh==4.9.4 gmsh==4.9.5
          gmsh==4.10.0 gmsh==4.10.1 gmsh==4.10.2 gmsh==4.10.3 gmsh==4.10.4 gmsh==4.10.5
          gmsh==4.11.0 gmsh==4.11.1 gmsh==4.12.0 gmsh==4.12.1 gmsh==4.12.2 gmsh==4.13.0 gmsh==4.13.1
      and all versions of gmsh have no wheels with a matching platform tag, we can conclude that gmsh<4.9.0.post1 cannot be used.
      And because you require gmsh, we can conclude that your requirements are unsatisfiable.
```
  3. Can pip install the package?

Yes.

  4. What platform are you on?

linux-64.

  5. Did you find a workaround, if so please explain.

No.
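
As a side note on the "no wheels with a matching platform tag" errors above: one way to see why a wheel is rejected is to compare the tags the resolver's interpreter accepts against the tags in the wheel filenames on PyPI. This is only a diagnostic sketch (not something pixi runs itself) and assumes the `packaging` library is installed in the environment:

```python
# Diagnostic sketch: list the wheel tags this interpreter/platform accepts,
# most preferred first. If none of them match the tags in the gmsh wheel
# filenames on PyPI (e.g. a manylinux_* platform tag), the resolver has to
# reject every wheel, which is exactly the error shown above.
from packaging.tags import sys_tags

for tag in list(sys_tags())[:15]:
    print(tag)
```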

tdejager commented 3 weeks ago

@liblaf for your problem you could try setting the `libc` version in the https://pixi.sh/latest/reference/project_configuration/#the-system-requirements-table to 2.24, which is what I see for the glibc version in the wheel tag.
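
For reference, a minimal sketch of what that suggestion could look like in a `pixi.toml`; the table name comes from the linked docs, and the 2.24 value is the one suggested above and may need adjusting for other wheels:

```toml
# Hypothetical pixi.toml excerpt: declare a newer glibc baseline so that
# manylinux_2_24 wheels are treated as compatible during resolution.
[system-requirements]
libc = { family = "glibc", version = "2.24" }
```

In a `pyproject.toml`-based project the same table lives under `[tool.pixi.system-requirements]`, as in the example further down this thread.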

tdejager commented 3 weeks ago

@urucoder in your case I would try glibc 2.17 I think.

liblaf commented 3 weeks ago

@tdejager Unfortunately it won't work.

pyproject.toml

```toml
[build-system]
build-backend = "hatchling.build"
requires = ["hatchling"]

[project]
authors = [
  { email = "30631553+liblaf@users.noreply.github.com", name = "liblaf" },
]
dependencies = []
description = "Add a short description here"
name = "hello"
requires-python = ">= 3.11"
version = "0.1.0"

[tool.pixi.project]
channels = ["conda-forge"]
platforms = ["linux-64"]

[tool.pixi.pypi-dependencies]
hello = { editable = true, path = "." }

[tool.pixi.system-requirements]
libc = { family = "glibc", version = "2.35" }
linux = "6.5"

[tool.pixi.tasks]
```
pixi add --pypi gmsh

```log
DEBUG pixi_config: Loading config from /etc/pixi/config.toml
DEBUG pixi_config: Failed to load system config: /etc/pixi/config.toml (error: failed to read config from '/etc/pixi/config.toml')
DEBUG pixi_config: Loading config from /home/liblaf/.config/pixi/config.toml
INFO pixi_config: Loaded config from: /home/liblaf/.config/pixi/config.toml
DEBUG pixi_config: Loading config from /home/liblaf/.pixi/config.toml
DEBUG pixi_config: Failed to load global config: /home/liblaf/.pixi/config.toml (error: failed to read config from '/home/liblaf/.pixi/config.toml')
DEBUG pixi_config: Loading config from /tmp/hello/.pixi/config.toml
DEBUG pixi_config: Failed to load local config: /tmp/hello/.pixi/config.toml (error: failed to read config from '/tmp/hello/.pixi/config.toml')
INFO pixi::environment: verifying prefix location is unchanged, with prefix file: /tmp/hello/.pixi/envs/default/conda-meta/pixi_env_prefix
DEBUG pixi::cli::add: environments affected by the add command: default
INFO pixi::lock_file::outdated: environment 'default' is out of date because it does not exist in the lock-file.
INFO pixi::lock_file::resolve::uv_resolution_context: uv keyring provider is disabled
INFO resolve_conda{group=default platform=linux-64}: pixi::lock_file::update: fetched 1801 records in 643.712044ms
[… repetitive DNS resolution and h2/hyper connection DEBUG output for conda-mapping.prefix.dev trimmed …]
DEBUG Connection{peer=Client}:
h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::681a:dbc]:443 DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::ac43:4867]:443 DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::681a:dbc]:443 DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::681a:cbc]:443 DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::ac43:4867]:443 DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::681a:cbc]:443 DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::681a:dbc]:443 DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::681a:dbc]:443 DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::681a:dbc]:443 DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::681a:cbc]:443 DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::681a:cbc]:443 DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::681a:dbc]:443 DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::ac43:4867]:443 DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::681a:cbc]:443 DEBUG hyper_util::client::legacy::connect::http: connected to [2606:4700:20::681a:cbc]:443 DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Settings { flags: (0x0), max_concurrent_streams: 100, initial_window_size: 65536, max_frame_size: 16777215 } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_write: send frame=Settings { flags: (0x1: ACK) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=WindowUpdate { stream_id: StreamId(0), size_increment: 2147418112 } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Settings { flags: (0x1: ACK) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::proto::settings: received settings ACK; applying Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, 
Library) DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) 
DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) DEBUG h2::client: binding client connection DEBUG h2::client: client connection bound DEBUG h2::codec::framed_write: send frame=Settings { flags: (0x0), enable_push: 0, initial_window_size: 2097152, max_frame_size: 16384, max_header_list_size: 16384 } DEBUG Connection{peer=Client}: h2::codec::framed_write: send frame=GoAway { error_code: NO_ERROR, last_stream_id: StreamId(0) } DEBUG Connection{peer=Client}: h2::proto::connection: Connection::poll; connection error error=GoAway(b"", NO_ERROR, Library) DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(35), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(31), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(19), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(23), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(27), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default 
platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(33), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(37), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(29), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(3), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(39), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(7), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(21), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(11), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(1), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(15), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(25), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(5), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(13), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(17), flags: (0x5: END_HEADERS | END_STREAM) } DEBUG resolve_conda{group=default platform=linux-64}:Connection{peer=Client}: h2::codec::framed_read: received frame=Headers { stream_id: StreamId(9), flags: (0x5: END_HEADERS | END_STREAM) } INFO pixi::lock_file::update: resolved conda environment for environment 'default' 'linux-64' in 892ms 96us 317ns DEBUG pixi::rlimit: Increased RLIMIT_NOFILE to 1024 INFO pixi::environment: Creating prefix file at: /tmp/hello/.pixi/envs/default/conda-meta/pixi_env_prefix INFO pixi::environment: No update needed for the prefix file. 
INFO pixi::environment: Checking if history file exists: /tmp/hello/.pixi/envs/default/conda-meta/history INFO pixi::lock_file::update: updated conda packages in the 'default' prefix in 3ms 101us 542ns INFO resolve_pypi{group=default platform=linux-64}: pixi::lock_file::resolve::pypi: there are no python packages installed by conda DEBUG resolve_pypi{group=default platform=linux-64}: pixi::lock_file::resolve::pypi: [Resolve] Using Python Interpreter: Interpreter { platform: Platform { os: Manylinux { major: 2, minor: 40 }, arch: X86_64 }, markers: MarkerEnvironment { inner: MarkerEnvironmentInner { implementation_name: "cpython", implementation_version: StringVersion { string: "3.12.5", version: "3.12.5" }, os_name: "posix", platform_machine: "x86_64", platform_python_implementation: "CPython", platform_release: "6.10.6-arch1-1", platform_system: "Linux", platform_version: "#1 SMP PREEMPT_DYNAMIC Mon, 19 Aug 2024 17:02:39 +0000", python_full_version: StringVersion { string: "3.12.5", version: "3.12.5" }, python_version: StringVersion { string: "3.12", version: "3.12" }, sys_platform: "linux" } }, scheme: Scheme { purelib: "/tmp/hello/.pixi/envs/default/lib/python3.12/site-packages", platlib: "/tmp/hello/.pixi/envs/default/lib/python3.12/site-packages", scripts: "/tmp/hello/.pixi/envs/default/bin", data: "/tmp/hello/.pixi/envs/default", include: "/tmp/hello/.pixi/envs/default/include/python3.12" }, virtualenv: Scheme { purelib: "lib/python3.12/site-packages", platlib: "lib/python3.12/site-packages", scripts: "bin", data: "", include: "include/site/python3.12" }, manylinux_compatible: true, sys_prefix: "/tmp/hello/.pixi/envs/default", sys_base_exec_prefix: "/tmp/hello/.pixi/envs/default", sys_base_prefix: "/tmp/hello/.pixi/envs/default", sys_base_executable: Some("/tmp/hello/.pixi/envs/default/bin/python3.12"), sys_executable: "/tmp/hello/.pixi/envs/default/bin/python3.12", sys_path: ["/home/liblaf/.cache/rattler/cache/uv-cache/.tmpiK9Rny", "/tmp/hello/.pixi/envs/default/lib/python312.zip", "/tmp/hello/.pixi/envs/default/lib/python3.12", "/tmp/hello/.pixi/envs/default/lib/python3.12/lib-dynload", "/tmp/hello/.pixi/envs/default/lib/python3.12/site-packages"], stdlib: "/tmp/hello/.pixi/envs/default/lib/python3.12", tags: OnceLock(), target: None, prefix: None, pointer_size: _64, gil_disabled: false } DEBUG resolve_pypi{group=default platform=linux-64}: uv_client::base_client: Using request timeout of 30s DEBUG solve: uv_resolver::resolver: Solving with installed Python version: 3.12.5 DEBUG solve: uv_resolver::resolver: Solving with target Python version: 3.12.5 DEBUG solve: uv_resolver::resolver: Adding direct dependency: hello* DEBUG solve: uv_resolver::resolver: Adding direct dependency: gmsh* INFO solve: pubgrub::internal::partial_solution: add_decision: root @ 0a0.dev0 DEBUG solve:choose_version{package=hello}: uv_resolver::resolver: Searching for a compatible version of hello @ file:///tmp/hello (*) DEBUG uv_fs: Acquired lock for `/home/liblaf/.cache/rattler/cache/uv-cache/built-wheels-v3/editable/9668592e58e88e25` DEBUG resolve_pypi{group=default platform=linux-64}:process_request{request=Versions gmsh}:simple_api{package=gmsh}:get_cacheable: uv_client::cached_client: Found fresh response for: https://pypi.org/simple/gmsh/ DEBUG resolve_pypi{group=default platform=linux-64}:process_request{request=Metadata hello @ file:///tmp/hello}:get_or_build_wheel_metadata{dist=hello @ file:///tmp/hello}: uv_distribution::source: No static `PKG-INFO` available for: hello @ file:///tmp/hello 
(MissingPkgInfo) DEBUG resolve_pypi{group=default platform=linux-64}:process_request{request=Metadata hello @ file:///tmp/hello}:get_or_build_wheel_metadata{dist=hello @ file:///tmp/hello}: uv_distribution::source: Found static `pyproject.toml` for: hello @ file:///tmp/hello INFO solve: pubgrub::internal::partial_solution: add_decision: hello @ 0.1.0 DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (*) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, 
<4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.9.5 | >4.9.5, <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.9.4 | >4.9.4, <4.9.5 | >4.9.5, <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.9.3 | >4.9.3, <4.9.4 | >4.9.4, <4.9.5 | >4.9.5, <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.9.2 | >4.9.2, <4.9.3 | >4.9.3, <4.9.4 | >4.9.4, <4.9.5 | >4.9.5, <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.9.1 | >4.9.1, <4.9.2 | >4.9.2, <4.9.3 | >4.9.3, <4.9.4 | >4.9.4, <4.9.5 | >4.9.5, <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.9.0.post1 | >4.9.0.post1, <4.9.1 | >4.9.1, <4.9.2 | >4.9.2, <4.9.3 | >4.9.3, <4.9.4 | >4.9.4, <4.9.5 | >4.9.5, <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve:choose_version{package=gmsh}: uv_resolver::resolver: Searching for a compatible version of gmsh (<4.9.0 | >4.9.0, <4.9.0.post1 | >4.9.0.post1, <4.9.1 | >4.9.1, <4.9.2 | >4.9.2, <4.9.3 | >4.9.3, <4.9.4 | >4.9.4, <4.9.5 | >4.9.5, <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1) DEBUG solve: uv_resolver::resolver: No compatible version found for: gmsh INFO solve: pubgrub::internal::core: Start conflict resolution because incompat satisfied: gmsh <4.9.0 | >4.9.0, <4.9.0.post1 | >4.9.0.post1, <4.9.1 | >4.9.1, <4.9.2 | >4.9.2, <4.9.3 | >4.9.3, <4.9.4 | >4.9.4, <4.9.5 | >4.9.5, <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | 
>4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.9.0.post1 | >4.9.0.post1, <4.9.1 | >4.9.1, <4.9.2 | >4.9.2, <4.9.3 | >4.9.3, <4.9.4 | >4.9.4, <4.9.5 | >4.9.5, <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.9.1 | >4.9.1, <4.9.2 | >4.9.2, <4.9.3 | >4.9.3, <4.9.4 | >4.9.4, <4.9.5 | >4.9.5, <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.9.2 | >4.9.2, <4.9.3 | >4.9.3, <4.9.4 | >4.9.4, <4.9.5 | >4.9.5, <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.9.3 | >4.9.3, <4.9.4 | >4.9.4, <4.9.5 | >4.9.5, <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.9.4 | >4.9.4, <4.9.5 | >4.9.5, <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.9.5 | >4.9.5, <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.10.0 | >4.10.0, <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.10.1 | >4.10.1, <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.10.2 | >4.10.2, <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.10.3 | >4.10.3, <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.10.4 | >4.10.4, <4.10.5 | >4.10.5, <4.11.0 
| >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.10.5 | >4.10.5, <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.11.0 | >4.11.0, <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.11.1 | >4.11.1, <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.12.0 | >4.12.0, <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.12.1 | >4.12.1, <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.12.2 | >4.12.2, <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.13.0 | >4.13.0, <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh <4.13.1 | >4.13.1 is forbidden INFO solve: pubgrub::internal::core: backtrack to DecisionLevel(1) INFO solve: pubgrub::internal::core: Start conflict resolution because incompat satisfied: gmsh ==4.13.1 is forbidden INFO solve: pubgrub::internal::core: prior cause: gmsh * is forbidden INFO solve: pubgrub::internal::core: prior cause: root ==0a0.dev0 is forbidden × failed to solve the pypi requirements of 'default' 'linux-64' ├─▶ failed to resolve pypi dependencies ╰─▶ Because only the following versions of gmsh are available: gmsh==4.9.0 gmsh==4.9.0.post1 gmsh==4.9.1 gmsh==4.9.2 gmsh==4.9.3 gmsh==4.9.4 gmsh==4.9.5 gmsh==4.10.0 gmsh==4.10.1 gmsh==4.10.2 gmsh==4.10.3 gmsh==4.10.4 gmsh==4.10.5 gmsh==4.11.0 gmsh==4.11.1 gmsh==4.12.0 gmsh==4.12.1 gmsh==4.12.2 gmsh==4.13.0 gmsh==4.13.1 and all versions of gmsh have no wheels with a matching platform tag, we can conclude that gmsh<4.9.0.post1 cannot be used. And because you require gmsh, we can conclude that your requirements are unsatisfiable. ```
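
For anyone hitting the same wall: the error above means none of the wheel filenames gmsh publishes carry a platform tag the resolver considers compatible with this interpreter. A rough way to check that by hand is sketched below (assuming the third-party `packaging` and `requests` libraries and network access; note that what `packaging`/pip accepts can differ from what pixi's resolver accepted here, which is presumably the bug):

```python
# Rough check: do any of the wheels gmsh publishes on PyPI carry a tag that
# this interpreter accepts? Assumes `packaging` and `requests` are installed.
import requests
from packaging.tags import sys_tags
from packaging.utils import parse_wheel_filename

accepted = set(sys_tags())
files = requests.get("https://pypi.org/pypi/gmsh/json", timeout=30).json()["urls"]

for f in files:  # files of the latest gmsh release
    if not f["filename"].endswith(".whl"):
        continue
    _, _, _, wheel_tags = parse_wheel_filename(f["filename"])
    status = "matches" if accepted & set(wheel_tags) else "no matching tag"
    print(f"{f['filename']}: {status}")
```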
tdejager commented 3 weeks ago

@liblaf @urucoder I think I was mixing up glibc forwards and backwards compatibility again... I might have found something else.
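
For reference, the compatibility direction is: a `manylinux_X_Y` wheel requires glibc >= X.Y on the machine that installs it, so a newer glibc can run wheels built against an older one, but not the other way around. A minimal sketch (assuming the third-party `packaging` library) that lists the manylinux tags the local interpreter accepts:

```python
# Sketch: list the manylinux platform tags this interpreter will accept.
# A manylinux_X_Y wheel needs glibc >= X.Y on the installing machine, so newer
# systems accept older tags but not vice versa. Assumes `packaging` is installed.
from packaging import tags

manylinux = sorted({t.platform for t in tags.sys_tags() if "manylinux" in t.platform})
print("\n".join(manylinux))
```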

liblaf commented 3 weeks ago

@tdejager I wrote a simple script to test which PyPI packages cannot be managed by pixi (roughly the idea sketched below the table). I have only tested a few packages that I use. Here are the results from a workflow run:

**Results**

| Package | pip | pixi | uv |
| --- | --- | --- | --- |
| confz | ✅ | ✅ | ✅ |
| dvc | ✅ | ❌ | ✅ |
| dvclive | ✅ | ❌ | ✅ |
| icecream | ✅ | ✅ | ✅ |
| jax | ✅ | ❌ | ✅ |
| jax[cuda12] | ✅ | ❌ | ✅ |
| matplotlib | ✅ | ✅ | ✅ |
| meshio[all] | ✅ | ❌ | ✅ |
| meshpy | ✅ | ✅ | ✅ |
| meshtaichi-patcher | ✅ | ✅ | ✅ |
| mkdocs | ✅ | ✅ | ✅ |
| mkdocs-material | ✅ | ✅ | ✅ |
| mkdocstrings[python] | ✅ | ✅ | ✅ |
| numpy | ✅ | ✅ | ✅ |
| pymeshfix | ✅ | ✅ | ✅ |
| pyright | ✅ | ✅ | ✅ |
| pytest | ✅ | ✅ | ✅ |
| pytest-benchmark | ✅ | ✅ | ✅ |
| pyvista | ✅ | ✅ | ✅ |
| ruff | ✅ | ✅ | ✅ |
| scipy | ✅ | ❌ | ✅ |
| sparse | ✅ | ❌ | ❌ |
| taichi | ✅ | ❌ | ✅ |
| trimesh[all] | ✅ | ❌ | ✅ |
| typeguard | ✅ | ✅ | ✅ |
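
The per-package check boils down to something like the sketch below (a simplified illustration, not the actual workflow; the package list, Python version, and exact commands are only examples):

```python
# Simplified sketch: try to install each package with pip, pixi, and uv in
# throwaway environments and record what succeeds. Illustrative only.
import subprocess
import tempfile
from pathlib import Path

PACKAGES = ["dvc", "jax", "scipy", "sparse", "taichi"]  # subset of the table above


def ok(cmd: list[str], cwd: Path) -> bool:
    """Run a command and report whether it exited with status 0."""
    return subprocess.run(cmd, cwd=cwd, capture_output=True).returncode == 0


for pkg in PACKAGES:
    with tempfile.TemporaryDirectory() as tmp_str:
        tmp = Path(tmp_str)
        # pip: resolve and build without touching the current environment.
        pip_ok = ok(["python", "-m", "pip", "install", "--dry-run", pkg], tmp)
        # uv: fresh virtual environment in the temp dir (uv picks up ./.venv).
        uv_ok = ok(["uv", "venv"], tmp) and ok(["uv", "pip", "install", pkg], tmp)
        # pixi: fresh project, conda python first, then the PyPI dependency.
        proj = tmp / "proj"
        pixi_ok = (
            ok(["pixi", "init", "proj"], tmp)
            and ok(["pixi", "add", "python=3.12"], proj)
            and ok(["pixi", "add", "--pypi", pkg], proj)
        )
        print(f"| {pkg} | {'✅' if pip_ok else '❌'} "
              f"| {'✅' if pixi_ok else '❌'} | {'✅' if uv_ok else '❌'} |")
```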

I also noticed that pixi tends to build from source distributions rather than using the available built distributions (wheels). If a PyPI package provides no source distribution at all (e.g. gmsh, jaxlib), it cannot be installed. I guess this is because pixi fails to resolve the built distributions correctly.
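
To separate the two failure modes, the PyPI JSON API shows whether a project publishes an sdist at all or only wheels; wheel-only projects can only ever work if the wheel tags are resolved correctly. A small sketch (package names are just examples; assumes the third-party `requests` library and network access):

```python
# Sketch: check whether a PyPI project publishes an sdist, wheels, or both.
# "packagetype" in the PyPI JSON API is "sdist" or "bdist_wheel".
# Package names are examples; assumes `requests` and network access.
import requests

for pkg in ("gmsh", "jaxlib", "scipy"):
    info = requests.get(f"https://pypi.org/pypi/{pkg}/json", timeout=30).json()
    kinds = sorted({f["packagetype"] for f in info["urls"]})
    print(f"{pkg}: {', '.join(kinds)}")
```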

tdejager commented 3 weeks ago

@liblaf Thank you for that table. I've made a PR that hopefully alleviates the gmsh problems at least: https://github.com/prefix-dev/pixi/pull/1925.

Yeah, I think there may be more tag issues that disallow certain wheels which should actually be allowed. We should look into that more.