pypa / hatch

Modern, extensible Python project management
https://hatch.pypa.io/latest/
MIT License

[BUG] Strange behavior when using UV installer #1474

Closed · mikita-sakalouski closed this issue 3 weeks ago

mikita-sakalouski commented 3 weeks ago

Current behavior

I'm trying to set up three test environments using the following config:

[version]
path = "koheesio/__init__.py"

[envs.default]
installer = "pip"

[[envs.test.matrix]]
python = ["3.10"]
pyspark = ["33", "34", "35"]

[envs.test]
dependencies = [
    "coverage[toml]",
    "pytest",
    "pytest-cov",
    "requests_mock",
    "pytest-mock",
    "pytest-sftpserver",
    "pytest-order",
    "pytest-venv",
    "pytest-xdist",
]
# features = ["box", "cerberus", "pyspark", "se", "sftp", "delta"]

[envs.test.scripts]
log_versions = "python --version && pip freeze | grep pyspark"
run-coverage = "pytest --cov-config=pyproject.toml --cov=pkg --cov=tests"
run = "run-coverage --no-cov"

[envs."test.py3.10-33"]
python = "3.10"
template = "test"
extra-dependencies = ["pyspark>=3.3,<3.4"]

[envs."test.py3.10-34"]
python = "3.10"
template = "test"
extra-dependencies = ["pyspark>=3.4,<3.5"]

[envs."test.py3.10-35"]
python = "3.10"
template = "test"
extra-dependencies = ["pyspark>=3.5"]

When running the command

>>>  hatch run test:log_versions
──────────── test.py3.10-33───────────────────────────
Python 3.10.14
pyspark==3.3.4
──────────── test.py3.10-34───────────────────────────
Python 3.10.14
pyspark==3.4.3
──────────── test.py3.10-35───────────────────────────
Python 3.10.14
pyspark==3.5.1

After changing

[envs.default]
installer = "pip"

to

[envs.default]
installer = "uv"

and removing the virtual environments along with Hatch's metadata, I'm getting:

>>>  hatch run test:log_versions
──────────── test.py3.10-33────────────────────────────
Python 3.10.14
WARNING: Skipping /opt/homebrew/lib/python3.11/site-packages/packaging-24.0.dist-info due to invalid metadata entry 'name'
WARNING: Skipping /opt/homebrew/lib/python3.11/site-packages/typing_extensions-4.10.0.dist-info due to invalid metadata entry 'name'
pyspark==3.4.0
──────────── test.py3.10-34────────────────────────────
Python 3.10.14
WARNING: Skipping /opt/homebrew/lib/python3.11/site-packages/packaging-24.0.dist-info due to invalid metadata entry 'name'
WARNING: Skipping /opt/homebrew/lib/python3.11/site-packages/typing_extensions-4.10.0.dist-info due to invalid metadata entry 'name'
pyspark==3.4.0
──────────── test.py3.10-35────────────────────────────
Python 3.10.14
WARNING: Skipping /opt/homebrew/lib/python3.11/site-packages/packaging-24.0.dist-info due to invalid metadata entry 'name'
WARNING: Skipping /opt/homebrew/lib/python3.11/site-packages/typing_extensions-4.10.0.dist-info due to invalid metadata entry 'name'
pyspark==3.4.0

Expected behavior

The correct version of PySpark should be installed when using uv.

Additional context

Configuration

mode = "local"
shell = ""

[dirs.env]
virtual = ".venvs"

ofek commented 3 weeks ago

Thank you for using the new report command, you're the first!

I can't reproduce that particular error, but I can reproduce the grep failing: there is no pip inside the virtual environment, only in the parent environment, and the parent's pip has no pyspark installed. Try changing that script to:

log_versions = "python --version && $HATCH_UV pip freeze | grep pyspark"

That environment variable is called out here: https://hatch.pypa.io/latest/how-to/environment/select-installer/#enabling-uv
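
Wired into the hatch.toml from the report, the fix would look something like this (a sketch; it assumes the rest of the config stays as above):

[envs.default]
installer = "uv"

[envs.test.scripts]
# HATCH_UV points at the uv binary used for the environment
# (per the docs linked above)
log_versions = "python --version && $HATCH_UV pip freeze | grep pyspark"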

mikita-sakalouski commented 3 weeks ago

@ofek Oh, man! That helps. After switching the command to python --version && {env:HATCH_UV} pip freeze | grep pyspark and setting

[tool.hatch.envs.default]
installer = "uv"

I'm getting:

──────────────────────────────────────────────── test.py3.8-pyspark33 ─────────────────────────────────────────────────
Python 3.8.19
pyspark==3.3.4
──────────────────────────────────────────────── test.py3.8-pyspark34 ─────────────────────────────────────────────────
Python 3.8.19
pyspark==3.4.3
──────────────────────────────────────────────── test.py3.9-pyspark33 ─────────────────────────────────────────────────
Python 3.9.19
pyspark==3.3.4
──────────────────────────────────────────────── test.py3.9-pyspark34 ─────────────────────────────────────────────────
Python 3.9.19
pyspark==3.4.3
──────────────────────────────────────────────── test.py3.10-pyspark33 ────────────────────────────────────────────────
Python 3.10.14
pyspark==3.3.4
──────────────────────────────────────────────── test.py3.10-pyspark34 ────────────────────────────────────────────────
Python 3.10.14
pyspark==3.4.3
──────────────────────────────────────────────── test.py3.10-pyspark35 ────────────────────────────────────────────────
Python 3.10.14
pyspark==3.5.1
──────────────────────────────────────────────── test.py3.11-pyspark35 ────────────────────────────────────────────────
Python 3.11.9
pyspark==3.5.1
──────────────────────────────────────────────── test.py3.12-pyspark35 ────────────────────────────────────────────────
Python 3.12.3
pyspark==3.5.1

Now I have to make it work without hardcoding Python versions in each env... but that is a totally different story.
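
For what it's worth, Hatch's matrix variable overrides can likely express the per-variant pins without the hand-written env tables or hardcoded Python versions. A rough, untested sketch in the hatch.toml style used above:

[[envs.test.matrix]]
python = ["3.10"]
pyspark = ["33", "34", "35"]

[envs.test.overrides]
# pin the PySpark range based on the pyspark matrix value
matrix.pyspark.extra-dependencies = [
    { value = "pyspark>=3.3,<3.4", if = ["33"] },
    { value = "pyspark>=3.4,<3.5", if = ["34"] },
    { value = "pyspark>=3.5", if = ["35"] },
]

Each generated environment would then inherit the matching extra dependency from its pyspark matrix value, and the Python version comes from the matrix itself.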