Closed kloczek closed 2 years ago
Just tested 1.10.10 and pytest is failing the same way.
From your path it looks like you're building your own pytest as well, but I'm guessing without installing it.
pydeps' requirements.txt file contains all requirements for building and testing the package, and it is expected that you'll run pip install -r requirements.txt before running pytest. The requirements.txt file contains the version of pytest that pydeps is tested with.
Adding PYTHONPATH to the package paths that pydeps searches doesn't seem correct, and this test has recently solved a number of issues with different site-packages paths on Linux and in containers, so I'm not too keen on removing it (it tests that the pydeps tests can find the path where pydeps' requirements.txt dependencies were installed).
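For context, the kind of check involved can be sketched by gathering the candidate site-packages paths Python itself reports (a minimal sketch; the helper name is mine, not pydeps' API):

```python
# Sketch (hypothetical helper, not pydeps' code): collect the paths
# Python reports as package install locations.
import site
import sysconfig

def package_search_paths():
    paths = set(site.getsitepackages())          # system site-packages
    paths.add(site.getusersitepackages())        # per-user site-packages
    paths.add(sysconfig.get_paths()["purelib"])  # current env's purelib
    return paths

print(sorted(package_search_paths()))
```

A test can then assert that the directory a known-installed module was loaded from is among these paths.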
To verify further, can you please run the following and let me know the output:
$> which pytest
$> python -c "import pytest;print(pytest.__file__)"
$> python -c "import site;print(site.getusersitepackages())"
$> python -c "import site;print(site.getsitepackages())"
Here is the result:
[tkloczko@barrel SPECS]$ which pytest
/usr/bin/pytest
[tkloczko@barrel SPECS]$ python3 -c "import pytest;print(pytest.__file__)"
/usr/lib/python3.8/site-packages/pytest/__init__.py
[tkloczko@barrel SPECS]$ python3 -c "import site;print(site.getusersitepackages())"
/home/tkloczko/.local/lib/python3.8/site-packages
[tkloczko@barrel SPECS]$ ls -l /home/tkloczko/.local/lib/python3.8/site-packages
ls: cannot access '/home/tkloczko/.local/lib/python3.8/site-packages': No such file or directory
[tkloczko@barrel SPECS]$ python3 -c "import site;print(site.getsitepackages())"
['/usr/local/lib64/python3.8/site-packages', '/usr/local/lib/python3.8/site-packages', '/usr/lib64/python3.8/site-packages', '/usr/lib/python3.8/site-packages']
Hmm .. that output tells me that I still have not removed /usr/local/* from site.getsitepackages() (but that is not relevant in this case :))
I can tell you that using that pytest I've managed to build the following number of packages:
[tkloczko@barrel SPECS]$ ls -1 python-*|wc -l; grep ^%pytest python-*|wc -l
652
600
I'm using pytest more than Fedora does :P
[tkloczko@barrel SPECS.fedora]$ ls -1 python-*|wc -l; grep ^%pytest python-*|wc -l
2637
373
So I think that may serve as a kind of proof that my pytest is OK (+/- eventual bugs in pytest itself :)). As for what I have installed of what is listed in requirements.txt:
[tkloczko@barrel SRPMS]$ pip freeze | egrep "setuptools|PyYAML|enum34|stdlib-list|coverage|pytest|pytest-cov|Sphinx"
coverage==6.0
nose2pytest==1.0.8
Pallets-Sphinx-Themes==2.0.1
pygments-pytest==2.2.0
pytest==6.2.5
pytest-aiohttp==0.3.0
pytest-asyncio==0.15.1
pytest-benchmark==3.4.1
pytest-black==0.3.12
pytest-cases==3.6.4
pytest-cov==2.12.1
pytest-datadir==1.3.1
pytest-expect==1.1.0
pytest-fixture-config==1.7.0
pytest-flake8==1.0.7
pytest-forked==1.3.0
pytest-freezegun==0.4.2
pytest-isort==2.0.0
pytest-localserver==0.5.0
pytest-mock==3.6.1
pytest-profiling==1.7.0
pytest-randomly==3.8.0
pytest-regressions==2.2.0
pytest-rerunfailures==9.1.1
pytest-runner==5.3.2
pytest-shutil==1.7.0
pytest-subtests==0.5.0
pytest-timeout==1.4.2
pytest-toolbox==0.5
pytest-tornado==0.8.1
pytest-tornasync==0.6.0.post2
pytest-trio==0.7.0
pytest-twisted==1.13.3
pytest-virtualenv==1.7.0
pytest-xdist==2.3.0
pytest-xprocess==0.18.1
pytest_check==1.0.1
PyYAML==5.4.1
setuptools-git==1.2
setuptools-rust==0.12.1
setuptools-scm==6.3.2
setuptools-scm-git-archive==1.1
Sphinx==4.2.0
stdlib-list==0.8.0
unittest2pytest==0.4
So the only module from requirements.txt I don't have installed is enum34,
but that is for python < 3.4:
[tkloczko@barrel pydeps-1.10.10]$ grep -r enum34
requirements.txt:enum34==1.0.4; python_version < "3.4"
setup.py: 'enum34; python_version < "3.4"',
From these two:
[tkloczko@barrel SPECS]$ python3 -c "import pytest;print(pytest.__file__)"
/usr/lib/python3.8/site-packages/pytest/__init__.py
[tkloczko@barrel SPECS]$ python3 -c "import site;print(site.getsitepackages())"
[..., '/usr/lib/python3.8/site-packages']
pydeps definitely ought to be able to find it.
Do you have a /usr/lib/python3.8/site-packages/pytest-6.2.5.dist-info
directory?
You can check what pydeps finds by running the pydeps/package_names.py
file as a standalone script. For reference here is the output after running pip install -r requirements.txt
in a new virtualenv (Python 3.5.4):
(pydeps) go|c:\srv\lib\code\pydeps> python pydeps/package_names.py
{
"Crypto": "pycrypto",
"_distutils_hack": "setuptools",
"_pytest": "pytest",
"alabaster": "alabaster",
"atomicwrites": "atomicwrites",
"attr": "attrs",
"babel": "Babel",
"certifi": "certifi",
"chardet": "chardet",
"colorama": "colorama",
"coverage": "coverage",
"dkconfig": "dkconfig",
"docutils": "docutils",
"easy_install": "setuptools",
"idna": "idna",
"imagesize": "imagesize",
"importlib_metadata": "importlib-metadata",
"iniconfig": "iniconfig",
"jinja2": "Jinja2",
"lockfile": "lockfile",
"markupsafe": "MarkupSafe",
"packaging": "packaging",
"pathlib2": "pathlib2",
"pip": "pip",
"pkg_resources": "setuptools",
"pluggy": "pluggy",
"py": "py",
"pygments": "Pygments",
"pyparsing": "pyparsing",
"pytest": "pytest",
"pytest_cov": "pytest-cov",
"pytz": "pytz",
"requests": "requests",
"setuptools": "setuptools",
"six": "six",
"snowballstemmer": "snowballstemmer",
"sphinx": "Sphinx",
"sphinxcontrib": "sphinxcontrib-serializinghtml",
"stdlib_list": "stdlib-list",
"toml": "toml",
"urllib3": "urllib3",
"wheel": "wheel",
"yaml": "PyYAML",
"zipp": "zipp"
}
Just to get a baseline...
If you create a new virtualenv, install the requirements.txt file, and run pytest... does that work flawlessly? Using the Windows version of virtualenvwrapper, that would be (Linux should be identical if you have virtualenvwrapper installed):
(dev35) go|c:\srv\lib\code\pydeps> mkvirtualenv pydeps
created virtual environment CPython3.5.4.final.0-32 in 4413ms
creator CPython3Windows(dest=c:\srv\venv\pydeps, clear=False, global=False)
seeder FromAppData(download=False, setuptools=bundle, pip=bundle, wheel=bundle, via=copy, app_data_dir=C:\Users\xxx\AppData\Local\pypa\virtualenv)
added seed packages: pip==20.3.4, setuptools==50.3.2, wheel==0.36.2
activators XonshActivator,BashActivator,FishActivator,BatchActivator,PythonActivator,PowerShellActivator
(pydeps) go|c:\srv\lib\code\pydeps> pip install -r requirements.txt
...
(pydeps) go|c:\srv\lib\code\pydeps> pytest
================================================= test session starts =================================================
platform win32 -- Python 3.5.4, pytest-6.1.2, py-1.10.0, pluggy-0.13.1
rootdir: c:\srv\lib\code\pydeps
plugins: cov-2.12.1
collected 41 items
tests\test_cli.py .... [ 9%]
tests\test_colors.py ..... [ 21%]
tests\test_cycles.py . [ 24%]
tests\test_dep2dot.py . [ 26%]
tests\test_dot.py ..... [ 39%]
tests\test_externals.py . [ 41%]
tests\test_file.py ... [ 48%]
tests\test_funny_names.py . [ 51%]
tests\test_json.py . [ 53%]
tests\test_package_names.py . [ 56%]
tests\test_py2dep.py . [ 58%]
tests\test_relative_imports.py ....... [ 75%]
tests\test_render_context.py ... [ 82%]
tests\test_skinny_package.py . [ 85%]
tests\test_skip.py ...... [100%]
================================================= 41 passed in 13.19s =================================================
[..] pydeps definitely ought to be able to find it.
Do you have a
/usr/lib/python3.8/site-packages/pytest-6.2.5.dist-info
directory?

[tkloczko@barrel SPECS]$ ls -la /usr/lib/python3.8/site-packages/pytest-6.2.5-py3.8.egg-info
total 32
drwxr-xr-x  1 root root   184 Aug 30 21:03 .
drwxr-xr-x. 1 root root 49656 Oct  5 02:00 ..
-rw-r--r--  1 root root     1 Aug 30 20:48 dependency_links.txt
-rw-r--r--  1 root root    78 Aug 30 20:48 entry_points.txt
-rw-r--r--  1 root root     1 Aug 30 20:48 not-zip-safe
-rw-r--r--  1 root root  6915 Aug 30 20:48 PKG-INFO
-rw-r--r--  1 root root   243 Aug 30 20:48 requires.txt
-rw-r--r--  1 root root  2019 Aug 30 20:48 SOURCES.txt
-rw-r--r--  1 root root    15 Aug 30 20:48 top_level.txt
You can check what pydeps finds by running the
pydeps/package_names.py
file as a standalone script. For reference here is the output after running pip install -r requirements.txt
in a new virtualenv (Python 3.5.4):

[tkloczko@barrel pydeps-1.10.10]$ python3 pydeps/package_names.py
ERR: sphinxcontrib_github_alt-1.2.dist-info has not top_level.txt
ERR: sphinx_inline_tabs-2021.4.11b9.dist-info has not top_level.txt
ERR: flit_core-3.3.0.dist-info has not top_level.txt
ERR: flit-3.3.0.dist-info has not top_level.txt
ERR: threadpoolctl-2.2.0.dist-info has not top_level.txt
ERR: tomli-1.2.1.dist-info has not top_level.txt
ERR: aiofiles-0.7.0.dist-info has not top_level.txt
ERR: pastel-0.2.1.dist-info has not top_level.txt
ERR: crashtest-0.3.1.dist-info has not top_level.txt
ERR: clikit-0.6.2.dist-info has not top_level.txt
ERR: cleo-0.8.1.dist-info has not top_level.txt
ERR: cachy-0.3.0.dist-info has not top_level.txt
ERR: installer-0.2.2.dist-info has not top_level.txt
ERR: rich-10.11.0.dist-info has not top_level.txt
ERR: cssselect2-0.4.1.dist-info has not top_level.txt
ERR: poetry_core-1.0.7.dist-info has not top_level.txt
ERR: PyQt5_sip-4.19.22.dist-info has not top_level.txt
ERR: PyQt5-5.14.2.dist-info has not top_level.txt
{
"Crypto": "pycrypto",
"_distutils_hack": "setuptools",
"pip": "pip",
"pkg_resources": "setuptools",
"pyparsing": "pyparsing",
"setuptools": "setuptools",
"yaml": "PyYAML"
}
However ..
[tkloczko@barrel pydeps-1.10.10]$ pip freeze | egrep -w "sphinxcontrib_github_alt|sphinx_inline_tabs|flit_core|flit|threadpoolctl|tomli|aiofiles|pastel|crashtest|clikit|cleo|cachy|installer|rich|cssselect2|poetry_core|PyQt5_sip|PyQt5"
aiofiles==0.7.0
cachy==0.3.0
cleo==0.8.1
clikit==0.6.2
crashtest==0.3.1
cssselect2==0.4.1
flit==3.3.0
flit_core==3.3.0
installer==0.2.2
pastel==0.2.1
PyQt5==5.14.2
PyQt5_sip==4.19.22
rich==10.11.0
sphinx_inline_tabs @ file:///home/tkloczko/rpmbuild/BUILD/sphinx-inline-tabs-2021.04.11.beta9/dist/sphinx_inline_tabs-2021.4.11b9-py3-none-any.whl
sphinxcontrib_github_alt==1.2
threadpoolctl==2.2.0
tomli==1.2.1
Checking module availability by checking .dist-info metadata seems a bit overkill.
So in other words that assertion is failing only because you are expecting that all those required modules will be installed with .dist-info metadata for every required module. Am I right?
Checking module availability by checking .dist-info metadata seems a bit overkill.
That would be true ;-) I'm using it to find the package name from the module name, however (e.g. you import yaml but pip install PyYAML).
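To illustrate the idea (a minimal sketch, not pydeps' actual code; the function name is mine): the mapping can be derived from each distribution's top_level.txt metadata file, which lists the importable top-level modules the package provides.

```python
# Sketch of the module-name -> package-name idea (hypothetical helper,
# not pydeps' API): read top_level.txt from each .dist-info directory.
import os
import tempfile

def module_to_package(site_packages):
    mapping = {}
    for entry in os.listdir(site_packages):
        if not entry.endswith(".dist-info"):
            continue
        package = entry[:-len(".dist-info")].rsplit("-", 1)[0]
        top = os.path.join(site_packages, entry, "top_level.txt")
        if os.path.exists(top):
            with open(top) as fp:
                for module in fp.read().split():
                    mapping[module] = package
    return mapping

# demo against a throwaway directory
with tempfile.TemporaryDirectory() as sp:
    info = os.path.join(sp, "PyYAML-5.4.1.dist-info")
    os.mkdir(info)
    with open(os.path.join(info, "top_level.txt"), "w") as fp:
        fp.write("yaml\n_yaml\n")
    print(module_to_package(sp))  # -> {'yaml': 'PyYAML', '_yaml': 'PyYAML'}
```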
This part though:
Do you have a /usr/lib/python3.8/site-packages/pytest-6.2.5.dist-info directory?
[tkloczko@barrel SPECS]$ ls -la /usr/lib/python3.8/site-packages/pytest-6.2.5-py3.8.egg-info ...
Led me to https://github.com/pypa/pip/issues/4611 - in other words, python setup.py install creates an .egg-info directory, while pip install ... creates a .dist-info directory. Pydeps only looks for .dist-info.
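The fix amounts to accepting both metadata layouts when scanning site-packages; a minimal sketch of the distinction (the helper is hypothetical, not pydeps' actual code):

```python
# Sketch (hypothetical helper, not pydeps' code): pip writes
# <pkg>-<ver>.dist-info, while `setup.py install` writes
# <pkg>-<ver>-pyX.Y.egg-info -- a scanner must accept both suffixes.
import os
import tempfile

def find_metadata_dirs(site_packages):
    for name in sorted(os.listdir(site_packages)):
        if name.endswith((".dist-info", ".egg-info")):
            yield name

# demo against a throwaway directory
with tempfile.TemporaryDirectory() as sp:
    for d in ("pytest-6.2.5.dist-info",
              "pytest-6.2.5-py3.8.egg-info",
              "yaml"):
        os.mkdir(os.path.join(sp, d))
    print(list(find_metadata_dirs(sp)))
    # -> ['pytest-6.2.5-py3.8.egg-info', 'pytest-6.2.5.dist-info']
```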
So in other words that assertion is failing only because you are expectiong that all those required modules will be installed with
.dist-info
metadata for every required module. Am I right?
Yes, you are - in particular pydeps is here trying to test that it can find a package it knows is installed (i.e. the module currently running the test when the assertion happens).
I haven't done a python setup.py install in years, so I didn't even think about .egg-info directories. It's definitely a bug in pydeps - give me a little bit and I'll push out a fixed version.
(ps: I'm liking this test even more now ;-) )
Pydeps 1.10.11 is on PyPI, which should fix the issue.
Just tested that.
+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-pydeps-1.10.11-2.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-pydeps-1.10.11-2.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/pytest -ra
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.10.0, pluggy-0.13.1
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
Using --randomly-seed=3235155579
rootdir: /home/tkloczko/rpmbuild/BUILD/pydeps-1.10.11
plugins: forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, aspectlib-1.5.2, toolbox-0.5, rerunfailures-9.1.1, requests-mock-1.9.3, cov-2.12.1, flaky-3.7.0, benchmark-3.4.1, xdist-2.3.0, pylama-7.7.1, datadir-1.3.1, regressions-2.2.0, xprocess-0.18.1, black-0.3.12, asyncio-0.15.1, subtests-0.5.0, isort-2.0.0, hypothesis-6.14.6, mock-3.6.1, profiling-1.7.0, randomly-3.8.0, nose2pytest-1.0.8, pyfakefs-4.5.1, tornado-0.8.1, twisted-1.13.3, aiohttp-0.3.0, localserver-0.5.0, anyio-3.3.1, trio-0.7.0, cases-3.6.4, yagot-0.5.0, Faker-8.16.0
collected 41 items
tests/test_skip.py ...... [ 14%]
tests/test_package_names.py . [ 17%]
tests/test_relative_imports.py ....... [ 34%]
tests/test_skinny_package.py . [ 36%]
tests/test_funny_names.py . [ 39%]
tests/test_json.py . [ 41%]
tests/test_render_context.py ... [ 48%]
tests/test_file.py ... [ 56%]
tests/test_dot.py ..... [ 68%]
tests/test_py2dep.py . [ 70%]
tests/test_externals.py . [ 73%]
tests/test_colors.py ..... [ 85%]
tests/test_cycles.py . [ 87%]
tests/test_cli.py .... [ 97%]
tests/test_dep2dot.py . [100%]
=========================================================================== 41 passed in 11.10s ============================================================================
pytest-xprocess reminder::Be sure to terminate the started process by running 'pytest --xkill' if you have not explicitly done so in your fixture with 'xprocess.getinfo(<process_name>).terminate()'.
+ RPM_EC=0
Thank you :)
BTW .. I think that tests/test_package_names.py should be removed and proper build dependencies should only be declared in project.toml/setup.cfg :)
Surely you mean pyproject.toml :-D -- I'll never understand why they chose such an ugly and backwards file format that doesn't even have support in core Python.. sigh. It'll mean adding two files in the root of the project, and there is a small forest of config files there already - I'd like fewer root files, not more. ...and from what I can see it doesn't buy me anything...
Sorry for the rant, Python packaging is a continual clusterfsck.
To your comment: pytest is not a build dependency, it is a test dependency ;-) Also, test_package_names.py does not test dependencies, it tests the ability to find installed packages. Pydeps is after all, a tool to find and display dependencies ;-)
It also seems to be a good test, since its bugkill-number is pretty high...
Your rant is probably nothing compared with mine after packaging +650 python modules :P
Checking build-time dependencies should be an inherent part of the Python build/maintenance tooling. Because that part is optional, the situation in many cases is really messy. There is also no mechanism to confirm whether declared dependencies are too narrow or too wide :/ (and that part IMO should be quite easy to verify just by checking all import lines and checking the graph of loaded module dependencies).
..after packaging +650 python modules
Ouch.
... and that part IMO should be quite easy to verify only by checking all import lines ...
The fact that there isn't an easy way to do so is a large reason why pydeps exists. I'll be overjoyed if Python can some day do this itself, since even pydeps can't find all imports. Pydeps scans the import opcodes in the .pyc files, which means it will never find the especially dynamic imports, e.g.:
# funky.py
import yaml

def funky():
    import alabaster

def funkier():
    import imp
    fp, path, descr = imp.find_module('apipkg')
    imp.load_module('apipkg', fp, path, descr)
pydeps (v1.10.12 - I had to fix a couple of bugs) can do the following (assuming pip install PyYAML alabaster apipkg):
$> pydeps --externals funky.py
[
"alabaster",
"yaml"
]
(no apipkg since the imp.load_module call doesn't generate any import bytecode).
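The opcode-scanning approach itself can be sketched with the stdlib dis module (a simplified illustration, not pydeps' actual implementation):

```python
# Simplified sketch of bytecode import-scanning (not pydeps' actual
# code): collect IMPORT_NAME instructions, recursing into nested code
# objects -- the imp.load_module call leaves no such instruction behind.
import dis

def imported_names(code):
    names = {i.argval for i in dis.get_instructions(code)
             if i.opname == "IMPORT_NAME"}
    for const in code.co_consts:
        if hasattr(const, "co_code"):  # nested function/class bodies
            names |= imported_names(const)
    return names

src = (
    "import yaml\n"
    "def funky():\n"
    "    import alabaster\n"
)
code = compile(src, "<funky>", "exec")
print(sorted(imported_names(code)))  # -> ['alabaster', 'yaml']
```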
With some fancy options this can be turned into
$> pydeps --show-deps --noshow funky.py | python -m pydeps.tools.pydeps2requirements
alabaster # from: funky
PyYAML # from: funky
(notice the PyYAML instead of yaml in the last line ;-) )
pydeps2requirements is extracted from an internal tool that does a bit more (e.g. correctly orders all -e ../pkgname lines in the requirements.txt file - which turns out to be surprisingly important..)
Hmm .. initially I packaged pydeps only because of other modules' dependencies; however, I just realised that I can probably try to use your module to intercept actual dependencies 😋 or at least verify what I have in BuildRequires.
Do you maybe have some suggestions about how I can generate the build, install and test suite dependencies of a freshly packaged module? 🤔
Especially something which would help keep test suite dependencies up to date would be very helpful.
603
654
As you see, I'm using pytest in +85% of all test suites.
That seems like a moderately hard problem. You can get all modules directly imported in the tests by (--externals: produce a json list of all directly imported modules; --include-missing: include modules that are not installed):
/work/dk-tasklib $> pydeps --externals --include-missing tests
[
"dktasklib",
"invoke",
"pytest",
"yamldirs"
]
that doesn't show transitive dependencies.
You can get the output of the dependency analysis as json by:
(--no-show: don't create/show the graph; --max-bacon: how many hops to follow, 0=infinite, 2=default)
/work/dk-tasklib $> pydeps --no-show --show-deps --max-bacon=0 tests
{
...
"dktasklib.version": {
"bacon": 2,
"imported_by": [
"dktasklib.entry_points.taskbase",
"dktasklib.jstools",
"dktasklib.lessc",
"tests.test_dktasklib_import",
"tests.test_version"
],
"imports": [
"dktasklib",
"dktasklib.concat",
"dktasklib.package",
"dktasklib.wintask",
"invoke"
],
"name": "dktasklib.version",
"path": "c:\\srv\\lib\\dk-tasklib\\dktasklib\\version.py"
},
...
I'm not sure how that will help you though...
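One thing the JSON above does enable is computing transitive dependencies by post-processing it; a small sketch, using a made-up miniature of the structure shown:

```python
# Sketch: compute the transitive import closure from --show-deps JSON.
# The input here is a made-up miniature of the structure shown above.
import json

deps_json = """{
  "tests.test_version": {"imports": ["dktasklib.version"]},
  "dktasklib.version": {"imports": ["dktasklib", "invoke"]},
  "dktasklib": {"imports": []},
  "invoke": {"imports": []}
}"""

def transitive_imports(deps, start):
    """Walk the 'imports' edges from `start`, collecting every module
    reachable through any number of hops."""
    seen, stack = set(), [start]
    while stack:
        mod = stack.pop()
        for dep in deps.get(mod, {}).get("imports", []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

deps = json.loads(deps_json)
print(sorted(transitive_imports(deps, "tests.test_version")))
# -> ['dktasklib', 'dktasklib.version', 'invoke']
```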
For our packages, each is tested by essentially:
mkvirtualenv my-package
git clone https://github.com/.../my-package
cd my-package
pip install -U wheel packaging twine
pip install -r requirements.txt
pip install -e .
pytest tests
python setup.py sdist bdist_wheel
twine upload dist/*
which gives each package the opportunity to try out new versions of shared packages.
We have a global-requirements.txt file that pins exact versions of all common external packages (and their transitive dependencies), and a tool that updates all local (i.e. our own) packages' requirements.txt to use the versions pinned in the global-requirements.txt file.
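The pin-propagation step can be sketched roughly like this (hypothetical and much simpler than the real tool; it only rewrites exact pkg==version lines):

```python
# Hypothetical sketch of the "update local requirements from global
# pins" idea: rewrite pkg==x.y lines to the globally pinned version.
import re

def load_pins(text):
    """Parse pkg==version lines into a {name_lower: version} dict."""
    pins = {}
    for line in text.splitlines():
        m = re.match(r"([A-Za-z0-9._-]+)==(\S+)", line)
        if m:
            pins[m.group(1).lower()] = m.group(2)
    return pins

def apply_pins(requirements, pins):
    """Return requirements text with pinned versions swapped in."""
    out = []
    for line in requirements.splitlines():
        m = re.match(r"([A-Za-z0-9._-]+)==\S+", line)
        if m and m.group(1).lower() in pins:
            out.append(f"{m.group(1)}=={pins[m.group(1).lower()]}")
        else:
            out.append(line)
    return "\n".join(out)

global_reqs = "PyYAML==5.4.1\npytest==6.2.5\n"
local_reqs = "PyYAML==5.3\nrequests==2.25.0\n"
print(apply_pins(local_reqs, load_pins(global_reqs)))
# PyYAML==5.4.1
# requests==2.25.0
```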
On the production server we download all external wheels to a wheelhouse directory with our getwheel.sh (i.e. ./getwheel.sh -r global-requirements.txt):
#!/bin/bash
# Build wheels for the given requirements in a temp dir, then copy them
# to the shared wheelhouse.
tmp_dir=$(mktemp -d -t wh-XXXXXXXX)
pip wheel --wheel-dir="${tmp_dir}" "$@"
cp "${tmp_dir}"/* /srv/wheelhouse
rm -rf "${tmp_dir}"
If we've forgotten to pin a transitive dependency this will potentially give us wheels for multiple versions of that dependency.
We install the global-requirements.txt before any of our packages, only from previously downloaded wheels and without any dependencies:
pip install -r global-requirements.txt --upgrade --no-deps --no-index --find-links=file:///srv/wheelhouse
and our packages are installed similarly
pip install my-package --upgrade --no-deps --no-index --find-links=file:///srv/wheelhouse
The idea is that no package install will upgrade a dependency that another package depends on (i.e. a manual solution to a dll-hell type problem).
Updating a package in global-requirements.txt, and its up/down dependencies, is a very manual and delicate process (we currently have 202 external and 146 local packages).
Our CI runs on docker images that have global-requirements.txt pre-installed, which give us some confidence that we don't break production on a deploy.
Production runs in a virtualenv so all our careful preparations aren't flushed by a careless sysadm doing a yum install...
If I've understood you correctly, you're wanting a way to create a version of our global-requirements.txt that will work for system packages..? I have, unfortunately, not found a way to do that. We usually end up searching upgrade documentation for a package, or looking at the tests, to figure out which versions of transitive dependencies it has been tested with, trying to find a combination that will work for all 202 external packages. This is especially fun for packages (like django and django-cms) that have a very loose definition of "semver" and "backwards compatible".
Just retested 1.10.12 and it looks like pytest is no longer failing 😄 Thank you, and closing.
I'm trying to package your module as an rpm package, so I'm using the typical build, install and test cycle used for building packages from a non-root account.