scrapy / scrapyd

A service daemon to run Scrapy spiders
https://scrapyd.readthedocs.io/en/stable/
BSD 3-Clause "New" or "Revised" License
2.96k stars 569 forks

Fix tests and CI #398

Closed Digenis closed 3 years ago

Digenis commented 3 years ago

Some problems have accumulated over the past 2 years.

Cryptography needs Rust to build. We won't build it; we'll use wheels in the CI instead. I tried installing the latest pip and wheel, but tox still tries to install source packages. I tried without tox and it works. I don't know why it doesn't use wheels in the CI environment.

Deprecation warnings from subprocesses are getting mixed up with a traceback that I wanted to parse in a test. I use mock to patch (wrap) Popen in this test to pass an additional argument to python that disables warnings.
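
A minimal sketch (Python 3 syntax) of that idea, not the exact patch from this PR: wrap the real Popen so the child interpreter gets `-W ignore` and its deprecation warnings stop polluting the output the test parses. The patch target `scrapyd.utils.Popen` is an assumption for illustration.

```python
from subprocess import Popen
from unittest import mock

def quiet_popen(args, *pargs, **kwargs):
    # args[0] is the python executable; insert -W ignore right after it
    # so the subprocess suppresses DeprecationWarnings in its stderr.
    return Popen([args[0], '-W', 'ignore'] + list(args[1:]), *pargs, **kwargs)

with mock.patch('scrapyd.utils.Popen', side_effect=quiet_popen):
    ...  # exercise the code that spawns the subprocess and parse its output
```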

There was a lot of deprecated code (py2.6) in the tests.

There were leftovers from some 2014 Ubuntu environment to clean up.

Python 3.4 support seems problematic, so I'm dropping it; it's not like we are dropping two consecutive releases, and we never supported 3.3 anyway. I'm also dropping PyPy2 because of the cryptography issue.

I'm also adding 3.7, 3.8 and 3.9 to see what happens with the build.

Digenis commented 3 years ago

There's one last problem with the package data in Python 2. Maybe it has something to do with unittest's runner.

Digenis commented 3 years ago

pkgutil.get_data uses scrapyd's __file__ attribute to build the path, but unittest's discover mode uses some temporary directory, and this triggers https://bugs.python.org/issue18416. That's why it works in py3.4+.
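
For context, a minimal sketch of the lookup that breaks; the resource name below (scrapyd's bundled default config) is used only for illustration:

```python
import pkgutil

# get_data() resolves the resource relative to the scrapyd package's
# __file__/__loader__; if the test runner has shifted that location into a
# temporary directory, the lookup fails on py2.7 (bpo-18416).
data = pkgutil.get_data('scrapyd', 'default_scrapyd.conf')
print(data.decode('utf-8'))
```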

But that's not all. When setting the top-level directory or removing the start directory in unittest's runner, the problem disappears, although the tests are still being run from a temp directory (though not as deeply nested).

I don't know why this fixes it in py2.7 but it seems sensible to just drop both flags and let unittest's runner discover tests on its own.

Digenis commented 3 years ago

That fixed it.

I'll add docs and fixup 15474a2 onto c02cda3.

I don't want to drop tox; maybe I'll add it back to the CI in another attempt.

codecov[bot] commented 3 years ago

Codecov Report

Merging #398 (453f881) into master (788ee1b) will increase coverage by 0.68%. The diff coverage is n/a.

:exclamation: Current head 453f881 differs from pull request most recent head 63aed40. Consider uploading reports for the commit 63aed40 to get more accurate results.

@@            Coverage Diff             @@
##           master     #398      +/-   ##
==========================================
+ Coverage   67.98%   68.66%   +0.68%     
==========================================
  Files          17       17              
  Lines         862      868       +6     
  Branches      104      104              
==========================================
+ Hits          586      596      +10     
+ Misses        246      241       -5     
- Partials       30       31       +1     
| Impacted Files | Coverage Δ |
|---|---|
| scrapyd/poller.py | 86.66% <0.00%> (+0.45%) :arrow_up: |
| scrapyd/website.py | 61.22% <0.00%> (+1.65%) :arrow_up: |
| scrapyd/sqlite.py | 88.88% <0.00%> (+1.85%) :arrow_up: |
| scrapyd/utils.py | 77.11% <0.00%> (+1.90%) :arrow_up: |

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Last update 788ee1b...63aed40.

Digenis commented 3 years ago

The last tests confirm that the source-package preference is not a problem with trial. Maybe tox is not properly configured.

Squashing all unittest and coverage commits onto c02cda3.

Digenis commented 3 years ago

Updated the changelog.

Dropped the setuptools hard-requirement commit; I'll need to read up on it before making such a decision.

Digenis commented 3 years ago

@my8100, any comments?

my8100 commented 3 years ago

@Digenis Great job! I use pytest and Circle CI in my own projects, so I'm not that familiar with tox and Travis CI. I'm considering creating a project to test any Scrapyd PR, with Scrapyd running as a service and requests as the client. That way, we can easily test all the pages and APIs it provides.
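
A rough sketch of that idea (not code from the follow-up PR; the URL and endpoints below are Scrapyd's defaults, used here as assumptions):

```python
import requests

BASE = 'http://127.0.0.1:6800'  # default Scrapyd bind address and port

def test_daemonstatus():
    # Requires a Scrapyd instance already running, e.g. `scrapyd` in another shell.
    resp = requests.get(BASE + '/daemonstatus.json')
    assert resp.status_code == 200
    assert resp.json()['status'] == 'ok'

def test_listprojects():
    resp = requests.get(BASE + '/listprojects.json')
    data = resp.json()
    assert data['status'] == 'ok'
    assert isinstance(data['projects'], list)
```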

Digenis commented 3 years ago

I'm satisfied with tox. There seems to be some problem with tox in Travis preventing it from downloading binary wheels. Or maybe it's some caching issue.

Anyway, merging so that other pull requests can go on.

my8100 commented 3 years ago

> Cryptography needs rust to build.

Here is how I fixed the Rust requirement when installing cryptography on Python 3.6:


copying src/cryptography/py.typed -> build/lib.linux-x86_64-3.6/cryptography
running build_ext
generating cffi module 'build/temp.linux-x86_64-3.6/_padding.c'
creating build/temp.linux-x86_64-3.6
generating cffi module 'build/temp.linux-x86_64-3.6/_openssl.c'
running build_rust
    =============================DEBUG ASSISTANCE=============================
    If you are seeing a compilation error please try the following steps to
    successfully install cryptography:
    1) Upgrade to the latest pip and try again. This will fix errors for most
       users. See: https://pip.pypa.io/en/stable/installing/#upgrading-pip
    2) Read https://cryptography.io/en/latest/installation.html for specific
       instructions for your platform.
    3) Check our frequently asked questions for more information:
       https://cryptography.io/en/latest/faq.html
    4) Ensure you have a recent Rust toolchain installed:
       https://cryptography.io/en/latest/installation.html#rust
    5) If you are experiencing issues with Rust for *this release only* you may
       set the environment variable `CRYPTOGRAPHY_DONT_BUILD_RUST=1`.
    =============================DEBUG ASSISTANCE=============================

error: can't find Rust compiler

If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.

To update pip, run:

    pip install --upgrade pip

and then retry package installation.

If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.

This package requires Rust >=1.41.0.

----------------------------------------

sudo apt-get update && sudo apt-get upgrade && sudo apt-get install rustc

Setting up rust-gdb (1.41.1+dfsg1-1~deb10u1) ...
Installing collected packages: attrs, zope.interface, incremental, six, Automat, constantly, idna, hyperlink, Twisted, pyasn1, pycparser, cffi, cryptography, pyasn1-modules, service-identity, w3lib, hyperframe, hpack, h2, queuelib, lxml, cssselect, parsel, pyOpenSSL, protego, itemadapter, PyDispatcher, jmespath, itemloaders, Scrapy, enum-compat, pip, coverage, pyparsing, packaging, zipp, typing-extensions, importlib-metadata, pluggy, iniconfig, py, toml, pytest, chardet, certifi, urllib3, requests
Running setup.py install for cryptography ... done
Running setup.py install for protego ... done
Running setup.py install for PyDispatcher ... done
Found existing installation: pip 18.1
Uninstalling pip-18.1:
Successfully uninstalled pip-18.1
Successfully installed Automat-20.2.0 PyDispatcher-2.0.5 Scrapy-2.5.0 Twisted-21.2.0 attrs-20.3.0 certifi-2020.12.5 cffi-1.14.5 chardet-4.0.0 constantly-15.1.0 coverage-5.5 cryptography-3.4.7 cssselect-1.1.0 enum-compat-0.0.3 h2-3.2.0 hpack-3.0.0 hyperframe-5.2.0 hyperlink-21.0.0 idna-2.10 importlib-metadata-3.10.1 incremental-21.3.0 iniconfig-1.1.1 itemadapter-0.2.0 itemloaders-1.0.4 jmespath-0.10.0 lxml-4.6.3 packaging-20.9 parsel-1.6.0 pip-21.0.1 pluggy-0.13.1 protego-0.1.16 py-1.10.0 pyOpenSSL-20.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.20 pyparsing-2.4.7 pytest-6.2.3 queuelib-1.5.0 requests-2.25.1 service-identity-18.1.0 six-1.15.0 toml-0.10.2 typing-extensions-3.7.4.3 urllib3-1.26.4 w3lib-1.22.0 zipp-3.4.1 zope.interface-5.4.0

my8100 commented 3 years ago

> @Digenis Great job! I use pytest and Circle CI in my own projects, so I'm not that familiar with tox and Travis CI. I'm considering creating a project to test any Scrapyd PR, with Scrapyd running as a service and requests as the client. That way, we can easily test all the pages and APIs it provides.

Implemented in #399

Digenis commented 3 years ago

The rust problem was solved by using binary wheels.
We should have been using binary wheels in the CI anyway.
pip preferred source packages only when using the CI with tox
but I will surely not blame tox for this.

I only took a quick look at the other PR;
I don't know why we should switch to Circle CI,
but I see you added end-to-end tests for HTTP auth,
which is definitely worth considering.

my8100 commented 3 years ago

> I don't know why we should switch to Circle CI, but I see you added end-to-end tests for HTTP auth, which is definitely worth considering.

It's better to enable both unit tests and end-to-end tests at the same time to ensure code quality.

my8100 commented 3 years ago

No need to use Circle CI; see #400.