klane / databall

Betting on the NBA with data
https://klane.github.io/databall/
MIT License
134 stars 25 forks

Bump scrapy from 2.3.0 to 2.4.0 #446

Closed · dependabot-preview[bot] closed this 4 years ago

dependabot-preview[bot] commented 4 years ago

Bumps scrapy from 2.3.0 to 2.4.0.

Release notes

Sourced from scrapy's releases.

2.4.0

Highlights:

  • Python 3.5 support has been dropped.

  • The file_path method of media pipelines can now access the source item.

    This allows you to set a download file path based on item data (see the sketch after this list).

  • The new item_export_kwargs key of the FEEDS setting allows you to define keyword parameters to pass to item exporter classes.

  • You can now choose whether feed exports overwrite or append to the output file.

    For example, when using the crawl or runspider commands, you can use the -O option instead of -o to overwrite the output file.

  • Zstd-compressed responses are now supported if zstandard is installed.

  • In settings, where the import path of a class is required, it is now possible to pass a class object instead.
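
As an illustration of the file_path change above, here is a minimal sketch of a media pipeline that builds the download path from item data. The subclass name and the `player` item field are hypothetical, assuming the keyword-only `item` argument introduced in Scrapy 2.4:

```python
import os

from scrapy.pipelines.files import FilesPipeline


class ItemNamedFilesPipeline(FilesPipeline):
    """Hypothetical pipeline that stores downloads under a per-item folder."""

    def file_path(self, request, response=None, info=None, *, item=None):
        # As of Scrapy 2.4 the source item is passed in, so the path can
        # depend on item data (here a hypothetical `player` field).
        filename = os.path.basename(request.url)
        return f"files/{item['player']}/{filename}"
```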

See the full changelog

Changelog

Sourced from scrapy's changelog.

Scrapy 2.4.0 (2020-10-11)

Highlights:

  • Python 3.5 support has been dropped.

  • The file_path method of media pipelines <topics-media-pipeline> can now access the source item <topics-items>.

    This allows you to set a download file path based on item data.

  • The new item_export_kwargs key of the FEEDS setting allows you to define keyword parameters to pass to item exporter classes <topics-exporters> (see the first sketch after this list).

  • You can now choose whether feed exports <topics-feed-exports> overwrite or append to the output file.

    For example, when using the crawl or runspider commands, you can use the -O option instead of -o to overwrite the output file.

  • Zstd-compressed responses are now supported if zstandard is installed.

  • In settings, where the import path of a class is required, it is now possible to pass a class object instead (see the second sketch after this list).
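
As referenced above, a minimal sketch of a FEEDS entry using the new item_export_kwargs and overwrite options; the file name and the exporter keyword argument are illustrative, not taken from this project:

```python
# settings.py (sketch): per-feed options added or extended in Scrapy 2.4
FEEDS = {
    "output/items.json": {
        "format": "json",
        # Overwrite the file instead of appending; the command-line
        # equivalent is `-O output/items.json` instead of `-o`.
        "overwrite": True,
        # Keyword arguments forwarded to the item exporter class.
        "item_export_kwargs": {
            "export_empty_fields": True,
        },
    },
}
```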
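And a sketch of the class-object form for settings that previously required an import path string, assuming class objects are accepted wherever such a path was; MyCustomPipeline is a hypothetical project class:

```python
# settings.py (sketch): pass a class object where an import path string was required
from myproject.pipelines import MyCustomPipeline

ITEM_PIPELINES = {
    MyCustomPipeline: 300,  # instead of "myproject.pipelines.MyCustomPipeline": 300
}
```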

Modified requirements

  • Python 3.6 or greater is now required; support for Python 3.5 has been dropped

    As a result:

    • When using PyPy, PyPy 7.2.0 or greater is now required <faq-python-versions>
    • For Amazon S3 storage support in feed exports <topics-feed-storage-s3> or media pipelines <media-pipelines-s3>, botocore 1.4.87 or greater is now required
    • To use the images pipeline <images-pipeline>, Pillow 4.0.0 or greater is now required

    (4718, 4732, 4733, 4742, 4743, 4764)

Backward-incompatible changes

  • scrapy.downloadermiddlewares.cookies.CookiesMiddleware once again discards cookies defined in Request.headers <scrapy.http.Request.headers>.

    We decided to revert this bug fix, introduced in Scrapy 2.2.0, because it was reported that the current implementation could break existing code.

    If you need to set cookies for a request, use the Request.cookies <scrapy.http.Request> parameter (see the sketch below).

    A future version of Scrapy will include a new, better implementation of the reverted bug fix.

    (4717, 4823)
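
A minimal sketch of the recommended approach above, setting cookies through the Request.cookies parameter rather than Request.headers; the spider name, URL, and cookie values are hypothetical:

```python
import scrapy


class ExampleSpider(scrapy.Spider):
    name = "example"

    def start_requests(self):
        # With the CookiesMiddleware revert, cookies placed in Request.headers
        # are discarded again; pass them via the cookies parameter instead.
        yield scrapy.Request(
            "https://www.example.com/",
            cookies={"currency": "USD"},
            callback=self.parse,
        )

    def parse(self, response):
        yield {"title": response.css("title::text").get()}
```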

Commits
  • c340e72 Bump version: 2.3.0 → 2.4.0
  • 47eac83 Set a release date for Scrapy 2.4.0
  • 015c82b Scrapy 2.4 release notes (#4808)
  • da426fb Merge pull request #4839 from elacuesta/pytest_xfail_strict
  • 13ae17a Add xfail_strict=true to pytest.ini
  • 9f8c393 Merge pull request #4823 from elacuesta/cookies-revert-header
  • 45c06cf Merge pull request #4831 from starrify/downloadermw-support-zstd
  • 8fc4e2e Merge pull request #4836 from OfirD1/patch-1
  • ded9a5a Merge pull request #4835 from Gallaecio/about-url-support
  • 1a597d5 moved the sentence about processing pending requests when a spider is closed ...
  • Additional commits viewable in compare view


Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.

If all status checks pass, Dependabot will automatically merge this pull request.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:

  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot badge me` will comment on this PR with code to add a "Dependabot enabled" badge to your readme

Additionally, you can set the following in the `.dependabot/config.yml` file in this repo:

  • Update frequency
  • Automerge options (never/patch/minor, and dev/runtime dependencies)
  • Out-of-range updates (receive only lockfile updates, if desired)
  • Security updates (receive only security updates, if desired)