PXMYH / helios

Real Estate Rental Info Data Collector
https://beast-helios.herokuapp.com/
MIT License
2 stars, 2 forks

Bump scrapy from 1.6.0 to 1.7.2 #66

Closed. dependabot-preview[bot] closed this PR 5 years ago.

dependabot-preview[bot] commented 5 years ago

Bumps scrapy from 1.6.0 to 1.7.2.
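
In practice this bump amounts to updating the version pin. A minimal sketch, assuming the project pins Scrapy in a `requirements.txt` (the exact filename in this repository is an assumption):

```diff
-scrapy==1.6.0
+scrapy==1.7.2
```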

Release notes
*Sourced from [scrapy's releases](https://github.com/scrapy/scrapy/releases).*

> ## 1.7.2
> Fix Python 2 support ([#3889](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3889), [#3893](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3893), [#3896](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3896))
>
> ## 1.7.0
> Highlights:
>
> * Improvements for crawls targeting multiple domains
> * A cleaner way to pass arguments to callbacks
> * A new class for JSON requests
> * Improvements for rule-based spiders
> * New features for feed exports
>
> [See the full change log](https://docs.scrapy.org/en/1.7/news.html)
Changelog
*Sourced from [scrapy's changelog](https://github.com/scrapy/scrapy/blob/master/docs/news.rst).*

> Scrapy 1.7.2 (2019-07-23)
> =========================
>
> Fix Python 2 support (3889, 3893, 3896).
>
> Scrapy 1.7.1 (2019-07-18)
> =========================
>
> Re-packaging of Scrapy 1.7.0, which was missing some changes in PyPI.
>
> Scrapy 1.7.0 (2019-07-18)
> =========================
>
> Make sure you install Scrapy 1.7.1. The Scrapy 1.7.0 package in PyPI is the result of an erroneous commit tagging and does not include all the changes described below.
>
> Highlights:
>
> - Improvements for crawls targeting multiple domains
> - A cleaner way to pass arguments to callbacks
> - A new class for JSON requests
> - Improvements for rule-based spiders
> - New features for feed exports
>
> Backward-incompatible changes
> -----------------------------
>
> - `429` is now part of the `RETRY_HTTP_CODES` setting by default.
>
>   This change is **backward incompatible**. If you don't want to retry `429`, you must override `RETRY_HTTP_CODES` accordingly.
>
> - `scrapy.crawler.Crawler`, `CrawlerRunner.crawl` and `CrawlerRunner.create_crawler` no longer accept a `scrapy.spiders.Spider` subclass instance; they only accept a `scrapy.spiders.Spider` subclass now.
>
>   `scrapy.spiders.Spider` subclass instances were never meant to work, and they were not working as one would expect: instead of using the passed `scrapy.spiders.Spider` subclass instance, their `from_crawler` method was called to generate a new instance.
>
> - Non-default values for the `SCHEDULER_PRIORITY_QUEUE` setting may stop working. Scheduler priority queue classes now need to handle `scrapy.http.Request` objects instead of arbitrary Python data structures.
>
>   See also the 1.7 deprecation removals below.
>
> New features
> ------------
>
> - A new scheduler priority queue, `scrapy.pqueues.DownloaderAwarePriorityQueue`, may be enabled for a significant scheduling improvement on crawls targeting multiple web domains, at the cost of no `CONCURRENT_REQUESTS_PER_IP` support (3520)
> - A new `Request.cb_kwargs` attribute provides a cleaner way to pass keyword arguments to callback methods (1138, 3563)
> - A new `scrapy.http.JSONRequest` class offers a more convenient way to build JSON requests (3504, 3505)
>
> ... (truncated)
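Since the `429` change above is backward incompatible, projects that relied on the old behaviour need an explicit override after upgrading. A minimal sketch of a project `settings.py` (the code list shown is the pre-1.7 default as best recalled; verify it against your Scrapy version's `default_settings`):

```python
# settings.py -- hypothetical project settings module.
# Scrapy 1.7 adds 429 ("Too Many Requests") to RETRY_HTTP_CODES by default.
# To keep the pre-1.7 behaviour (no automatic retry on 429), override the
# setting with a list that excludes 429:
RETRY_HTTP_CODES = [500, 502, 503, 504, 522, 524, 408]
```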
Commits
- [`1a289c1`](https://github.com/scrapy/scrapy/commit/1a289c15f8deb416efdb5f72742ec5416d751d98) Bump version: 1.7.1 → 1.7.2
- [`d0288da`](https://github.com/scrapy/scrapy/commit/d0288da221779a27f0e63f3f6c3c85ad488687ee) Merge pull request [#3898](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3898) from Gallaecio/1.7
- [`77beab3`](https://github.com/scrapy/scrapy/commit/77beab37968b2347f1e06fa63ed8881681330b80) Cover Scrapy 1.7.2 in the release notes
- [`c51cbcf`](https://github.com/scrapy/scrapy/commit/c51cbcfe813a655f47da30af2c760a32991e02d3) Fix ConfigParser import in py2
- [`079164c`](https://github.com/scrapy/scrapy/commit/079164c1b95b07a3923048e83eef79c58e184f8e) Cover Scrapy 1.7.1 in the release notes
- [`951bc96`](https://github.com/scrapy/scrapy/commit/951bc96f5d90195b87752840be871994caa5ab64) Bump version: 1.7.0 → 1.7.1
- [`ae4eab9`](https://github.com/scrapy/scrapy/commit/ae4eab9843752e7cf75420a5d4f4fa58f8da8e50) Cover the 1.7.1 PyPI repackaging in the release notes
- [`4e23d70`](https://github.com/scrapy/scrapy/commit/4e23d70dd34c180d19c4751265005cea6da43927) Bump version: 1.6.0 → 1.7.0
- [`a94b5be`](https://github.com/scrapy/scrapy/commit/a94b5bef3a6ae658ec58a9a17bd149453aa855a1) Write the 1.7 release notes and cover dropping Python 2 support in the upcomi...
- [`44eb21a`](https://github.com/scrapy/scrapy/commit/44eb21aa51cc0212e68dc0709aaa5c10bfe64e7e) Merge pull request [#3882](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3882) from MagdalenaDeschner/master
- Additional commits viewable in [compare view](https://github.com/scrapy/scrapy/compare/1.6.0...1.7.2)


Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot ignore this [patch|minor|major] version` will close this PR and stop Dependabot creating any more for this minor/major version (unless you reopen the PR or upgrade to it). To ignore the version in this PR you can just close it
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
- `@dependabot badge me` will comment on this PR with code to add a "Dependabot enabled" badge to your readme

Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):
- Update frequency (including time of day and day of week)
- Pull request limits (per update run and/or open at any time)
- Out-of-range updates (receive only lockfile updates, if desired)
- Security updates (receive only security updates, if desired)

Finally, you can contact us by mentioning @dependabot.
codecov[bot] commented 5 years ago

Codecov Report

Merging #66 into master will not change coverage. The diff coverage is n/a.

Impacted file tree graph

```diff
@@          Coverage Diff           @@
##           master     #66   +/-   ##
======================================
  Coverage    2.38%   2.38%
======================================
  Files          11      11
  Lines         294     294
======================================
  Hits            7       7
  Misses        287     287
```

Continue to review full report at Codecov.

Legend: Δ = absolute, <relative> (impact), ø = not affected, ? = missing data.

Powered by Codecov. Last update 306f4c7...ed5e16c. Read the comment docs.

dependabot-preview[bot] commented 5 years ago

Superseded by #72.