PXMYH / helios

Real Estate Rental Info Data Collector
https://beast-helios.herokuapp.com/
MIT License

Bump scrapy from 1.6.0 to 1.7.1 #64

Closed. dependabot-preview[bot] closed this pull request 5 years ago.

dependabot-preview[bot] commented 5 years ago

Bumps scrapy from 1.6.0 to 1.7.1.
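The underlying change is a one-line version bump in the project's dependency manifest. Assuming a pip-style `requirements.txt` (the actual file is not shown on this page, so the path and prior pin are illustrative), the diff would amount to:

```
# requirements.txt (hypothetical; the actual manifest is not shown here)
scrapy==1.7.1  # previously: scrapy==1.6.0
```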

Release notes

*Sourced from [scrapy's releases](https://github.com/scrapy/scrapy/releases).*

> ## 1.7.0
>
> Highlights:
>
> * Improvements for crawls targeting multiple domains
> * A cleaner way to pass arguments to callbacks
> * A new class for JSON requests
> * Improvements for rule-based spiders
> * New features for feed exports
>
> [See the full change log](https://docs.scrapy.org/en/1.7/news.html)
Changelog

*Sourced from [scrapy's changelog](https://github.com/scrapy/scrapy/blob/master/docs/news.rst).*

> Release notes
> =============
>
> Scrapy 1.x will be the last series supporting Python 2. Scrapy 2.0, planned for Q4 2019 or Q1 2020, will support **Python 3 only**.
>
> Scrapy 1.7.0 (2019-07-18)
> -------------------------
>
> Make sure you install Scrapy 1.7.1. The Scrapy 1.7.0 package on PyPI is the result of an erroneous commit tagging and does not include all the changes described below.
>
> Highlights:
>
> - Improvements for crawls targeting multiple domains
> - A cleaner way to pass arguments to callbacks
> - A new class for JSON requests
> - Improvements for rule-based spiders
> - New features for feed exports
>
> ### Backward-incompatible changes
>
> - `429` is now part of the `RETRY_HTTP_CODES` setting by default.
>
>   This change is **backward incompatible**. If you don't want to retry `429`, you must override `RETRY_HTTP_CODES` accordingly.
>
> - `scrapy.crawler.Crawler`, `CrawlerRunner.crawl` and `CrawlerRunner.create_crawler` no longer accept a `Spider` subclass instance; they only accept a `Spider` subclass now.
>
>   `Spider` subclass instances were never meant to work, and they were not working as one would expect: instead of using the passed `Spider` subclass instance, their `from_crawler` method was called to generate a new instance.
>
> - Non-default values for the `SCHEDULER_PRIORITY_QUEUE` setting may stop working. Scheduler priority queue classes now need to handle `Request` objects instead of arbitrary Python data structures.
>
>   See also the 1.7 deprecation removals below.
>
> ### New features
>
> - A new scheduler priority queue, `scrapy.pqueues.DownloaderAwarePriorityQueue`, may be enabled for a significant scheduling improvement on crawls targeting multiple web domains, at the cost of no `CONCURRENT_REQUESTS_PER_IP` support (issue 3520)
> - A new `Request.cb_kwargs` attribute provides a cleaner way to pass keyword arguments to callback methods (issues 1138, 3563)
> - A new `scrapy.http.JSONRequest` class offers a more convenient way to build JSON requests (issues 3504, 3505)
> - A `process_request` callback passed to the `Rule` constructor now receives the `Response` object that originated the request as its second argument (issue 3682)
>
> ... (truncated)
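As a concrete illustration of the two headline features above, here is a minimal Scrapy 1.7 spider sketch using `Request.cb_kwargs` and `JSONRequest`. This is not code from this repository; the spider name, URLs, and callback are hypothetical, and note that `JSONRequest` was later renamed to `JsonRequest` in Scrapy 1.8.

```python
import scrapy
from scrapy.http import JSONRequest  # renamed to JsonRequest in Scrapy 1.8


class ExampleSpider(scrapy.Spider):
    name = "example"  # hypothetical spider, not part of this repo
    start_urls = ["https://example.com/"]

    def parse(self, response):
        # cb_kwargs (new in 1.7) passes keyword arguments directly to the
        # callback, replacing the older idiom of smuggling them via meta.
        yield scrapy.Request(
            response.urljoin("/listings?page=2"),
            callback=self.parse_page,
            cb_kwargs={"page_number": 2},
        )

        # JSONRequest (new in 1.7) serializes `data` to a JSON body and sets
        # the Content-Type header; the method defaults to POST when data is given.
        yield JSONRequest(
            "https://example.com/api/search",
            data={"query": "rentals", "page": 1},
        )

    def parse_page(self, response, page_number):
        # The extra keyword argument arrives as a plain parameter.
        self.logger.info("parsed page %d", page_number)
```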
Commits

- [`951bc96`](https://github.com/scrapy/scrapy/commit/951bc96f5d90195b87752840be871994caa5ab64) Bump version: 1.7.0 → 1.7.1
- [`ae4eab9`](https://github.com/scrapy/scrapy/commit/ae4eab9843752e7cf75420a5d4f4fa58f8da8e50) Cover the 1.7.1 PyPI repackaging in the release notes
- [`4e23d70`](https://github.com/scrapy/scrapy/commit/4e23d70dd34c180d19c4751265005cea6da43927) Bump version: 1.6.0 → 1.7.0
- [`a94b5be`](https://github.com/scrapy/scrapy/commit/a94b5bef3a6ae658ec58a9a17bd149453aa855a1) Write the 1.7 release notes and cover dropping Python 2 support in the upcomi...
- [`44eb21a`](https://github.com/scrapy/scrapy/commit/44eb21aa51cc0212e68dc0709aaa5c10bfe64e7e) Merge pull request [#3882](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3882) from MagdalenaDeschner/master
- [`c44d49b`](https://github.com/scrapy/scrapy/commit/c44d49b238f1c6cfc07ffa2fbb65b267e19e381c) minor PEP8 style changes
- [`0d51f9c`](https://github.com/scrapy/scrapy/commit/0d51f9cc276fc6acaa815ee6d8af4e770a3ef9cf) [MRG+1] Wrong value of log_count/INFO in stats ([#3643](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3643))
- [`b2c013f`](https://github.com/scrapy/scrapy/commit/b2c013feca7a46aab48590a113ededda353c8d9d) Merge pull request [#3878](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3878) from elacuesta/mergedict_to_chainmap
- [`6660020`](https://github.com/scrapy/scrapy/commit/6660020ebb01c6eb240f56347d67106f308dd333) remove detailed description about individual settings
- [`d7074d8`](https://github.com/scrapy/scrapy/commit/d7074d86d26c936c6907dea7c550a4f251667d8b) Change condition to raise deprecation warning
- Additional commits viewable in [compare view](https://github.com/scrapy/scrapy/compare/1.6.0...1.7.1)
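Since this PR only bumps the version pin, the backward-incompatible `429` retry change listed above is the main thing for a reviewer to check. A minimal `settings.py` sketch, assuming a project that wants to keep the pre-1.7 retry behaviour and try the new scheduler queue (the setting names are Scrapy's documented settings; whether this repo needs either override is a maintainer judgment call, and these lines are not taken from this repository):

```python
# settings.py (illustrative overrides, not from this repository)

# Scrapy 1.7 adds 429 (Too Many Requests) to RETRY_HTTP_CODES by default.
# To keep the pre-1.7 behaviour, list the retryable codes without 429:
RETRY_HTTP_CODES = [500, 502, 503, 504, 522, 524, 408]

# Opt-in scheduler queue for crawls spanning many domains (new in 1.7);
# note it is incompatible with CONCURRENT_REQUESTS_PER_IP.
SCHEDULER_PRIORITY_QUEUE = "scrapy.pqueues.DownloaderAwarePriorityQueue"
```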


Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:

- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot ignore this [patch|minor|major] version` will close this PR and stop Dependabot creating any more for this minor/major version (unless you reopen the PR or upgrade to it). To ignore the version in this PR you can just close it
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
- `@dependabot badge me` will comment on this PR with code to add a "Dependabot enabled" badge to your readme

Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):

- Update frequency (including time of day and day of week)
- Pull request limits (per update run and/or open at any time)
- Out-of-range updates (receive only lockfile updates, if desired)
- Security updates (receive only security updates, if desired)

Finally, you can contact us by mentioning @dependabot.
codecov[bot] commented 5 years ago

Codecov Report

Merging #64 into master will not change coverage. The diff coverage is n/a.

Impacted file tree graph

@@          Coverage Diff           @@
##           master     #64   +/-   ##
======================================
  Coverage    2.38%   2.38%           
======================================
  Files          11      11           
  Lines         294     294           
======================================
  Hits            7       7           
  Misses        287     287

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Powered by Codecov. Last update 306f4c7...23fe15c.

dependabot-preview[bot] commented 5 years ago

Superseded by #66.