PXMYH / helios

Real Estate Rental Info Data Collector
https://beast-helios.herokuapp.com/
MIT License

Bump scrapy from 1.6.0 to 1.7.3 #72

Closed dependabot-preview[bot] closed 5 years ago

dependabot-preview[bot] commented 5 years ago

Bumps scrapy from 1.6.0 to 1.7.3.

Release notes

*Sourced from [scrapy's releases](https://github.com/scrapy/scrapy/releases).*

> ## 1.7.3
> Enforce lxml 4.3.5 or lower for Python 3.4 ([#3912](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3912), [#3918](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3918))
>
> ## 1.7.2
> Fix Python 2 support ([#3889](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3889), [#3893](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3893), [#3896](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3896))
>
> ## 1.7.0
> Highlights:
>
> * Improvements for crawls targeting multiple domains
> * A cleaner way to pass arguments to callbacks
> * A new class for JSON requests
> * Improvements for rule-based spiders
> * New features for feed exports
>
> [See the full change log](https://docs.scrapy.org/en/1.7/news.html)
Changelog

*Sourced from [scrapy's changelog](https://github.com/scrapy/scrapy/blob/master/docs/news.rst).*

> Scrapy 1.7.3 (2019-08-01)
> =========================
>
> Enforce lxml 4.3.5 or lower for Python 3.4 (3912, 3918).
>
> Scrapy 1.7.2 (2019-07-23)
> =========================
>
> Fix Python 2 support (3889, 3893, 3896).
>
> Scrapy 1.7.1 (2019-07-18)
> =========================
>
> Re-packaging of Scrapy 1.7.0, which was missing some changes in PyPI.
>
> Scrapy 1.7.0 (2019-07-18)
> =========================
>
> Make sure you install Scrapy 1.7.1. The Scrapy 1.7.0 package in PyPI is the result of an erroneous commit tagging and does not include all the changes described below.
>
> Highlights:
>
> - Improvements for crawls targeting multiple domains
> - A cleaner way to pass arguments to callbacks
> - A new class for JSON requests
> - Improvements for rule-based spiders
> - New features for feed exports
>
> Backward-incompatible changes
> -----------------------------
>
> - `429` is now part of the `RETRY_HTTP_CODES` setting by default.
>
>   This change is **backward incompatible**. If you don't want to retry `429`, you must override `RETRY_HTTP_CODES` accordingly.
>
> - `scrapy.crawler.Crawler`, `CrawlerRunner.crawl` and `CrawlerRunner.create_crawler` no longer accept a `scrapy.spiders.Spider` subclass instance; they only accept a `scrapy.spiders.Spider` subclass now.
>
>   Spider subclass instances were never meant to work, and they were not working as one would expect: instead of using the passed instance, its `from_crawler` method was called to generate a new instance.
>
> - Non-default values for the `SCHEDULER_PRIORITY_QUEUE` setting may stop working. Scheduler priority queue classes now need to handle `scrapy.http.Request` objects instead of arbitrary Python data structures.
>
>   See also the 1.7 deprecation removals below.
>
> New features
> ... (truncated)
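The `429` change called out above is the one most likely to affect a deployed scraper after this bump: Scrapy 1.7 now retries rate-limited responses by default. A hedged sketch of a project-level opt-out in `settings.py`; the code list shown mirrors the pre-1.7 default and should be verified against the installed Scrapy version:

```python
# settings.py — opt back out of retrying HTTP 429 after the 1.7 upgrade.
# This list is the assumed pre-1.7 default (429 was added in 1.7.0);
# confirm it against the Scrapy documentation for your version.
RETRY_HTTP_CODES = [500, 502, 503, 504, 522, 524, 408]
```

Conversely, leaving the new default in place is usually the right call for a rental-listing crawler, since target sites that return 429 often succeed on a delayed retry.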
Commits

- [`e22a8c8`](https://github.com/scrapy/scrapy/commit/e22a8c8c36e34ffaf12ef9e330624df654582605) Bump version: 1.7.2 → 1.7.3
- [`ee371cb`](https://github.com/scrapy/scrapy/commit/ee371cbcd8b6070e4bda9d7b03cca49c7cfe3d53) Cover Scrapy 1.7.3 in the release notes
- [`0287701`](https://github.com/scrapy/scrapy/commit/0287701dce827efa500145d9c770920b80132896) Pin Travis-ci build environment to previous default: Trusty
- [`c5b8cc4`](https://github.com/scrapy/scrapy/commit/c5b8cc4a87e5ae9fb88459532ce7d37d4b2efdb5) Merge pull request [#3918](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3918) from rennerocha/fix-lxml-py34
- [`5c916be`](https://github.com/scrapy/scrapy/commit/5c916bebe84aa08f3ca5bcedb685451fd737d6f4) Added constrain on lxml version based on Python version
- [`1a289c1`](https://github.com/scrapy/scrapy/commit/1a289c15f8deb416efdb5f72742ec5416d751d98) Bump version: 1.7.1 → 1.7.2
- [`d0288da`](https://github.com/scrapy/scrapy/commit/d0288da221779a27f0e63f3f6c3c85ad488687ee) Merge pull request [#3898](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3898) from Gallaecio/1.7
- [`77beab3`](https://github.com/scrapy/scrapy/commit/77beab37968b2347f1e06fa63ed8881681330b80) Cover Scrapy 1.7.2 in the release notes
- [`c51cbcf`](https://github.com/scrapy/scrapy/commit/c51cbcfe813a655f47da30af2c760a32991e02d3) Fix ConfigParser import in py2
- [`079164c`](https://github.com/scrapy/scrapy/commit/079164c1b95b07a3923048e83eef79c58e184f8e) Cover Scrapy 1.7.1 in the release notes
- Additional commits viewable in [compare view](https://github.com/scrapy/scrapy/compare/1.6.0...1.7.3)


Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:

- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot ignore this [patch|minor|major] version` will close this PR and stop Dependabot creating any more for this minor/major version (unless you reopen the PR or upgrade to it). To ignore the version in this PR you can just close it
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
- `@dependabot badge me` will comment on this PR with code to add a "Dependabot enabled" badge to your readme

Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):

- Update frequency (including time of day and day of week)
- Pull request limits (per update run and/or open at any time)
- Out-of-range updates (receive only lockfile updates, if desired)
- Security updates (receive only security updates, if desired)

Finally, you can contact us by mentioning @dependabot.
codecov[bot] commented 5 years ago

Codecov Report

Merging #72 into master will not change coverage. The diff coverage is n/a.

Impacted file tree graph

@@          Coverage Diff           @@
##           master     #72   +/-   ##
======================================
  Coverage    2.38%   2.38%           
======================================
  Files          11      11           
  Lines         294     294           
======================================
  Hits            7       7           
  Misses        287     287

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

Powered by Codecov. Last update bf3b072...860876a.