Changelog
*Sourced from [scrapy's changelog](https://github.com/scrapy/scrapy/blob/master/docs/news.rst).*
> Scrapy 1.8.0 (2019-10-28)
> =========================
>
> Highlights:
>
> - Dropped Python 3.4 support and updated minimum requirements; made Python 3.8 support official
> - New `Request.from_curl` class method (`scrapy.http.Request.from_curl`)
> - New `ROBOTSTXT_PARSER` and `ROBOTSTXT_USER_AGENT` settings
> - New `DOWNLOADER_CLIENT_TLS_CIPHERS` and `DOWNLOADER_CLIENT_TLS_VERBOSE_LOGGING` settings
>
> Backward-incompatible changes
> -----------------------------
>
> - Python 3.4 is no longer supported, and some of the minimum requirements of Scrapy have also changed:
>
> - cssselect 0.9.1
> - cryptography 2.0
> - lxml 3.5.0
> - pyOpenSSL 16.2.0
> - queuelib 1.4.2
> - service_identity 16.0.0
> - six 1.10.0
> - Twisted 17.9.0 (16.0.0 with Python 2)
> - zope.interface 4.1.3
>
> (3892)
>
> - `JSONRequest` is now called `scrapy.http.JsonRequest` for consistency with similar classes (3929, 3982)
> - If you are using a custom context factory (`DOWNLOADER_CLIENTCONTEXTFACTORY`), its `__init__` method must accept two new parameters: `tls_verbose_logging` and `tls_ciphers` (2111, 3392, 3442, 3450)
> - `scrapy.loader.ItemLoader` now turns the values of its input item into lists:
>
> ```python
> >>> item = MyItem()
> >>> item['field'] = 'value1'
> >>> loader = ItemLoader(item=item)
> >>> item['field']
> ['value1']
> ```
>
> This is needed to allow adding values to existing fields (`loader.add_value('field', 'value2')`).
>
> (3804, 3819, 3897, 3976, 3998, 4036)
>
> See also the 1.8 deprecation removals below.
>
> New features
> ------------
>
> - A new `Request.from_curl` class method (`scrapy.http.Request.from_curl`) allows creating a request from a cURL command (2985, 3862)
> - A new `ROBOTSTXT_PARSER` setting allows choosing which robots.txt parser to use. It includes built-in support for `RobotFileParser`, Protego (default), Reppy, and Robotexclusionrulesparser, and allows you to implement support for additional parsers (754, 2669, 3796, 3935, 3969, 4006)
> ... (truncated)
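The `ItemLoader` change described in the changelog (input-item values are wrapped in lists so that `add_value` can append to existing fields) can be illustrated with a minimal pure-Python sketch. The class below is an illustration of the documented semantics only, not scrapy's implementation:

```python
# Minimal sketch of the list-wrapping behaviour described above.
# Not scrapy's ItemLoader -- just an illustration of the semantics.
class SketchItemLoader:
    def __init__(self, item):
        self.item = item
        # Wrap every existing scalar value in a list so later
        # add_value calls can append to it.
        for key, value in item.items():
            if not isinstance(value, list):
                item[key] = [value]

    def add_value(self, field, value):
        self.item.setdefault(field, []).append(value)


item = {'field': 'value1'}
loader = SketchItemLoader(item)
print(item['field'])            # the single value is now a list
loader.add_value('field', 'value2')
print(item['field'])            # the new value was appended
```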
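`Request.from_curl` builds a request from a cURL command line. The standard-library sketch below shows the general idea of that kind of parsing; the function name, the options handled, and the returned dict shape are all assumptions for illustration, not scrapy's API:

```python
import shlex


def parse_curl(curl_command):
    """Rough sketch of cURL-command parsing: extracts the URL,
    -H headers, and the -X method. Not scrapy's implementation."""
    tokens = shlex.split(curl_command)
    assert tokens and tokens[0] == 'curl', 'not a curl command'
    url, method, headers = None, 'GET', {}
    it = iter(tokens[1:])
    for tok in it:
        if tok in ('-H', '--header'):
            name, _, value = next(it).partition(':')
            headers[name.strip()] = value.strip()
        elif tok in ('-X', '--request'):
            method = next(it)
        elif not tok.startswith('-'):
            url = tok
    return {'url': url, 'method': method, 'headers': headers}


print(parse_curl("curl 'https://example.com' -H 'Accept: text/html'"))
```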
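Of the robots.txt parsers the changelog lists, `RobotFileParser` ships with Python's standard library. A small self-contained example of feeding it rules directly (the rules and user agent here are made up):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, parsed in-memory instead of fetched.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())
print(parser.can_fetch('mybot', 'https://example.com/private/x'))  # False
print(parser.can_fetch('mybot', 'https://example.com/public/x'))   # True
```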
Commits
- [`be2e910`](https://github.com/scrapy/scrapy/commit/be2e910dd06ba4904e7b10eb5a7e3251e8dab099) Bump version: 1.7.0 → 1.8.0
- [`94f060f`](https://github.com/scrapy/scrapy/commit/94f060fcc84853f28f3f91b6dde1d61c8e19251e) Cover Scrapy 1.8.0 in the release notes ([#3952](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3952))
- [`18b808b`](https://github.com/scrapy/scrapy/commit/18b808b2e937d97df798fdd1f5dfabd68b8ce86b) Merge pull request [#4092](https://github-redirect.dependabot.com/scrapy/scrapy/issues/4092) from further-reading/master
- [`93e3dc1`](https://github.com/scrapy/scrapy/commit/93e3dc1b826e44d1a5a24fbb39c090ce426aa862) [test_downloadermiddleware_httpcache.py] Cleaning text
- [`b73d217`](https://github.com/scrapy/scrapy/commit/b73d217de5647a68c7b8dfda747cd3d0685c226d) [test_downloadermiddleware_httpcache.py] Fixing pytest mark behaviour
- [`7490903`](https://github.com/scrapy/scrapy/commit/74909030a55b59e3b858fc736b5b1f685d9596a6) [tox.ini] Removing obsolete py37 extra deps enviornment
- [`c51fb95`](https://github.com/scrapy/scrapy/commit/c51fb959e2985faf6f21fe7f03d2fb8160de064f) [test_downloadermiddleware_httpcache] Fixing pytest skip behaviour
- [`4432136`](https://github.com/scrapy/scrapy/commit/4432136ffff4d8af42f7a485c17ab7fbbb228078) [test_downloadermiddleware_httpcache] Fixing pytest skip behaviour
- [`9b47dc6`](https://github.com/scrapy/scrapy/commit/9b47dc6a703310d13c9470e50d4b14f81ee893c6) [travis, setup] Adding official python 3.8 support
- [`16bb3ac`](https://github.com/scrapy/scrapy/commit/16bb3ac20dae8b7c5fbccf4ab85b3a0393e7c55d) [test_downloadermiddleware_httpcache] Using skipif approach
- Additional commits viewable in [compare view](https://github.com/scrapy/scrapy/compare/1.7.4...1.8.0)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
- `@dependabot badge me` will comment on this PR with code to add a "Dependabot enabled" badge to your readme
Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):
- Update frequency (including time of day and day of week)
- Pull request limits (per update run and/or open at any time)
- Automerge options (never/patch/minor, and dev/runtime dependencies)
- Out-of-range updates (receive only lockfile updates, if desired)
- Security updates (receive only security updates, if desired)
Bumps scrapy from 1.7.4 to 1.8.0.