Bumps scrapy from 1.5.1 to 1.6.0.

Release notes
*Sourced from [scrapy's releases](https://github.com/scrapy/scrapy/releases).*
> ## 1.6.0
> Highlights:
>
> - Better Windows support
> - Python 3.7 compatibility
> - Big documentation improvements, including a switch from .extract_first() + .extract() API to .get() + .getall() API
> - Feed exports, FilePipeline and MediaPipeline improvements
> - Better extensibility: item_error and request_reached_downloader signals; from_crawler support for feed exporters, feed storages and dupefilters.
> - scrapy.contracts fixes and new features
> - Telnet console security improvements, first released as a backport in Scrapy 1.5.2 (2019-01-22)
> - Clean-up of the deprecated code
> - Various bug fixes, small new features and usability improvements across the codebase.
>
> [Full changelog is in the docs](https://docs.scrapy.org/en/latest/news.html#scrapy-1-6-0-2019-01-30).
Changelog
*Sourced from [scrapy's changelog](https://github.com/scrapy/scrapy/blob/master/docs/news.rst).*
> Scrapy 1.6.0 (2019-01-30)
> =========================
>
> Highlights:
>
> - better Windows support;
> - Python 3.7 compatibility;
> - big documentation improvements, including a switch from `.extract_first()` + `.extract()` API to `.get()` + `.getall()` API;
> - feed exports, FilePipeline and MediaPipeline improvements;
> - better extensibility: `item_error` and `request_reached_downloader` signals; `from_crawler` support for feed exporters, feed storages and dupefilters;
> - `scrapy.contracts` fixes and new features;
> - telnet console security improvements, first released as a backport in Scrapy 1.5.2;
> - clean-up of the deprecated code;
> - various bug fixes, small new features and usability improvements across the codebase.
>
> Selector API changes
> --------------------
>
> While these are not changes in Scrapy itself but rather in the [parsel](https://github.com/scrapy/parsel) library, which Scrapy uses for XPath/CSS selectors, they are worth mentioning here. Scrapy now depends on parsel >= 1.5, and the Scrapy documentation is updated to follow recent `parsel` API conventions.
>
> The most visible change is that the `.get()` and `.getall()` selector methods are now preferred over `.extract_first()` and `.extract()`. We feel that these new methods result in more concise and readable code. See old-extraction-api for more details.
>
> Note: there are currently **no plans** to deprecate the `.extract()` and `.extract_first()` methods.
>
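> For illustration, here is a minimal snippet contrasting the two equivalent extraction styles (shown with `parsel` directly; the same methods are available on the results of Scrapy's `response.css()` and `response.xpath()`):
>
> ```python
> from parsel import Selector
>
> sel = Selector(text="<ul><li>one</li><li>two</li></ul>")
>
> # Old style (still supported, no deprecation planned):
> sel.css("li::text").extract_first()  # 'one'
> sel.css("li::text").extract()        # ['one', 'two']
>
> # New style (equivalent results):
> sel.css("li::text").get()            # 'one'
> sel.css("li::text").getall()         # ['one', 'two']
> ```
>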
> Another useful new feature is the introduction of `Selector.attrib` and `SelectorList.attrib` properties, which make it easier to get attributes of HTML elements. See selecting-attributes.
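>
> A short sketch of the new `attrib` property alongside the equivalent `::attr()` query:
>
> ```python
> from parsel import Selector
>
> sel = Selector(text='<a href="https://example.com" rel="nofollow">link</a>')
>
> sel.css("a").attrib["href"]     # 'https://example.com' (attributes of the first match)
> sel.css("a::attr(href)").get()  # the equivalent attribute query
> ```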
>
> CSS selectors are cached in parsel >= 1.5, which makes them faster when the same CSS path is used many times. This is very common in the case of Scrapy spiders: callbacks are usually called several times, on different pages.
>
> If you're using custom `Selector` or `SelectorList` subclasses, a **backward incompatible** change in parsel may affect your code. See [parsel changelog](https://parsel.readthedocs.io/en/latest/history.html) for a detailed description, as well as for the full list of improvements.
>
> Telnet console
> --------------
>
> **Backward incompatible**: Scrapy's telnet console now requires a username and password. See topics-telnetconsole for more details. This change fixes a **security issue**; see the Scrapy 1.5.2 release notes for details.
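>
> As a sketch of the new requirement, credentials are configured via settings (setting names as documented for Scrapy 1.6; if no password is set, Scrapy generates a random one and prints it in the log):
>
> ```python
> # settings.py
> TELNETCONSOLE_USERNAME = "scrapy"     # the default username
> TELNETCONSOLE_PASSWORD = "my-secret"  # example value; auto-generated if unset
> ```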
>
> New extensibility features
> --------------------------
>
> - `from_crawler` support is added to feed exporters and feed storages. This, among other things, allows custom feed storages and exporters to access Scrapy settings (1605, 3348).
> - `from_crawler` support is added to dupefilters (2956); this allows a dupefilter to access e.g. settings or the spider (see the sketch after this list).
> - `item_error` is fired when an error happens in a pipeline (3256);
> - `request_reached_downloader` is fired when the Downloader gets a new Request; this signal can be useful e.g. for custom Schedulers (3393).
> - a new `SitemapSpider.sitemap_filter` method allows selecting sitemap entries based on their attributes in `SitemapSpider` subclasses (3512).
> - Lazy loading of Downloader Handlers is now optional; this enables better initialization error handling in custom Downloader Handlers (3394).
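>
> As referenced above, a minimal sketch (not from the changelog) of a dupefilter built via the new `from_crawler` hook; the subclass name and the `MY_DUPEFILTER_FLAG` setting are hypothetical:
>
> ```python
> from scrapy.dupefilters import RFPDupeFilter
>
> class SettingsAwareDupeFilter(RFPDupeFilter):
>     """Hypothetical dupefilter that reads extra options from settings."""
>
>     def __init__(self, path=None, debug=False, custom_flag=False):
>         super().__init__(path, debug)
>         self.custom_flag = custom_flag
>
>     @classmethod
>     def from_crawler(cls, crawler):
>         # New in Scrapy 1.6: dupefilters can be constructed from the
>         # crawler, which exposes settings, stats, signals, etc.
>         settings = crawler.settings
>         return cls(
>             path=settings.get("JOBDIR"),
>             debug=settings.getbool("DUPEFILTER_DEBUG"),
>             custom_flag=settings.getbool("MY_DUPEFILTER_FLAG"),  # hypothetical setting
>         )
> ```
>
> It would be enabled via `DUPEFILTER_CLASS = "myproject.dupefilters.SettingsAwareDupeFilter"` (module path assumed for illustration).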
> ... (truncated)
Commits
- [`b859435`](https://github.com/scrapy/scrapy/commit/b8594353d03be5574f51766c35566b713584302b) Bump version: 1.5.0 → 1.6.0
- [`1312174`](https://github.com/scrapy/scrapy/commit/1312174607ea6bde87008606aee505aa68cb2154) Merge pull request [#3549](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3549) from scrapy/release-notes-1.6
- [`91791cd`](https://github.com/scrapy/scrapy/commit/91791cd329936ee6ac53523460f9b72c20c66afb) DOC final changelog cleanups
- [`2c8c8b2`](https://github.com/scrapy/scrapy/commit/2c8c8b2dd8683787826713ed1d0fbfb2ec1af04a) DOC fix after bad merge - remove duplicate entries in changelog
- [`0fc9d70`](https://github.com/scrapy/scrapy/commit/0fc9d705c271f5d87174143c09f95993e5a45797) DOC mention that telnet security improvements happened in 1.5.2
- [`4cf4dd1`](https://github.com/scrapy/scrapy/commit/4cf4dd1d3e068e0df32f700c89d833cc7cd79b85) DOC add recent changes to changelog
- [`638469f`](https://github.com/scrapy/scrapy/commit/638469f9efdcc104f7b1a1c1a9890694e0d41c68) DOC extract_first/extract matches get/getall better
- [`e479f5a`](https://github.com/scrapy/scrapy/commit/e479f5aa15809e7f75a7dbc20d0629f57be46b5d) DOC update changelog
- [`7069107`](https://github.com/scrapy/scrapy/commit/706910790b6ee755bafa828606e215e668af3eee) [wip] draft 1.6 release notes
- [`b5026b8`](https://github.com/scrapy/scrapy/commit/b5026b842c8e17ac7f28b28b5fb1c72db2ca8a7f) Merge pull request [#3544](https://github-redirect.dependabot.com/scrapy/scrapy/issues/3544) from joaquingx/fix-item-pipeline-x
- Additional commits viewable in [compare view](https://github.com/scrapy/scrapy/compare/1.5.1...1.6.0)
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
Dependabot will merge this PR once CI passes on it, as requested by @nstapelbroek.
Note: This repo was added to Dependabot recently, so you'll receive a maximum of 5 PRs for your first few update runs. Once an update run creates fewer than 5 PRs we'll remove that limit.
You can always request more updates by clicking Bump now in your Dependabot dashboard.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot ignore this [patch|minor|major] version` will close this PR and stop Dependabot creating any more for this minor/major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
- `@dependabot badge me` will comment on this PR with code to add a "Dependabot enabled" badge to your readme
Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):
- Update frequency (including time of day and day of week)
- Automerge options (never/patch/minor, and dev/runtime dependencies)
- Pull request limits (per update run and/or open at any time)
- Out-of-range updates (receive only lockfile updates, if desired)
- Security updates (receive only security updates, if desired)
Finally, you can contact us by mentioning @dependabot.