If you use HttpAuthMiddleware (i.e. the http_user and http_pass spider attributes) for HTTP authentication, any request exposes your credentials to the request target.
To prevent exposing those credentials to unintended domains, you must now also set a new spider attribute, http_auth_domain, and point it to the specific domain to which the authentication credentials should be sent.
If the http_auth_domain spider attribute is not set, the domain of the first request will be considered the HTTP authentication target, and authentication credentials will only be sent in requests targeting that domain.
If you need to send the same HTTP authentication credentials to multiple domains, you can use w3lib.http.basic_auth_header instead to set the value of the Authorization header of your requests.
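As a sketch of the multi-domain approach: the Authorization value is a standard RFC 7617 Basic credential, which w3lib.http.basic_auth_header computes for you. The stdlib snippet below builds an equivalent value so you can see what gets sent (the username and password are placeholders, and w3lib's helper may return bytes rather than str):

```python
import base64

def build_basic_auth(username: str, password: str) -> str:
    # RFC 7617 Basic credential: base64("user:pass") prefixed with "Basic ".
    # Equivalent to what w3lib.http.basic_auth_header produces.
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"

# Attach the header explicitly to each request you build, so it is sent
# regardless of the target domain:
headers = {"Authorization": build_basic_auth("user", "s3cret")}
```

You would then pass these headers on every Request, which is why this approach works across multiple domains while http_auth_domain restricts credentials to one.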
If you really want your spider to send the same HTTP authentication credentials to any domain, set the http_auth_domain spider attribute to None.
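Putting the attributes together, here is a minimal sketch (a plain class standing in for a real scrapy.Spider subclass so it is self-contained; the name, URLs, credentials, and domain are illustrative placeholders):

```python
class IntranetSpider:
    # In a real project this would subclass scrapy.Spider; the values
    # below are placeholders, not real credentials.
    name = "intranet"
    start_urls = ["https://intranet.example.com/"]

    # Picked up by HttpAuthMiddleware for HTTP authentication:
    http_user = "bob"
    http_pass = "s3cret"
    # New attribute: credentials are only sent to this domain.
    # Set it to None (deliberately) to send credentials to any domain.
    http_auth_domain = "intranet.example.com"
```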
Finally, if you use scrapy-splash, note that this version of Scrapy breaks compatibility with scrapy-splash 0.7.2 and earlier. You will need to upgrade scrapy-splash to a later version for it to continue to work.
1.7.4
Revert the fix for #3804 (#3819), which has a few undesired side effects (#3897, #3976).
1.7.3
Enforce lxml 4.3.5 or lower for Python 3.4 (#3912, #3918).
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
You can disable automated security fix PRs for this repo from the [Security Alerts page](https://github.com/iAnanich/gismeteo-news-scraping/network/alerts).
Bumps scrapy from 1.4.0 to 1.8.1.
Release notes
Sourced from scrapy's releases.
... (truncated)
Changelog
Sourced from scrapy's changelog.
... (truncated)
Commits
- 283e90e Bump version: 1.8.0 → 1.8.1
- 99ac4db Cover 1.8.1 in the release notes
- 1635134 Small documentation fixes.
- b01d69a Add http_auth_domain to HttpAuthMiddleware.
- 4183925 Travis CI → GitHub Actions
- be2e910 Bump version: 1.7.0 → 1.8.0
- 94f060f Cover Scrapy 1.8.0 in the release notes (#3952)
- 18b808b Merge pull request #4092 from further-reading/master
- 93e3dc1 [test_downloadermiddleware_httpcache.py] Cleaning text
- b73d217 [test_downloadermiddleware_httpcache.py] Fixing pytest mark behaviour