eliangcs / pystock-crawler

(UNMAINTAINED) Crawl and parse financial reports (XBRL) from SEC EDGAR, and daily stock prices from Yahoo Finance
MIT License
311 stars · 100 forks

Bump scrapy from 0.24.4 to 2.11.2 #30

Open dependabot[bot] opened 6 months ago

dependabot[bot] commented 6 months ago

Bumps scrapy from 0.24.4 to 2.11.2.

Release notes

Sourced from scrapy's releases.

2.11.2

Mostly bug fixes, including security bug fixes.

See the full changelog.

2.11.1

  • Security bug fixes.
  • Support for Twisted >= 23.8.0.
  • Documentation improvements.

See the full changelog.

2.11.0

  • Spiders can now modify settings in their from_crawler methods, e.g. based on spider arguments.
  • Periodic logging of stats.
  • Bug fixes.

See the full changelog.
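
A note on the 2.11.0 item about modifying settings in from_crawler: the sketch below shows roughly how that might be used, with a hypothetical spider, a made-up `slow` argument, and an arbitrary delay value; it is illustrative only and not part of this project.

```python
import scrapy


class ThrottledSpider(scrapy.Spider):
    # Hypothetical spider; the name, argument, and setting values are
    # illustrative only.
    name = "throttled_example"

    @classmethod
    def from_crawler(cls, crawler, *args, **kwargs):
        spider = super().from_crawler(crawler, *args, **kwargs)
        # As of Scrapy 2.11.0, settings are still mutable at this point, so
        # they can be adjusted based on spider arguments (e.g. `-a slow=1`).
        if getattr(spider, "slow", None):
            crawler.settings.set("DOWNLOAD_DELAY", 2.0, priority="spider")
        return spider
```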

2.10.1

Marked Twisted >= 23.8.0 as unsupported.

2.10.0

  • Added Python 3.12 support, dropped Python 3.7 support.
  • The new add-ons framework simplifies configuring 3rd-party components that support it.
  • Exceptions to retry can now be configured.
  • Many fixes and improvements for feed exports.

See the full changelog.
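
On the 2.10.0 note that exceptions to retry can now be configured: a minimal settings sketch, assuming the RETRY_EXCEPTIONS setting that accompanied this change; the exception list shown is illustrative, not a recommendation.

```python
# settings.py (sketch): configure which exceptions trigger a retry.
# Note that defining RETRY_EXCEPTIONS replaces the default list, so only
# the exceptions named here would be retried.
RETRY_ENABLED = True
RETRY_TIMES = 3
RETRY_EXCEPTIONS = [
    "twisted.internet.error.TimeoutError",
    "twisted.internet.error.ConnectionRefusedError",
]
```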

2.9.0

  • Per-domain download settings.
  • Compatibility with new cryptography and new parsel.
  • JMESPath selectors from the new parsel.
  • Bug fixes.

See the full changelog.
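
The JMESPath selectors mentioned for 2.9.0 come from parsel, which Scrapy's selectors build on. A minimal sketch against a made-up JSON payload (the data and field names are invented for illustration):

```python
from parsel import Selector

# JSON text can be queried with JMESPath expressions via .jmespath().
sel = Selector(text='{"quote": {"symbol": "AAPL", "close": 189.84}}')
print(sel.jmespath("quote.symbol").get())  # AAPL
print(sel.jmespath("quote.close").get())   # 189.84
```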

2.8.0

This is a maintenance release, with minor features, bug fixes, and cleanups.

See the full changelog.

2.7.1

  • Relaxed the restriction introduced in 2.6.2 so that the Proxy-Authorization header can again be set explicitly in certain cases, restoring compatibility with scrapy-zyte-smartproxy 2.1.0 and older.
  • Bug fixes.

See the full changelog.

2.7.0

... (truncated)

Changelog

Sourced from scrapy's changelog.

Scrapy 2.11.2 (2024-05-14)

Security bug fixes

  • Redirects to non-HTTP protocols are no longer followed. See the 23j4-mw76-5v7h security advisory for more information: https://github.com/scrapy/scrapy/security/advisories/GHSA-23j4-mw76-5v7h (issue 457)

  • The Authorization header is now dropped on redirects to a different scheme (http:// or https://) or port, even if the domain is the same. See the 4qqq-9vqf-3h3f security advisory for more information: https://github.com/scrapy/scrapy/security/advisories/GHSA-4qqq-9vqf-3h3f

  • When using system proxy settings that are different for http:// and https://, redirects to a different URL scheme will now also trigger the corresponding change in proxy settings for the redirected request. See the jm3v-qxmh-hxwv security advisory for more information: https://github.com/scrapy/scrapy/security/advisories/GHSA-jm3v-qxmh-hxwv (issue 767)

  • Spider.allowed_domains is now enforced for all requests, and not only requests from spider callbacks. (issues 1042, 2241, 6358)

  • scrapy.utils.iterators.xmliter_lxml no longer resolves XML entities. (issue 6265)

  • defusedxml (https://github.com/tiran/defusedxml) is now used to make scrapy.http.request.rpc.XmlRpcRequest more secure. (issues 6250, 6251)
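
To illustrate the allowed_domains change above: from 2.11.2 the domain filter is enforced for every request, not only those produced by spider callbacks. A minimal sketch with a hypothetical spider; the names and URLs are illustrative only.

```python
import scrapy


class EdgarExampleSpider(scrapy.Spider):
    # Hypothetical spider, for illustration only.
    name = "edgar_example"
    # With Scrapy >= 2.11.2 this filter applies to all requests,
    # not only the ones generated by spider callbacks.
    allowed_domains = ["sec.gov"]
    start_urls = ["https://www.sec.gov/"]

    def parse(self, response):
        for href in response.css("a::attr(href)").getall():
            # Links pointing outside sec.gov are dropped by offsite filtering.
            yield response.follow(href, callback=self.parse)
```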

Bug fixes

  • Restored support for brotlipy, which had been dropped in Scrapy 2.11.1 in favor of brotli (https://github.com/google/brotli). (issue 6261) Note that brotlipy is deprecated, both in Scrapy and upstream; use brotli instead if you can.

... (truncated)

Commits

  • e8cb5a0 Bump version: 2.11.1 → 2.11.2
  • 2c031f4 Set the release date of 2.11.2
  • 3ffa17c Use posargs for pypy3-pinned
  • c6a8f0e Update VERSION references
  • 60d2577 Merge remote-tracking branch '23j4/2.11.2-release-notes' into 2.11
  • 36287cb Merge branch 'redirect-protocols' into 2.11
  • f138d5d Merge branch 'environ-proxy-protocol' into 2.11
  • 1d0502f Merge branch 'advisory-fix' into 2.11
  • bb948af Release notes for 2.11.2 (#6359)
  • 5ad9433 Merge remote-tracking branch 'scrapy/2.11' into 2.11
  • Additional commits viewable in the compare view: https://github.com/scrapy/scrapy/compare/0.24.4...2.11.2

Dependabot compatibility score: https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:

  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot show ignore conditions` will show all of the ignore conditions of the specified dependency
  • `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

You can disable automated security fix PRs for this repo from the Security Alerts page: https://github.com/eliangcs/pystock-crawler/network/alerts