ZacharyDonnelly / RawBase

Online code editor and code storage

chore(deps): update dependency werkzeug to v2.3.8 [security] #228

Open renovate[bot] opened 3 months ago

renovate[bot] commented 3 months ago

This PR contains the following updates:

| Package | Change |
|---|---|
| Werkzeug (changelog) | `==2.0.3` -> `==2.3.8` |
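
The change itself is presumably a one-line version-pin bump in the project's dependency file (`requirements.txt` is an assumption; the actual diff is not shown here):

```diff
-werkzeug==2.0.3
+werkzeug==2.3.8
```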

GitHub Vulnerability Alerts

CVE-2023-23934

Browsers may allow "nameless" cookies that look like =value instead of key=value. A vulnerable browser may allow a compromised application on an adjacent subdomain to exploit this to set a cookie like =__Host-test=bad for another subdomain.

Werkzeug <= 2.2.2 will parse the cookie =__Host-test=bad as __Host-test=bad. If a Werkzeug application is running next to a vulnerable or malicious subdomain which sets such a cookie using a vulnerable browser, the Werkzeug application will see the bad cookie value but the valid cookie key.
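
A minimal sketch of the parsing difference, using Werkzeug's own `parse_cookie` helper (the patched-version behavior is paraphrased from the advisory, not verified output):

```python
# Sketch only: demonstrates how a "nameless" cookie header is parsed.
from werkzeug.http import parse_cookie

header = "=__Host-test=bad"  # =value form, no cookie name

cookies = parse_cookie(header)
print(dict(cookies))
# Werkzeug <= 2.2.2: {'__Host-test': 'bad'} -- the bad value appears
# under the trusted __Host- prefixed key.
# Patched versions no longer mis-split the nameless cookie this way.
```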

CVE-2023-25577

Werkzeug's multipart form data parser will parse an unlimited number of parts, including file parts. Each part can be only a few bytes, but every part requires CPU time to parse and may use more memory as Python data. If a request can be made to an endpoint that accesses request.data, request.form, request.files, or request.get_data(parse_form_data=False), it can cause unexpectedly high resource usage.

This allows an attacker to cause a denial of service by sending crafted multipart data to an endpoint that will parse it. The amount of CPU time required can block worker processes from handling legitimate requests. The amount of RAM required can trigger an out of memory kill of the process. Unlimited file parts can use up memory and file handles. If many concurrent requests are sent continuously, this can exhaust or kill all available workers.
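
The patched releases add caps on form parsing; a short sketch of tightening those limits further by subclassing `Request` (the specific values shown are illustrative assumptions, not recommendations):

```python
from werkzeug.wrappers import Request

class LimitedRequest(Request):
    # Maximum number of multipart parts to parse (attribute added in
    # Werkzeug 2.2.3 alongside the fix; the default is 1000).
    max_form_parts = 500
    # Maximum bytes of non-file form data held in memory.
    max_form_memory_size = 2 * 1024 * 1024
```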

CVE-2023-46136

Werkzeug's multipart data parser needs to find a boundary that may span consecutive chunks, so parsing is based on looking for newline characters. Unfortunately, the code that looks for a partial boundary in the buffer is written inefficiently: if an uploaded file starts with CR or LF and is then followed by megabytes of data without those characters, all of those bytes are appended chunk by chunk to an internal bytearray, and the boundary lookup is performed again on the growing buffer each time.

This allows an attacker to cause a denial of service by sending crafted multipart data to an endpoint that will parse it. The amount of CPU time required can block worker processes from handling legitimate requests. The amount of RAM required can trigger an out of memory kill of the process. If many concurrent requests are sent continuously, this can exhaust or kill all available workers.
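
For intuition, a sketch of the shape of such a crafted body (the boundary and field names are made up; this only illustrates the input pattern, not a working exploit):

```python
# A multipart body whose file content starts with a lone CR and then
# contains megabytes with no further CR/LF, so the vulnerable parser
# re-scans an ever-growing buffer for the boundary on every chunk.
boundary = b"----sketchboundary"
body = (
    b"--" + boundary + b"\r\n"
    + b'Content-Disposition: form-data; name="f"; filename="f"\r\n'
    + b"Content-Type: application/octet-stream\r\n\r\n"
    + b"\r" + b"A" * (10 * 1024 * 1024)  # 10 MiB without newlines
    + b"\r\n--" + boundary + b"--\r\n"
)
```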


Configuration

📅 Schedule: Branch creation - "" (UTC), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about these updates again.



This PR was generated by Mend Renovate. View the repository job log.