Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/org/xerial/snappy/snappy-java/1.1.1.3/snappy-java-1.1.1.3.jar
*For some transitive vulnerabilities, there is no version of the direct dependency with a fix. Check the "Details" section below to see whether there is a version of the transitive dependency in which the vulnerability is fixed.
**In some cases, a remediation PR cannot be created automatically for a vulnerability despite the availability of a remediation.
Details
Partial details (19 vulnerabilities) are displayed below due to a content size limitation in GitHub. To view information on the remaining vulnerabilities, navigate to the Mend Application.
In Apache Hadoop, the unTar function uses the unTarUsingJava function on Windows and the built-in tar utility on Unix and other OSes. As a result, a TAR entry may create a symlink under the expected extraction directory which points to an external directory, and a subsequent TAR entry may extract an arbitrary file into that external directory using the symlink name. On Unix this is caught by the targetDirPath check thanks to the getCanonicalPath call; on Windows, however, getCanonicalPath does not resolve symbolic links, which bypasses the check. As a result, unpackEntries follows symbolic links during TAR extraction, allowing writes outside the expected base directory on Windows. This was addressed in Apache Hadoop 3.2.3.
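The containment check described above can be sketched as follows. This is an illustrative example, not Hadoop's actual code, and the directory names are hypothetical: each entry's canonical path must remain under the extraction directory's canonical path. On Unix, `getCanonicalPath()` resolves symlinks, so an entry routed through a symlink fails the prefix test; on Windows it does not, which is the bypass the CVE describes.

```java
import java.io.File;
import java.io.IOException;

public class Main {
    // Returns true only if the entry, resolved canonically, stays under targetDir.
    static boolean isUnderTarget(File targetDir, String entryName) throws IOException {
        String targetPath = targetDir.getCanonicalPath() + File.separator;
        return new File(targetDir, entryName).getCanonicalPath().startsWith(targetPath);
    }

    public static void main(String[] args) throws IOException {
        File target = new File("extract-dir"); // hypothetical extraction root
        System.out.println(isUnderTarget(target, "data/file.txt")); // stays inside
        System.out.println(isUnderTarget(target, "../escape.txt")); // walks out of the root
    }
}
```

Note that `getCanonicalPath()` does not require the path to exist, so the check can be applied before any bytes are written.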
Apache Hadoop's FileUtil.unTar(File, File) API does not escape the input file name before it is passed to the shell, so an attacker can inject arbitrary commands. This is only used in Hadoop 3.3's InMemoryAliasMap.completeBootstrapTransfer, which is only ever run by a local user. It has been used in Hadoop 2.x for YARN localization, which does enable remote code execution. It is used in Apache Spark via the SQL command ADD ARCHIVE. As the ADD ARCHIVE command adds new binaries to the classpath, being able to execute shell scripts does not confer new permissions on the caller. SPARK-38305 ("Check existence of file before untarring/zipping"), which is included in 3.3.0, 3.1.4, and 3.2.2, prevents shell commands from being executed regardless of which version of the Hadoop libraries is in use. Users should upgrade to Apache Hadoop 2.10.2, 3.2.4, 3.3.3 or later (including HADOOP-18136).
By design, the JDBCAppender in Log4j 1.2.x accepts an SQL statement as a configuration parameter where the values to be inserted are converters from PatternLayout. The message converter, %m, is likely to always be included. This allows attackers to manipulate the SQL by entering crafted strings into input fields or headers of an application that are logged allowing unintended SQL queries to be executed. Note this issue only affects Log4j 1.x when specifically configured to use the JDBCAppender, which is not the default. Beginning in version 2.0-beta8, the JDBCAppender was re-introduced with proper support for parameterized SQL queries and further customization over the columns written to in logs. Apache Log4j 1.2 reached end of life in August 2015. Users should upgrade to Log4j 2 as it addresses numerous other issues from the previous versions.
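The injection surface can be illustrated in miniature. This is not JDBCAppender code; `naiveSql` is a hypothetical helper that mimics pattern substitution, where the logged message text lands directly in the statement, versus a parameterized query that binds it as data.

```java
public class Main {
    // Mimics %m pattern substitution: the message is spliced into the SQL text.
    static String naiveSql(String message) {
        return "INSERT INTO logs (msg) VALUES ('" + message + "')";
    }

    public static void main(String[] args) {
        String hostile = "x'); DROP TABLE logs; --";
        // The crafted message rewrites the statement instead of being stored as data.
        System.out.println(naiveSql(hostile));
        // With a parameterized statement (as in the Log4j 2 JDBCAppender rework),
        // the same text is inert:
        //   PreparedStatement ps = conn.prepareStatement(
        //       "INSERT INTO logs (msg) VALUES (?)");
        //   ps.setString(1, message);
    }
}
```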
Included in Log4j 1.2 is a SocketServer class that is vulnerable to deserialization of untrusted data which can be exploited to remotely execute arbitrary code when combined with a deserialization gadget when listening to untrusted network traffic for log data. This affects Log4j versions 1.2 up to 1.2.17.
Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.
HttpObjectDecoder.java in Netty before 4.1.44 allows a Content-Length header to be accompanied by a second Content-Length header, or by a Transfer-Encoding header.
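A request with conflicting length framing is ambiguous and should be rejected outright to avoid request smuggling. The check below is an illustrative sketch, not Netty's HttpObjectDecoder; it assumes header names are already lower-cased.

```java
import java.util.List;
import java.util.Map;

public class Main {
    // True if message framing is ambiguous: two Content-Length values,
    // or Content-Length alongside Transfer-Encoding.
    static boolean ambiguousFraming(Map<String, List<String>> headers) {
        List<String> cl = headers.getOrDefault("content-length", List.of());
        boolean hasTe = headers.containsKey("transfer-encoding");
        return cl.size() > 1 || (!cl.isEmpty() && hasTe);
    }

    public static void main(String[] args) {
        System.out.println(ambiguousFraming(Map.of("content-length", List.of("5"))));
        System.out.println(ambiguousFraming(Map.of("content-length", List.of("5", "6"))));
        System.out.println(ambiguousFraming(Map.of(
                "content-length", List.of("5"),
                "transfer-encoding", List.of("chunked"))));
    }
}
```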
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.
CVE-2019-20444
### Vulnerable Libraries - netty-all-4.0.23.Final.jar, netty-3.6.2.Final.jar
### netty-all-4.0.23.Final.jar
Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.
### netty-3.6.2.Final.jar
The Netty project is an effort to provide an asynchronous event-driven network application framework and tools for rapid development of maintainable high performance and high scalability protocol servers and clients. In other words, Netty is an NIO client-server framework which enables quick and easy development of network applications such as protocol servers and clients. It greatly simplifies and streamlines network programming such as TCP and UDP socket servers.
HttpObjectDecoder.java in Netty before 4.1.44 allows an HTTP header that lacks a colon, which might be interpreted as a separate header with an incorrect syntax, or might be interpreted as an "invalid fold."
CVE-2020-9493 identified a deserialization issue present in Apache Chainsaw. Prior to Chainsaw V2.0, Chainsaw was a component of Apache Log4j 1.2.x, where the same issue exists.
JMSSink in all versions of Log4j 1.x is vulnerable to deserialization of untrusted data when the attacker has write access to the Log4j configuration or if the configuration references an LDAP service the attacker has access to. The attacker can provide a TopicConnectionFactoryBindingName configuration causing JMSSink to perform JNDI requests that result in remote code execution in a similar fashion to CVE-2021-4104. Note this issue only affects Log4j 1.x when specifically configured to use JMSSink, which is not the default. Apache Log4j 1.2 reached end of life in August 2015. Users should upgrade to Log4j 2 as it addresses numerous other issues from the previous versions.
In Apache Hadoop 3.2.0 to 3.2.1, 3.0.0-alpha1 to 3.1.3, and 2.0.0-alpha to 2.10.0, WebHDFS client might send SPNEGO authorization header to remote URL without proper verification.
In Apache Hadoop versions 3.0.0-alpha1 to 3.1.0, 2.9.0 to 2.9.1, and 2.2.0 to 2.8.4, a user who can escalate to yarn user can possibly run arbitrary commands as root user.
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.
CVE-2023-43642
### Vulnerable Library - snappy-java-1.1.1.3.jar
snappy-java: A fast compression/decompression library
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/org/xerial/snappy/snappy-java/1.1.1.3/snappy-java-1.1.1.3.jar
snappy-java is a Java port of Snappy, a fast C++ compressor/decompressor developed by Google. SnappyInputStream was found to be vulnerable to Denial of Service (DoS) attacks when decompressing data with an overly large chunk size. Due to a missing upper-bound check on the chunk length, an unrecoverable fatal error can occur. All versions of snappy-java up to and including the latest released version, 1.1.10.3, are vulnerable to this issue. A fix has been introduced in commit `9f8c3cf74`, which will be included in the 1.1.10.4 release. Users are advised to upgrade. Users unable to upgrade should only accept compressed data from trusted sources.
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.
CVE-2023-34455
### Vulnerable Library - snappy-java-1.1.1.3.jar
snappy-java: A fast compression/decompression library
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/org/xerial/snappy/snappy-java/1.1.1.3/snappy-java-1.1.1.3.jar
snappy-java is a fast compressor/decompressor for Java. Due to use of an unchecked chunk length, an unrecoverable fatal error can occur in versions prior to 1.1.10.1.
The function hasNextChunk in the file SnappyInputStream.java checks whether a given stream has more chunks to read. It does so by attempting to read 4 bytes. If it was not possible to read 4 bytes, the function returns false; otherwise, the code treats those bytes as the length of the next chunk.
If the `compressed` variable is null, a byte array is allocated with the size given by the input data. Since the code does not validate the `chunkSize` variable, it is possible to pass a negative number (such as 0xFFFFFFFF, which is -1), causing the code to raise a `java.lang.NegativeArraySizeException`. A worse case occurs when passing a huge positive value (such as 0x7FFFFFFF), which raises the fatal `java.lang.OutOfMemoryError`.
Version 1.1.10.1 contains a patch for this issue.
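The failure mode described above can be reproduced in miniature. This is a sketch of the vulnerable pattern, not snappy-java's actual source: four untrusted bytes become an array size with no bounds check, so an all-0xFF length reads as -1 and throws immediately.

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class Main {
    // Reads a 4-byte big-endian length and allocates a buffer of that size,
    // with no upper or lower bound check -- the essence of the bug.
    static byte[] readChunkUnchecked(DataInputStream in) throws IOException {
        int chunkSize = in.readInt(); // attacker-controlled
        return new byte[chunkSize];   // -1 throws; 0x7FFFFFFF attempts a ~2 GB allocation
    }

    public static void main(String[] args) throws IOException {
        byte[] header = {(byte) 0xFF, (byte) 0xFF, (byte) 0xFF, (byte) 0xFF};
        try {
            readChunkUnchecked(new DataInputStream(new ByteArrayInputStream(header)));
        } catch (NegativeArraySizeException e) {
            System.out.println("caught NegativeArraySizeException");
        }
    }
}
```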
** UNSUPPORTED WHEN ASSIGNED **
When using the Chainsaw or SocketAppender components with Log4j 1.x on a JRE below 1.7, an attacker who manages to cause a logging entry involving a specially crafted (i.e., deeply nested) hashmap or hashtable (depending on which logging component is in use) to be processed could exhaust the available memory in the virtual machine and achieve Denial of Service when the object is deserialized.
This issue affects Apache Log4j before 2. Affected users are recommended to update to Log4j 2.x.
NOTE: This vulnerability only affects products that are no longer supported by the maintainer.
A parsing issue similar to CVE-2022-3171, but with textformat in protobuf-java core and lite versions prior to 3.21.7, 3.20.3, 3.19.6, and 3.16.3, can lead to a denial of service attack. Inputs containing multiple instances of non-repeated embedded messages with repeated or unknown fields cause objects to be converted back and forth between mutable and immutable forms, resulting in potentially long garbage collection pauses. We recommend updating to the versions mentioned above.
JMSAppender in Log4j 1.2 is vulnerable to deserialization of untrusted data when the attacker has write access to the Log4j configuration. The attacker can provide TopicBindingName and TopicConnectionFactoryBindingName configurations causing JMSAppender to perform JNDI requests that result in remote code execution in a similar fashion to CVE-2021-44228. Note this issue only affects Log4j 1.2 when specifically configured to use JMSAppender, which is not the default. Apache Log4j 1.2 reached end of life in August 2015. Users should upgrade to Log4j 2 as it addresses numerous other issues from the previous versions.
Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.
The Snappy frame decoder function does not restrict the chunk length, which may lead to excessive memory usage. Besides this, it may also buffer reserved skippable chunks until the whole chunk has been received, which may likewise lead to excessive memory usage. This vulnerability can be triggered by supplying malicious input that decompresses to a very large size (via a network stream or a file) or by sending a huge skippable chunk.
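The usual mitigation for this class of bug is to validate the declared length against an upper bound before buffering or allocating anything. A sketch, with an illustrative 64 KiB cap that is an assumption here, not Netty's actual limit:

```java
public class Main {
    // Illustrative cap on a declared chunk length; real decoders derive this
    // from the framing format's maximum chunk size.
    static final int MAX_CHUNK = 64 * 1024;

    // Rejects negative or oversized lengths before any buffer is allocated.
    static int checkedChunkLength(int declared) {
        if (declared < 0 || declared > MAX_CHUNK) {
            throw new IllegalArgumentException("chunk length out of range: " + declared);
        }
        return declared;
    }

    public static void main(String[] args) {
        System.out.println(checkedChunkLength(1024));
        try {
            checkedChunkLength(Integer.MAX_VALUE);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```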
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.
CVE-2021-37136
### Vulnerable Library - netty-all-4.0.23.Final.jar
Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.
The Bzip2 decompression decoder function does not allow setting size restrictions on the decompressed output data (which affects the allocation size used during decompression). All users of Bzip2Decoder are affected. Malicious input can trigger an OutOfMemoryError and thus a DoS attack.
An issue in protobuf-java allowed the interleaving of com.google.protobuf.UnknownFieldSet fields in such a way that they would be processed out of order. A small malicious payload can occupy the parser for several minutes by creating large numbers of short-lived objects that cause frequent, repeated pauses. We recommend upgrading libraries beyond the vulnerable versions.
Vulnerable Library - hadoop-client-2.7.3.jar
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/org/xerial/snappy/snappy-java/1.1.1.3/snappy-java-1.1.1.3.jar
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Vulnerabilities
*For some transitive vulnerabilities, there is no version of the direct dependency with a fix. Check the "Details" section below to see whether there is a version of the transitive dependency in which the vulnerability is fixed.
**In some cases, a remediation PR cannot be created automatically for a vulnerability despite the availability of a remediation.
Details
CVE-2022-26612
### Vulnerable Library - hadoop-common-2.7.3.jar
Apache Hadoop Common
Library home page: http://www.apache.org
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/hadoop/hadoop-common/2.7.3/hadoop-common-2.7.3.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - :x: **hadoop-common-2.7.3.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details
In Apache Hadoop, the unTar function uses the unTarUsingJava function on Windows and the built-in tar utility on Unix and other OSes. As a result, a TAR entry may create a symlink under the expected extraction directory which points to an external directory, and a subsequent TAR entry may extract an arbitrary file into that external directory using the symlink name. On Unix this is caught by the targetDirPath check thanks to the getCanonicalPath call; on Windows, however, getCanonicalPath does not resolve symbolic links, which bypasses the check. As a result, unpackEntries follows symbolic links during TAR extraction, allowing writes outside the expected base directory on Windows. This was addressed in Apache Hadoop 3.2.3.
Publish Date: 2022-04-07
URL: CVE-2022-26612
### CVSS 3 Score Details (9.8)
Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: High
For more information on CVSS3 Scores, click here.
### Suggested Fix
Type: Upgrade version
Origin: https://nvd.nist.gov/vuln/detail/CVE-2022-26612
Release Date: 2022-04-07
Fix Resolution: org.apache.hadoop:hadoop-common:3.2.3
CVE-2022-25168
### Vulnerable Library - hadoop-common-2.7.3.jar
Apache Hadoop Common
Library home page: http://www.apache.org
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/hadoop/hadoop-common/2.7.3/hadoop-common-2.7.3.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - :x: **hadoop-common-2.7.3.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details
Apache Hadoop's FileUtil.unTar(File, File) API does not escape the input file name before it is passed to the shell, so an attacker can inject arbitrary commands. This is only used in Hadoop 3.3's InMemoryAliasMap.completeBootstrapTransfer, which is only ever run by a local user. It has been used in Hadoop 2.x for YARN localization, which does enable remote code execution. It is used in Apache Spark via the SQL command ADD ARCHIVE. As the ADD ARCHIVE command adds new binaries to the classpath, being able to execute shell scripts does not confer new permissions on the caller. SPARK-38305 ("Check existence of file before untarring/zipping"), which is included in 3.3.0, 3.1.4, and 3.2.2, prevents shell commands from being executed regardless of which version of the Hadoop libraries is in use. Users should upgrade to Apache Hadoop 2.10.2, 3.2.4, 3.3.3 or later (including HADOOP-18136).
Publish Date: 2022-08-04
URL: CVE-2022-25168
### CVSS 3 Score Details (9.8)
Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: High
For more information on CVSS3 Scores, click here.
### Suggested Fix
Type: Upgrade version
Origin: https://lists.apache.org/thread/mxqnb39jfrwgs3j6phwvlrfq4mlox130
Release Date: 2022-08-04
Fix Resolution (org.apache.hadoop:hadoop-common): 2.10.2
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.
CVE-2022-23305
### Vulnerable Library - log4j-1.2.17.jar
Apache Log4j 1.2
Library home page: http://www.apache.org
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-common-2.7.3.jar
    - :x: **log4j-1.2.17.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details
By design, the JDBCAppender in Log4j 1.2.x accepts an SQL statement as a configuration parameter where the values to be inserted are converters from PatternLayout. The message converter, %m, is likely to always be included. This allows attackers to manipulate the SQL by entering crafted strings into input fields or headers of an application that are logged allowing unintended SQL queries to be executed. Note this issue only affects Log4j 1.x when specifically configured to use the JDBCAppender, which is not the default. Beginning in version 2.0-beta8, the JDBCAppender was re-introduced with proper support for parameterized SQL queries and further customization over the columns written to in logs. Apache Log4j 1.2 reached end of life in August 2015. Users should upgrade to Log4j 2 as it addresses numerous other issues from the previous versions.
Publish Date: 2022-01-18
URL: CVE-2022-23305
### CVSS 3 Score Details (9.8)
Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: High
For more information on CVSS3 Scores, click here.
### Suggested Fix
Type: Upgrade version
Origin: https://reload4j.qos.ch/
Release Date: 2022-01-18
Fix Resolution: ch.qos.reload4j:reload4j:1.2.18.2
CVE-2020-9493
### Vulnerable Library - log4j-1.2.17.jar
Apache Log4j 1.2
Library home page: http://www.apache.org
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-common-2.7.3.jar
    - :x: **log4j-1.2.17.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details
A deserialization flaw was found in Apache Chainsaw versions prior to 2.1.0 which could lead to malicious code execution.
Publish Date: 2021-06-16
URL: CVE-2020-9493
### CVSS 3 Score Details (9.8)
Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: High
For more information on CVSS3 Scores, click here.
### Suggested Fix
Type: Upgrade version
Origin: https://www.openwall.com/lists/oss-security/2021/06/16/1
Release Date: 2021-06-16
Fix Resolution: ch.qos.reload4j:reload4j:1.2.18.1
CVE-2019-17571
### Vulnerable Library - log4j-1.2.17.jar
Apache Log4j 1.2
Library home page: http://www.apache.org
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-common-2.7.3.jar
    - :x: **log4j-1.2.17.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details
Included in Log4j 1.2 is a SocketServer class that is vulnerable to deserialization of untrusted data which can be exploited to remotely execute arbitrary code when combined with a deserialization gadget when listening to untrusted network traffic for log data. This affects Log4j versions 1.2 up to 1.2.17.
Publish Date: 2019-12-20
URL: CVE-2019-17571
### CVSS 3 Score Details (9.8)
Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: High
For more information on CVSS3 Scores, click here.
### Suggested Fix
Type: Upgrade version
Origin: https://lists.apache.org/thread.html/eea03d504b36e8f870e8321d908e1def1addda16adda04327fe7c125%40%3Cdev.logging.apache.org%3E
Release Date: 2019-12-20
Fix Resolution: log4j-manual - 1.2.17-16; log4j-javadoc - 1.2.17-16; log4j - 1.2.17-16
CVE-2019-20445
### Vulnerable Library - netty-all-4.0.23.Final.jar
Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.
Library home page: http://netty.io/
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-all/4.0.23.Final/netty-all-4.0.23.Final.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-hdfs-2.7.3.jar
    - :x: **netty-all-4.0.23.Final.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details
HttpObjectDecoder.java in Netty before 4.1.44 allows a Content-Length header to be accompanied by a second Content-Length header, or by a Transfer-Encoding header.
Publish Date: 2020-01-29
URL: CVE-2019-20445
### CVSS 3 Score Details (9.1)
Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: None
For more information on CVSS3 Scores, click here.
### Suggested Fix
Type: Upgrade version
Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-20445
Release Date: 2020-01-29
Fix Resolution (io.netty:netty-all): 4.1.44.Final
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.
CVE-2019-20444
### Vulnerable Libraries - netty-all-4.0.23.Final.jar, netty-3.6.2.Final.jar
### netty-all-4.0.23.Final.jar
Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.
Library home page: http://netty.io/
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-all/4.0.23.Final/netty-all-4.0.23.Final.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-hdfs-2.7.3.jar
    - :x: **netty-all-4.0.23.Final.jar** (Vulnerable Library)
### netty-3.6.2.Final.jar
The Netty project is an effort to provide an asynchronous event-driven network application framework and tools for rapid development of maintainable high performance and high scalability protocol servers and clients. In other words, Netty is a NIO client server framework which enables quick and easy development of network applications such as protocol servers and clients. It greatly simplifies and streamlines network programming such as TCP and UDP socket server.
Library home page: http://netty.io/
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty/3.6.2.Final/netty-3.6.2.Final.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-common-2.7.3.jar
    - hadoop-auth-2.7.3.jar
      - zookeeper-3.4.6.jar
        - :x: **netty-3.6.2.Final.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details
HttpObjectDecoder.java in Netty before 4.1.44 allows an HTTP header that lacks a colon, which might be interpreted as a separate header with an incorrect syntax, or might be interpreted as an "invalid fold."
Publish Date: 2020-01-29
URL: CVE-2019-20444
### CVSS 3 Score Details (9.1)
Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: None
For more information on CVSS3 Scores, click here.
### Suggested Fix
Type: Upgrade version
Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-20444
Release Date: 2020-01-29
Fix Resolution (io.netty:netty-all): 4.1.44.Final
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
Fix Resolution (io.netty:netty): 4.1.44.Final
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.
CVE-2022-23307
### Vulnerable Library - log4j-1.2.17.jar
Apache Log4j 1.2
Library home page: http://www.apache.org
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-common-2.7.3.jar
    - :x: **log4j-1.2.17.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details
CVE-2020-9493 identified a deserialization issue present in Apache Chainsaw. Prior to Chainsaw V2.0, Chainsaw was a component of Apache Log4j 1.2.x, where the same issue exists.
Publish Date: 2022-01-18
URL: CVE-2022-23307
### CVSS 3 Score Details (8.8)
Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: Low
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: High
For more information on CVSS3 Scores, click here.
### Suggested Fix
Type: Upgrade version
Release Date: 2022-01-18
Fix Resolution: ch.qos.reload4j:reload4j:1.2.18.1
CVE-2022-23302
### Vulnerable Library - log4j-1.2.17.jar
Apache Log4j 1.2
Library home page: http://www.apache.org
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-common-2.7.3.jar
    - :x: **log4j-1.2.17.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details
JMSSink in all versions of Log4j 1.x is vulnerable to deserialization of untrusted data when the attacker has write access to the Log4j configuration or if the configuration references an LDAP service the attacker has access to. The attacker can provide a TopicConnectionFactoryBindingName configuration causing JMSSink to perform JNDI requests that result in remote code execution in a similar fashion to CVE-2021-4104. Note this issue only affects Log4j 1.x when specifically configured to use JMSSink, which is not the default. Apache Log4j 1.2 reached end of life in August 2015. Users should upgrade to Log4j 2 as it addresses numerous other issues from the previous versions.
Publish Date: 2022-01-18
URL: CVE-2022-23302
### CVSS 3 Score Details (8.8)
Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: Low
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: High
For more information on CVSS3 Scores, click here.
### Suggested Fix
Type: Upgrade version
Origin: https://reload4j.qos.ch/
Release Date: 2022-01-18
Fix Resolution: ch.qos.reload4j:reload4j:1.2.18.1
CVE-2020-9492
### Vulnerable Library - hadoop-hdfs-2.7.3.jar
Apache Hadoop HDFS
Library home page: http://www.apache.org
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.7.3/hadoop-hdfs-2.7.3.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - :x: **hadoop-hdfs-2.7.3.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details
In Apache Hadoop 3.2.0 to 3.2.1, 3.0.0-alpha1 to 3.1.3, and 2.0.0-alpha to 2.10.0, WebHDFS client might send SPNEGO authorization header to remote URL without proper verification.
Publish Date: 2021-01-26
URL: CVE-2020-9492
### CVSS 3 Score Details (8.8)
Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: Low
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: High
For more information on CVSS3 Scores, click here.
### Suggested Fix
Type: Upgrade version
Origin: https://lists.apache.org/thread.html/rca4516b00b55b347905df45e5d0432186248223f30497db87aba8710@%3Cannounce.apache.org%3E
Release Date: 2024-09-03
Fix Resolution (org.apache.hadoop:hadoop-hdfs): 2.10.1
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.
CVE-2018-8029
### Vulnerable Library - hadoop-common-2.7.3.jar
Apache Hadoop Common
Library home page: http://www.apache.org
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/hadoop/hadoop-common/2.7.3/hadoop-common-2.7.3.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - :x: **hadoop-common-2.7.3.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details

In Apache Hadoop versions 3.0.0-alpha1 to 3.1.0, 2.9.0 to 2.9.1, and 2.2.0 to 2.8.4, a user who can escalate to the yarn user can possibly run arbitrary commands as the root user.
Publish Date: 2019-05-30
URL: CVE-2018-8029
### CVSS 3 Score Details (8.8)

Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: Low
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: High

For more information on CVSS3 Scores, click here.

### Suggested Fix

Type: Upgrade version
Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-8029
Release Date: 2019-05-30
Fix Resolution (org.apache.hadoop:hadoop-common): 2.8.5
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.

### CVE-2023-43642
### Vulnerable Library - snappy-java-1.1.1.3.jar

snappy-java: A fast compression/decompression library
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/org/xerial/snappy/snappy-java/1.1.1.3/snappy-java-1.1.1.3.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-common-2.7.3.jar
    - avro-1.8.1.jar
      - :x: **snappy-java-1.1.1.3.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details

snappy-java is a Java port of Snappy, a fast C++ compressor/decompressor developed by Google. The SnappyInputStream was found to be vulnerable to Denial of Service (DoS) attacks when decompressing data with a too-large chunk size. Due to a missing upper-bound check on the chunk length, an unrecoverable fatal error can occur. All versions of snappy-java up to and including 1.1.10.3 are vulnerable to this issue. A fix has been introduced in commit `9f8c3cf74` and is included in the 1.1.10.4 release. Users are advised to upgrade. Users unable to upgrade should only accept compressed data from trusted sources.
Publish Date: 2023-09-25
URL: CVE-2023-43642
### CVSS 3 Score Details (7.5)

Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: None
  - Integrity Impact: None
  - Availability Impact: High

For more information on CVSS3 Scores, click here.

### Suggested Fix

Type: Upgrade version
Origin: https://github.com/xerial/snappy-java/security/advisories/GHSA-55g7-9cwv-5qfv
Release Date: 2023-09-25
Fix Resolution (org.xerial.snappy:snappy-java): 1.1.10.4
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
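Where upgrading the `hadoop-client` root dependency is not immediately feasible, one common workaround is to pin the transitive snappy-java version in the affected module's `pom.xml`. This is a sketch only: whether `dependencyManagement` is the right place in this multi-module build, and whether NiFi's Hadoop bundle tolerates the newer snappy-java at runtime, are assumptions that need verifying.

```xml
<!-- Hypothetical fragment: forces the transitive snappy-java pulled in via
     hadoop-client -> hadoop-common -> avro up to the fixed release. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.xerial.snappy</groupId>
      <artifactId>snappy-java</artifactId>
      <version>1.1.10.4</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

After adding the override, `mvn dependency:tree -Dincludes=org.xerial.snappy:snappy-java` can confirm which version actually resolves.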
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.

### CVE-2023-34455
### Vulnerable Library - snappy-java-1.1.1.3.jar

snappy-java: A fast compression/decompression library
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/org/xerial/snappy/snappy-java/1.1.1.3/snappy-java-1.1.1.3.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-common-2.7.3.jar
    - avro-1.8.1.jar
      - :x: **snappy-java-1.1.1.3.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details

snappy-java is a fast compressor/decompressor for Java. Due to use of an unchecked chunk length, an unrecoverable fatal error can occur in versions prior to 1.1.10.1. The `hasNextChunk` function in the file `SnappyInputStream.java` checks whether a given stream has more chunks to read by attempting to read 4 bytes. If the 4 bytes cannot be read, the function returns false. Otherwise, the code treats them as the length of the next chunk. When the `compressed` variable is null, a byte array is allocated with the size given by the input data. Since the code does not validate the `chunkSize` variable, an attacker can pass a negative number (such as 0xFFFFFFFF, which is -1), causing the code to raise a `java.lang.NegativeArraySizeException`. Worse, passing a huge positive value (such as 0x7FFFFFFF) raises the fatal `java.lang.OutOfMemoryError`. Version 1.1.10.1 contains a patch for this issue.
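The failure mode described above is the classic "trust a length field from the wire" bug. The sketch below is not the actual snappy-java source; `ChunkLengthCheck` and `MAX_CHUNK_SIZE` are hypothetical names chosen for illustration of the unchecked allocation and the kind of bounds check the fixed releases add.

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class ChunkLengthCheck {
    // Hypothetical upper bound; the real limit is format- and library-specific.
    static final int MAX_CHUNK_SIZE = 512 * 1024 * 1024;

    // Unsafe: trusts the 4-byte length field. Bytes 0xFFFFFFFF decode to -1,
    // so the allocation throws NegativeArraySizeException; 0x7FFFFFFF would
    // attempt a ~2 GB allocation and can raise OutOfMemoryError.
    static byte[] readChunkUnsafe(DataInputStream in) throws IOException {
        int chunkSize = in.readInt();
        return new byte[chunkSize];
    }

    // Safe: rejects negative and oversized lengths before allocating.
    static byte[] readChunkSafe(DataInputStream in) throws IOException {
        int chunkSize = in.readInt();
        if (chunkSize < 0 || chunkSize > MAX_CHUNK_SIZE) {
            throw new IOException("invalid chunk size: " + chunkSize);
        }
        return new byte[chunkSize];
    }

    public static void main(String[] args) throws IOException {
        byte[] malicious = {(byte) 0xFF, (byte) 0xFF, (byte) 0xFF, (byte) 0xFF};
        try {
            readChunkUnsafe(new DataInputStream(new ByteArrayInputStream(malicious)));
        } catch (NegativeArraySizeException e) {
            System.out.println("unsafe path: NegativeArraySizeException");
        }
        try {
            readChunkSafe(new DataInputStream(new ByteArrayInputStream(malicious)));
        } catch (IOException e) {
            System.out.println("safe path: rejected (" + e.getMessage() + ")");
        }
    }
}
```

The safe variant turns attacker-controlled input into a recoverable `IOException` instead of a fatal JVM error, which is essentially the shape of the 1.1.10.x hardening.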
Publish Date: 2023-06-15
URL: CVE-2023-34455
### CVSS 3 Score Details (7.5)

Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: None
  - Integrity Impact: None
  - Availability Impact: High

For more information on CVSS3 Scores, click here.

### Suggested Fix

Type: Upgrade version
Origin: https://github.com/xerial/snappy-java/security/advisories/GHSA-qcwq-55hx-v3vh
Release Date: 2023-06-15
Fix Resolution (org.xerial.snappy:snappy-java): 1.1.10.1
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.

### CVE-2023-26464
### Vulnerable Library - log4j-1.2.17.jar

Apache Log4j 1.2
Library home page: http://www.apache.org
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-common-2.7.3.jar
    - :x: **log4j-1.2.17.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details

**UNSUPPORTED WHEN ASSIGNED** When using the Chainsaw or SocketAppender components with Log4j 1.x on a JRE below 1.7, an attacker who manages to get a specially crafted (i.e., deeply nested) hashmap or hashtable (depending on which logging component is in use) into a logging entry can exhaust the available memory in the virtual machine and achieve Denial of Service when the object is deserialized. This issue affects Apache Log4j before 2. Affected users are recommended to update to Log4j 2.x. NOTE: This vulnerability only affects products that are no longer supported by the maintainer.
Publish Date: 2023-03-10
URL: CVE-2023-26464
### CVSS 3 Score Details (7.5)

Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: None
  - Integrity Impact: None
  - Availability Impact: High

For more information on CVSS3 Scores, click here.

### Suggested Fix

Type: Upgrade version
Origin: https://github.com/advisories/GHSA-vp98-w2p3-mv35
Release Date: 2023-03-10
Fix Resolution: org.apache.logging.log4j:log4j-core:2.0
### CVE-2022-3509
### Vulnerable Library - protobuf-java-2.5.0.jar

Protocol Buffers are a way of encoding structured data in an efficient yet extensible format.
Library home page: http://www.google.com/
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-common-2.7.3.jar
    - :x: **protobuf-java-2.5.0.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details

A parsing issue similar to CVE-2022-3171, but with textformat, in protobuf-java core and lite versions prior to 3.21.7, 3.20.3, 3.19.6 and 3.16.3 can lead to a denial of service attack. Inputs containing multiple instances of non-repeated embedded messages with repeated or unknown fields cause objects to be converted back and forth between mutable and immutable forms, resulting in potentially long garbage collection pauses. We recommend updating to the versions mentioned above.
Publish Date: 2022-11-01
URL: CVE-2022-3509
### CVSS 3 Score Details (7.5)

Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: None
  - Integrity Impact: None
  - Availability Impact: High

For more information on CVSS3 Scores, click here.

### Suggested Fix

Type: Upgrade version
Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-3509
Release Date: 2022-11-01
Fix Resolution (com.google.protobuf:protobuf-java): 3.16.3
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.

### CVE-2021-4104
### Vulnerable Library - log4j-1.2.17.jar

Apache Log4j 1.2
Library home page: http://www.apache.org
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-common-2.7.3.jar
    - :x: **log4j-1.2.17.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details

JMSAppender in Log4j 1.2 is vulnerable to deserialization of untrusted data when the attacker has write access to the Log4j configuration. The attacker can provide TopicBindingName and TopicConnectionFactoryBindingName configurations that cause JMSAppender to perform JNDI requests resulting in remote code execution, in a similar fashion to CVE-2021-44228. Note this issue only affects Log4j 1.2 when specifically configured to use JMSAppender, which is not the default. Apache Log4j 1.2 reached end of life in August 2015. Users should upgrade to Log4j 2, as it addresses numerous other issues from the previous versions.
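For context, the vulnerable condition only arises with an explicit JMSAppender configuration, which might look roughly like the hedged `log4j.properties` sketch below. The broker URL and binding names are placeholders; the point is that an attacker with write access to these keys can redirect the JNDI lookups that JMSAppender performs.

```properties
# Illustrative only - not a default configuration. CVE-2021-4104 requires an
# attacker to be able to write settings like these. All values below are
# placeholders.
log4j.rootLogger=INFO, jms
log4j.appender.jms=org.apache.log4j.net.JMSAppender
log4j.appender.jms.InitialContextFactoryName=org.apache.activemq.jndi.ActiveMQInitialContextFactory
log4j.appender.jms.ProviderURL=tcp://broker.example.com:61616
log4j.appender.jms.TopicBindingName=logTopic
log4j.appender.jms.TopicConnectionFactoryBindingName=ConnectionFactory
```

If `TopicBindingName` or `TopicConnectionFactoryBindingName` resolves to an attacker-controlled JNDI reference, deserializing the returned object can execute attacker code, which is why removing or never enabling JMSAppender is the usual mitigation short of migrating to Log4j 2.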
Publish Date: 2021-12-14
URL: CVE-2021-4104
### CVSS 3 Score Details (7.5)

Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: High
  - Privileges Required: Low
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: High

For more information on CVSS3 Scores, click here.

### Suggested Fix

Type: Upgrade version
Origin: https://nvd.nist.gov/vuln/detail/CVE-2021-4104
Release Date: 2021-12-14
Fix Resolution:
- log4j - 1.2.17-16
- log4j-javadoc - 1.2.17-16
- log4j-manual - 1.2.17-16
- parfait - 0.5.4-4.module
- parfait-examples - 0.5.4-4.module
- parfait-javadoc - 0.5.4-4.module
- pcp-parfait-agent - 0.5.4-4.module
- si-units - 0.6.5-2.module
- si-units-javadoc - 0.6.5-2.module
- unit-api - 1.0-5.module
- unit-api-javadoc - 1.0-5.module
- uom-lib - 1.0.1-6.module
- uom-lib-javadoc - 1.0.1-6.module
- uom-parent - 1.0.3-3.module
- uom-se - 1.0.4-3.module
- uom-se-javadoc - 1.0.4-3.module
- uom-systems - 0.7-1.module
- uom-systems-javadoc - 0.7-1.module
### CVE-2021-37137
### Vulnerable Library - netty-all-4.0.23.Final.jar

Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.
Library home page: http://netty.io/
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-all/4.0.23.Final/netty-all-4.0.23.Final.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-hdfs-2.7.3.jar
    - :x: **netty-all-4.0.23.Final.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details

The Snappy frame decoder function does not restrict the chunk length, which may lead to excessive memory usage. Besides this, it may also buffer reserved skippable chunks until the whole chunk has been received, which can likewise lead to excessive memory usage. This vulnerability can be triggered by supplying malicious input that decompresses to a very large size (via a network stream or a file) or by sending a huge skippable chunk.
Publish Date: 2021-10-19
URL: CVE-2021-37137
### CVSS 3 Score Details (7.5)

Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: None
  - Integrity Impact: None
  - Availability Impact: High

For more information on CVSS3 Scores, click here.

### Suggested Fix

Type: Upgrade version
Origin: https://github.com/advisories/GHSA-9vjp-v76f-g363
Release Date: 2021-10-19
Fix Resolution (io.netty:netty-all): 4.1.68.Final
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.

### CVE-2021-37136
### Vulnerable Library - netty-all-4.0.23.Final.jar

Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.
Library home page: http://netty.io/
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/io/netty/netty-all/4.0.23.Final/netty-all-4.0.23.Final.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-hdfs-2.7.3.jar
    - :x: **netty-all-4.0.23.Final.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details

The Bzip2 decompression decoder function does not allow setting size restrictions on the decompressed output data (which affects the allocation size used during decompression). All users of Bzip2Decoder are affected. Malicious input can trigger an OOME and thus a DoS attack.
Publish Date: 2021-10-19
URL: CVE-2021-37136
### CVSS 3 Score Details (7.5)

Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: None
  - Integrity Impact: None
  - Availability Impact: High

For more information on CVSS3 Scores, click here.

### Suggested Fix

Type: Upgrade version
Origin: https://github.com/netty/netty/security/advisories/GHSA-grg4-wf29-r9vv
Release Date: 2021-10-19
Fix Resolution (io.netty:netty-all): 4.1.68.Final
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.

### CVE-2021-22569
### Vulnerable Library - protobuf-java-2.5.0.jar

Protocol Buffers are a way of encoding structured data in an efficient yet extensible format.
Library home page: http://www.google.com/
Path to dependency file: /nifi-nar-bundles/nifi-hadoop-libraries-bundle/nifi-hadoop-libraries-nar/pom.xml
Path to vulnerable library: /home/wss-scanner/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar
Dependency Hierarchy:
- hadoop-client-2.7.3.jar (Root Library)
  - hadoop-common-2.7.3.jar
    - :x: **protobuf-java-2.5.0.jar** (Vulnerable Library)
Found in HEAD commit: 0707e245fb382da58db8bb8ec5ccff5d9ae55c39
Found in base branch: master
### Vulnerability Details

An issue in protobuf-java allowed the interleaving of com.google.protobuf.UnknownFieldSet fields in such a way that they would be processed out of order. A small malicious payload can occupy the parser for several minutes by creating large numbers of short-lived objects that cause frequent, repeated pauses. We recommend upgrading libraries beyond the vulnerable versions.
Publish Date: 2022-01-07
URL: CVE-2021-22569
### CVSS 3 Score Details (7.5)

Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: None
  - Integrity Impact: None
  - Availability Impact: High

For more information on CVSS3 Scores, click here.

### Suggested Fix

Type: Upgrade version
Origin: https://github.com/advisories/GHSA-wrvw-hg22-4m67
Release Date: 2022-01-07
Fix Resolution (com.google.protobuf:protobuf-java): 3.16.1
Direct dependency fix Resolution (org.apache.hadoop:hadoop-client): 2.7.4
:rescue_worker_helmet: Automatic Remediation will be attempted for this issue.