IBMStreams / streamsx.eventstore

Toolkit for IBM Db2 Event Store integration.
https://ibmstreams.github.io/streamsx.eventstore/
Apache License 2.0

Update Spark version to at least v2.2.0 (VULNERABILITIES) #59

Closed: schubon closed this issue 5 years ago

schubon commented 5 years ago

Issue

6 vulnerabilities found in org.apache.spark:spark-core_2.11, referenced in com.ibm.streamsx.eventstore/pom.xml

Remediation

Upgrade org.apache.spark:spark-core_2.11 to version 2.2.0 or later. For example:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>[2.2.0,)</version>
</dependency>
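Here [2.2.0,) is Maven version-range syntax: it matches 2.2.0 and any later version.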

Always verify the validity and compatibility of suggestions with your codebase. Please note that not every vulnerability below has a patched version available.

Details

CVE-2018-8024

Low severity. Vulnerable versions: >= 2.1.0, < 2.1.3. Patched version: 2.1.3.

In Apache Spark 2.1.0 to 2.1.2, 2.2.0 to 2.2.1, and 2.3.0, it is possible for a malicious user to construct a URL pointing to a Spark cluster UI's job and stage info pages; if a user can be tricked into accessing the URL, it can be used to execute script and expose information from the user's view of the Spark UI. While some browsers, such as recent versions of Chrome and Safari, are able to block this type of attack, current versions of Firefox (and possibly others) do not.

CVE-2018-1334

Low severity. Vulnerable versions: >= 1.0.0, < 2.1.3. Patched version: 2.1.3.

In Apache Spark 1.0.0 to 2.1.2, 2.2.0 to 2.2.1, and 2.3.0, when using PySpark or SparkR, it's possible for a different local user to connect to the Spark application and impersonate the user running the Spark application.

CVE-2018-17190

Low severity. Vulnerable versions: >= 0 (all versions). Patched version: no fix available.

In all versions of Apache Spark, the standalone resource manager accepts code to execute on a 'master' host, which then runs that code on 'worker' hosts. The master itself does not, by design, execute user code. A specially-crafted request to the master can, however, cause the master to execute code too. Note that this does not affect standalone clusters with authentication enabled. While the master host typically has less outbound access to other resources than a worker, the execution of code on the master is nevertheless unexpected.

Mitigation

Enable authentication on any Spark standalone cluster that is not otherwise secured from unwanted access, for example by network-level restrictions. Use spark.authenticate and the related security properties described at https://spark.apache.org/docs/latest/security.html
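A minimal sketch of that mitigation, assuming a standalone cluster configured through spark-defaults.conf (the secret value below is a placeholder):

    # spark-defaults.conf, identical on every node of the standalone cluster
    # spark.authenticate enables authentication for Spark's internal connections;
    # spark.authenticate.secret is the shared secret that all daemons and
    # applications must agree on (placeholder value here)
    spark.authenticate        true
    spark.authenticate.secret replace-with-a-long-random-secret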

CVE-2017-12612

High severity. Vulnerable versions: < 2.1.2. Patched version: 2.1.2.

In Apache Spark 1.6.0 through 2.1.1, the launcher API performs unsafe deserialization of data received on its socket. This makes applications launched programmatically using the launcher API potentially vulnerable to arbitrary code execution by an attacker with access to any user account on the local machine. It does not affect apps run by spark-submit or spark-shell. The attacker would be able to execute code as the user that ran the Spark application. Users are encouraged to update to version 2.1.2, 2.2.0 or later.

CVE-2017-7678

Moderate severity. Vulnerable versions: < 2.2.0. Patched version: 2.2.0.

In Apache Spark before 2.2.0, it is possible for an attacker to take advantage of a user's trust in the server to trick them into visiting a link that points to a shared Spark cluster and submits data including MHTML to the Spark master, or history server. This data, which could contain a script, would then be reflected back to the user and could be evaluated and executed by MS Windows-based clients. It is not an attack on Spark itself, but on the user, who may then execute the script inadvertently when viewing elements of the Spark web UIs.

CVE-2018-11770

Moderate severity. Vulnerable versions: >= 1.0.0, <= 2.3.2. Patched version: no fix available.

From version 1.3.0 onward, Apache Spark's standalone master exposes a REST API for job submission, in addition to the submission mechanism used by spark-submit. In standalone mode, the config property 'spark.authenticate.secret' establishes a shared secret for authenticating requests to submit jobs via spark-submit. However, the REST API does not use this or any other authentication mechanism, and this is not adequately documented. In this case, a user would be able to run a driver program without authenticating, but not launch executors, using the REST API. This REST API is also used by Mesos, when set up to run in cluster mode (i.e., when also running MesosClusterDispatcher), for job submission.

Future versions of Spark will improve documentation on these points and prohibit setting 'spark.authenticate.secret' when running the REST APIs, to make this clear. Future 2.4.x versions will also disable the REST API by default in the standalone master by changing the default value of 'spark.master.rest.enabled' to 'false'.
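Until running a 2.4.x release where this is the default, a sketch of the workaround, assuming the standalone master reads spark-defaults.conf:

    # spark-defaults.conf on the standalone master
    # Disable the unauthenticated REST submission endpoint
    # (served on spark.master.rest.port, 6066 by default)
    spark.master.rest.enabled false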

markheger commented 5 years ago

Updated to Spark 2.3.3.