astrolabsoftware / spark3D

Spark extension for processing large-scale 3D data sets: Astrophysics, High Energy Physics, Meteorology, …
https://astrolabsoftware.github.io/spark3D/
Apache License 2.0

Compiling against Apache Spark 2.3.2: fix incompatibilities #107

Closed JulienPeloton closed 5 years ago

JulienPeloton commented 5 years ago

What does this PR bring?

Recent versions of Spark changed the input type expected by ordering.leastOf (Iterator -> Iterable). The code now reads (utils/Utils.scala):

import scala.collection.JavaConverters._
import com.google.common.collect.{Ordering => GuavaOrdering}

/** Returns the `num` smallest elements of `input`, according to the implicit ordering. */
private def takeOrdered[T](input: Iterator[T], num: Int)(implicit ord: Ordering[T]): Iterator[T] = {
  val ordering = new GuavaOrdering[T] {
    override def compare(l: T, r: T): Int = ord.compare(l, r)
  }
  // leastOf now expects a java.lang.Iterable, hence the toIterable.asJava bridge.
  ordering.leastOf(input.toIterable.asJava, num).iterator.asScala
}

How has this been tested?

Unit tests pass.

JulienPeloton commented 5 years ago

Oh, and the Spark version in build.sbt has been updated: 2.1.0 -> 2.3.2.
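
For illustration only, a version bump of this kind typically looks like the following in build.sbt. This is a hypothetical excerpt: the module list and "provided" scoping are assumptions, not taken from the repository.

// Hypothetical excerpt: the actual build.sbt in spark3D may organize this differently.
val sparkVersion = "2.3.2"  // previously "2.1.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
)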

codecov-io commented 5 years ago

Codecov Report

Merging #107 into master will not change coverage. The diff coverage is 100%.

Impacted file tree graph

@@           Coverage Diff           @@
##           master     #107   +/-   ##
=======================================
  Coverage   96.37%   96.37%           
=======================================
  Files          32       32           
  Lines        1240     1240           
  Branches      218      218           
=======================================
  Hits         1195     1195           
  Misses         45       45
Flag      Coverage   Δ
#python   94.3%      <ø> (ø)
#scala    97.24%     <100%> (ø)

Impacted Files                                   Coverage   Δ
src/main/scala/com/spark3d/utils/Utils.scala     97.36%     <100%> (ø)

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Powered by Codecov. Last update 371d015...72bc363.