JulienPeloton closed this pull request 5 years ago.
Merging #99 into master will increase coverage by 0.17%. The diff coverage is n/a.
```diff
@@            Coverage Diff             @@
##           master      #99      +/-   ##
==========================================
+ Coverage   96.19%   96.37%   +0.17%
==========================================
  Files          32       32
  Lines        1210     1240      +30
  Branches      217      217
==========================================
+ Hits         1164     1195      +31
+ Misses         46       45       -1
```
| Flag | Coverage Δ | |
|---|---|---|
| #python | 94.3% <ø> (+0.79%) | :arrow_up: |
| #scala | 97.24% <ø> (ø) | :arrow_up: |

| Impacted Files | Coverage Δ | |
|---|---|---|
| pyspark3d_conf.py | 88.23% <0%> (ø) | :arrow_up: |
| spatialOperator.py | 100% <0%> (ø) | :arrow_up: |
| converters.py | 100% <0%> (ø) | :arrow_up: |
| spatial3DRDD.py | 100% <0%> (ø) | :arrow_up: |
| __init__.py | 98.71% <0%> (+2.16%) | :arrow_up: |
Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

Powered by Codecov. Last update d04876c...2dde0a3.
There was some confusion about when and why to use `get_spark_session` and `load_user_conf`. These routines are mainly used in two contexts. In those two cases, you need to load the JAR and its dependencies within the session, for instance:
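A minimal sketch of that pattern follows; it assumes `load_user_conf` returns a dictionary of Spark options (including the JAR location) that `get_spark_session` accepts through a `dicconf` argument, which is my reading of the helpers rather than a verbatim excerpt:

```python
# Sketch, under the assumptions stated above: start a session with the
# spark3D JAR and its dependencies loaded via the pyspark3d helpers.
from pyspark3d import load_user_conf, get_spark_session

# Assumed to return a dict of Spark options, e.g. spark.jars entries
# pointing at the spark3D assembly JAR.
conf = load_user_conf()
spark = get_spark_session(dicconf=conf)
```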
However, in a regular pyspark session, or in batch mode, you should not use them. Instead, link spark3D when starting the session:
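For example, in batch mode the JAR can be handed directly to the session builder (the JAR path below is a placeholder, not the actual artifact name):

```python
from pyspark.sql import SparkSession

# Sketch: link the spark3D assembly JAR at session startup.
# Replace the path with the actual location of the JAR on your system.
spark = (
    SparkSession.builder
    .appName("spark3D")
    .config("spark.jars", "/path/to/spark3D-assembly.jar")
    .getOrCreate()
)
```

The same effect can be obtained on the command line by passing the JAR to `pyspark` or `spark-submit` via `--jars`.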
If you installed pyspark3d via pip, the JAR is shipped with the Python source files. To find its location:
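One way to locate it, assuming the JAR sits somewhere inside the installed package directory, is to walk that directory:

```python
import os
import pyspark3d

# Assumption: the JAR is bundled inside the installed pyspark3d package.
package_dir = os.path.dirname(pyspark3d.__file__)
for root, _, files in os.walk(package_dir):
    for name in files:
        if name.endswith(".jar"):
            print(os.path.join(root, name))
```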
The doctests have been simplified by using `extraglobs`.
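For reference, `extraglobs` is the standard `doctest` mechanism for injecting names into every docstring's namespace. A sketch of how this avoids rebuilding the session in each doctest (the shared `spark` name is illustrative):

```python
import doctest
from pyspark.sql import SparkSession

# Build one SparkSession and expose it to all doctests as `spark`,
# so individual docstrings do not have to create their own session.
spark = SparkSession.builder.appName("doctests").getOrCreate()

if __name__ == "__main__":
    # extraglobs merges these names into each doctest's globals.
    doctest.testmod(extraglobs={"spark": spark})
```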