Closed thehomebrewnerd closed 1 year ago
I think we are going to be unable to support pyspark with pandas 2.0 until this pyspark issue is closed: https://issues.apache.org/jira/browse/SPARK-43194
For now, I am limiting pandas to <2.0.0 when pyspark is installed to get around this issue.
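As an illustration of the pin described above, here is a minimal, hypothetical runtime check (not the actual woodwork code) that tests whether an installed pandas version falls in the pre-2.0 range pyspark still supports while SPARK-43194 remains open:

```python
def pandas_compatible_with_pyspark(pandas_version: str) -> bool:
    """Return True if this pandas version predates 2.0.0, the range
    pyspark supports until SPARK-43194 is resolved.

    Hypothetical helper for illustration; the real constraint lives in
    the package's dependency specification, not in runtime code.
    """
    major = int(pandas_version.split(".")[0])
    return major < 2


# Example usage with the versions relevant to this pin:
print(pandas_compatible_with_pyspark("1.5.3"))  # pre-2.0: compatible
print(pandas_compatible_with_pyspark("2.0.1"))  # 2.x: blocked by SPARK-43194
```

In practice this kind of constraint is expressed declaratively (e.g. `pandas>=1.4.3,<2.0.0` in the pyspark extra's requirements) rather than checked at runtime; the exact bounds used here are assumptions for the sketch.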
Merging #1729 (a2429f6) into main (4ffd15a) will decrease coverage by 0.05%. The diff coverage is 71.42%.
```diff
@@            Coverage Diff             @@
##             main    #1729      +/-   ##
==========================================
- Coverage   98.81%   98.76%   -0.05%
==========================================
  Files          98       98
  Lines       11936    11942       +6
==========================================
  Hits        11794    11794
- Misses        142      148       +6
==========================================
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| woodwork/tests/utils/test_utils.py | 98.05% <40.00%> (-1.95%) | :arrow_down: |
| woodwork/tests/accessor/test_column_accessor.py | 100.00% <100.00%> (ø) | |
| woodwork/tests/accessor/test_statistics.py | 100.00% <100.00%> (ø) | |
| woodwork/tests/accessor/test_table_accessor.py | 100.00% <100.00%> (ø) | |
| woodwork/tests/conftest.py | 100.00% <100.00%> (ø) | |
| woodwork/tests/logical_types/test_logical_types.py | 100.00% <100.00%> (ø) | |