Closed · mnodini closed this issue 2 years ago
With the latest version of pyspark:
After downgrading to 3.1.2, it runs successfully.
Hm, interesting. It looks like 3.1.2 is already specified in environment.yaml. Any idea how 3.3.0 could have been installed?
environment.yaml
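For reference, a minimal sketch of how an exact pyspark pin usually looks in a conda `environment.yaml`. The environment name, channel, and other entries here are assumptions for illustration, not copied from the actual file; note that an unpinned spec (or a `pip install` run outside the environment file) can silently pull in a newer release such as 3.3.0:

```yaml
name: example-env        # hypothetical environment name
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pip
  - pip:
      - pyspark==3.1.2   # exact pin; "pyspark" alone would resolve to the latest release
```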