Closed fpgmaas closed 2 weeks ago
Python 3.7 may be EOL, but Spark 2 is still used in a lot of companies... Do we really need to drop support for Spark 2? @MrPowers what do you think about it?
Valid point. I proposed dropping support for Spark 2 as a consequence of dropping support for Python 3.7, since 3.7 seems to be the latest Python version that supports Spark 2, see here.
I tried to run the tests with Spark 2 and Python 3.8, but that failed.
So I believe that if we want to keep supporting Spark 2, we should also keep supporting Python 3.7. But this will prevent us from updating some dependencies, and it requires us to keep using some workarounds such as this one.
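To make the constraint concrete, a minimal sketch of the kind of version gate this implies (the `supports_spark2` helper and the exact 3.8 cutoff are assumptions drawn from the failed Spark 2 + Python 3.8 test run described above, not something already in the codebase):

```python
import sys

def supports_spark2(py_version=None):
    # Hypothetical helper: Spark 2's last working interpreter appears
    # to be Python 3.7 (per the discussion above), so treat any
    # interpreter at or beyond 3.8 as Spark-3-only.
    if py_version is None:
        py_version = sys.version_info
    return tuple(py_version[:2]) < (3, 8)
```

A guard like this could back a `pytest.mark.skipif` so the Spark 2 test suite is skipped on interpreters where it is known to fail.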
If it's preferred not to drop support for Python 3.7 and Spark 2, let me know! I will update the PR accordingly.
Let's keep Python 3.7 & Spark 2 support for now.
It's super annoying, but I know lots of users are on legacy versions. Let's drop them soon, though.
This PR does the following:

- docs
- dependency group

Let me know your thoughts :)