Closed Hoeze closed 8 months ago
@a0x8o what would it take to have Glow support Spark 3.3?
@Hoeze are there specific features in Spark 3.3 that you would require?
As always, libraries move to the latest PySpark version, and when I want to use the latest features I need all of them at that level.
This time I'd like to try Apache Iceberg with the new BucketBy pushdown to speed up write-once-read-many groupBy operations. For that purpose, one needs the latest Apache Iceberg + PySpark 3.3.
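For context, creating a bucketed Iceberg table in Spark 3.3 is typically done via SQL DDL with a `bucket(...)` partition transform on the groupBy key. A minimal sketch follows; the catalog name `local`, the warehouse path, and all table/column names are invented for illustration, and it assumes PySpark 3.3+ with the matching `iceberg-spark-runtime` jar on the classpath:

```python
# Sketch: bucketed Iceberg table for write-once-read-many groupBy workloads.
# Assumptions: pyspark>=3.3 plus the matching iceberg-spark-runtime jar;
# the catalog name "local", warehouse path, and table/column names are made up.

BUCKETS = 16

# Iceberg DDL: partition by a bucket transform on the groupBy key, so that
# repeated aggregations on that key can exploit the bucketed layout.
CREATE_TABLE_DDL = f"""
CREATE TABLE local.db.variants (
    sample_id STRING,
    contig    STRING,
    value     DOUBLE
)
USING iceberg
PARTITIONED BY (bucket({BUCKETS}, sample_id))
"""


def build_session():
    """Spark session wired up for a local Iceberg catalog (not executed here)."""
    from pyspark.sql import SparkSession

    return (
        SparkSession.builder
        .config("spark.sql.extensions",
                "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
        .config("spark.sql.catalog.local", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.local.type", "hadoop")
        .config("spark.sql.catalog.local.warehouse", "/tmp/iceberg-warehouse")
        .getOrCreate()
    )


def run(spark):
    """Create the bucketed table, then run the groupBy it is laid out for."""
    spark.sql(CREATE_TABLE_DDL)
    return spark.table("local.db.variants").groupBy("sample_id").count()
```

The key piece is the `bucket(16, sample_id)` partition transform: Iceberg clusters rows by a hash of `sample_id` at write time, which is what the groupBy-heavy read path benefits from.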
Hi, are there any updates to this?
This will require someone at Databricks to solve.
For now you could make a Docker container with Glow on Spark 3.1 for ingestion into Parquet, then another with Spark 3.3 and Iceberg for the downstream groupBy operations. I assume those do not require Glow?
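The two-stage split suggested above could be sketched as two functions, each run in its own environment. `glow.register` and the `vcf` data source are part of Glow's public API, but all paths, table names, and the `local` catalog here are placeholders:

```python
# Sketch of the suggested two-environment pipeline (paths and table names
# are placeholders). Stage 1 runs in an env with pyspark 3.1 + glow.py;
# stage 2 runs in a separate env with pyspark 3.3 + Iceberg. The Parquet
# directory is the handoff point, since reading it needs no Glow at all.

VCF_PATH = "/data/in/cohort.vcf.gz"
PARQUET_PATH = "/data/staging/variants.parquet"


def stage1_ingest_with_glow(spark):
    """Spark 3.1 env: read VCF via Glow and land it as plain Parquet."""
    import glow

    spark = glow.register(spark)                      # enable Glow's functions and readers
    df = spark.read.format("vcf").load(VCF_PATH)      # Glow's VCF data source
    df.write.mode("overwrite").parquet(PARQUET_PATH)  # plain Parquet: no Glow needed downstream


def stage2_groupby_with_iceberg(spark):
    """Spark 3.3 env: load the Parquet handoff into Iceberg and aggregate."""
    df = spark.read.parquet(PARQUET_PATH)
    # DataFrameWriterV2 (Spark 3.x) writing into an Iceberg catalog table.
    df.writeTo("local.db.variants").using("iceberg").createOrReplace()
    # "contigName" is the contig column in Glow's VCF schema.
    return spark.table("local.db.variants").groupBy("contigName").count()
```

Keeping Parquet as the interchange format is the point of the workaround: only stage 1 needs the old Spark pinned for Glow, while stage 2 is free to track the latest Spark and Iceberg.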
Hi @williambrandler, thanks for commenting! Docker is not supported on our cluster; instead we have to work with conda environments. This works, but it's a bit annoying to maintain old PySpark versions alongside up-to-date ones.
Given that Databricks Runtime 13.x runs Spark 3.4, I wonder what the general level of support for this package is. Is it still sponsored by Databricks?
yeah that is annoying
when we updated to 3.2 it took a while, because Spark, with so many engineers working on it, has evolved quite far away from Glow
This introduced incompatibilities that needed to be resolved. We did manage it, but it took a few months
As time goes on, folks who worked on Glow have moved on to bigger things. And the maintenance of the project requires folks at Databricks, who have conflicting priorities
Further upgrades to Glow likely require assistance from Databricks engineering, field, go-to-market, and Glow contributors. This will require significant coordination and will take months. But in principle it should be possible. I'm optimistic it will get done in time, but there's only so much I can do from the outside
Hey @williambrandler. This library is fantastic. We would appreciate it if someone from Databricks could find time to upgrade it to the latest Spark versions.
Hi, does glow already support PySpark 3.3?