projectglow / glow

An open-source toolkit for large-scale genomic analysis
https://projectglow.io
Apache License 2.0

PySpark 3.3 support #520

Closed Hoeze closed 8 months ago

Hoeze commented 2 years ago

Hi, does glow already support PySpark 3.3?

williambrandler commented 2 years ago

@a0x8o what would it take to have glow supporting Spark 3.3?

williambrandler commented 1 year ago

@Hoeze are there specific features in Spark 3.3 that you would require?

Hoeze commented 1 year ago

As always, libraries move to the latest PySpark version, and when I want to use the newest features I need all of my dependencies at that level.

This time I'd like to try Apache Iceberg with the new bucket-by pushdown to speed up write-once-read-many groupBy operations. For that purpose, one needs the latest Apache Iceberg + PySpark 3.3.
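A minimal sketch of the kind of workload this refers to, assuming a Spark 3.3 session with the Iceberg runtime jars on the classpath and a catalog named `demo` already configured (the table, column, and path names below are made up for illustration):

```python
# Hypothetical sketch: write an Iceberg table bucketed on the group-by key so
# that repeated downstream aggregations can exploit the bucketing instead of
# reshuffling. Needs a Spark >= 3.3 session with Iceberg configured; none of
# this is part of glow itself.
from pyspark.sql import SparkSession
from pyspark.sql.functions import bucket, col

spark = SparkSession.builder.getOrCreate()

df = spark.read.parquet("/data/variants.parquet")  # example input path

# Write once, bucketed on sample_id via Iceberg's bucket partition transform.
(df.writeTo("demo.genomics.variants")
   .partitionedBy(bucket(16, col("sample_id")))
   .createOrReplace())

# Read many times; with Spark 3.3 + Iceberg the bucket layout can be pushed
# down so the groupBy avoids a full shuffle.
counts = (spark.table("demo.genomics.variants")
               .groupBy("sample_id")
               .count())
```

This is only a sketch of the pattern, not glow functionality; it cannot run without an Iceberg-enabled Spark deployment.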

Hoeze commented 1 year ago

Hi, are there any updates to this?

williambrandler commented 1 year ago

This will require someone at Databricks to solve.

For now, you could build a Docker container with glow on Spark 3.1 for ingestion into Parquet, then another with Spark 3.3 and Iceberg for the downstream groupBy operations. I assume those do not require glow?
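The two-stage split suggested above could look roughly like this; it is a sketch only, with hypothetical paths, and each stage must run in its own environment:

```python
# Stage 1 -- run in an environment pinned to pyspark==3.1.x with glow.py
# installed: ingest VCF into Parquet using glow's VCF data source.
from pyspark.sql import SparkSession
import glow

spark = glow.register(SparkSession.builder.getOrCreate())

(spark.read.format("vcf")
      .load("/data/cohort.vcf.gz")       # example input path
      .write.mode("overwrite")
      .parquet("/data/cohort.parquet"))  # handoff location for stage 2

# Stage 2 -- run separately in an environment with pyspark==3.3.x and the
# Iceberg runtime (no glow needed): read the Parquet handoff and do the
# downstream groupBy / Iceberg work there.
```

The Parquet directory acts as the boundary between the two Spark versions, so neither environment ever has to load both glow and Spark 3.3 at once.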

Hoeze commented 1 year ago

Hi @williambrandler, thanks for commenting! Docker is not supported on our cluster, instead we have to work with conda environments. This works, but it's a bit annoying to maintain old PySpark versions together with up-to-date ones.

Given that Databricks Runtime 13.x runs Spark 3.4, I wonder what the general level of support for this package is. Is it still sponsored by Databricks?

williambrandler commented 1 year ago

When we updated to 3.2 it took a while, because Spark has so many engineers working on it that it evolved quite far away from glow.

This introduced incompatibilities that needed to be resolved. We did manage that, but it took a few months.

As time goes on, the folks who worked on glow have moved on to other things, and maintenance of the project requires people at Databricks, who have conflicting priorities.

Further upgrades to glow will likely require assistance from Databricks engineering, field, go-to-market, and glow contributors. This will require significant coordination and will take months. But in principle it should be possible. I'm optimistic it will get done in time, but there's only so much I can do from the outside.

jisqaqov commented 12 months ago

Yeah, that is annoying.

Hey @williambrandler. This library is fantastic. We would appreciate it if someone from Databricks found time to upgrade it to the latest Spark versions.