Closed vprihoda closed 1 year ago
@vprihoda Thanks for the detailed explanation. I am currently working on it; we can expect a new release in a few weeks.
Hi @rumeshkrish, is there any progress? Can we expect a release in the near future?
@vprihoda @rumeshkrish does anyone have a workaround for this please?
Please check the new [release v2.0](https://github.com/awslabs/amazon-s3-tagging-spark-util/releases/tag/v2.0) page and the README.md. Thanks for your patience!
Closing this issue.
I'm still hitting this same error using amazon-s3-tagging-spark-util-spark_33-scala-2.12-lib-2.0.jar with Glue 4.
Hello, I have tried the library built with Scala 2.12,
amazon-s3-tagging-spark-util-assembly_2.12-1.0.jar,
on a PySpark Glue 3 job. The job reads parquet files from a given path and should write them back with the given tags. The job fails with the exception
The specified key does not exist
I was able to identify that it probably fails when it tries to access the temporary file.
Could you please provide support with this issue?