vemonet / setup-spark

:octocat:✨ Setup Apache Spark in GitHub Action workflows
https://github.com/marketplace/actions/setup-apache-spark
MIT License

Run vemonet/setup-spark@v1 started failing from yesterday #19

Closed pullepuvinay closed 1 year ago

pullepuvinay commented 1 year ago

Describe the bug

Run vemonet/setup-spark@v1 started failing yesterday. I tried multiple versions, but the result is the same. Would you mind helping to find a resolution? The download of the Spark archive is taking too long, which results in `Error: The operation was cancelled` during the GitHub Actions build.

Which version of the action are you using?

vemonet/setup-spark@v1

Environment

GitHub-hosted

Spark Versions

      spark-version: '3.1.2'
      hadoop-version: '3.2' 


Screenshots

Screenshot 2023-05-17 at 10 49 33 AM (attached)


pullepuvinay commented 1 year ago

@vemonet After updating the workflow with a spark-url and spark-version '3.2.4', the checks started working fine.

      spark-version: '3.2.4'
      hadoop-version: '3.2'
      spark-url: 'https://dlcdn.apache.org/spark/spark-3.2.4/spark-3.2.4-bin-hadoop3.2.tgz'
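For reference, a complete workflow step using this workaround might look like the sketch below. The job name, checkout step, and final `spark-submit` check are illustrative assumptions, not taken from this thread:

```yaml
# Minimal sketch of a GitHub Actions job using the spark-url workaround.
# Only the `with:` values come from this thread; everything else is assumed.
jobs:
  spark-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: vemonet/setup-spark@v1
        with:
          spark-version: '3.2.4'
          hadoop-version: '3.2'
          # Pinning the download URL to the Apache CDN avoids the slow
          # default mirror that triggered the cancellation timeout.
          spark-url: 'https://dlcdn.apache.org/spark/spark-3.2.4/spark-3.2.4-bin-hadoop3.2.tgz'
      - run: spark-submit --version
```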
vemonet commented 1 year ago

Hi @pullepuvinay, I was not able to reproduce the issue. We just released a new version; maybe it is already fixed?

Could you try it again?

benjamincitrin commented 1 year ago

@vemonet I think we can close this issue, since all functional tests are passing.

vemonet commented 1 year ago

Indeed, feel free to reopen it or open a new issue if you still encounter the problem @pullepuvinay