Closed: johnnylarner closed this issue 1 year ago
Hi @johnnylarner, thanks a lot for the report and the debugging! I can't seem to access your fork; could you send a pull request so I can test and integrate your changes?
Hi @vemonet, take a look at the PR and let me know if you have any issues.
Sorry for the delay, I merged it thanks!
Describe the bug
When GitHub reuses a runner to execute the action, the Spark installation fails due to a symbolic link error. See here:
Note that when a "new" runner is used, the installation works fine. See how I distinguish between "old" and "new" runners in the To Reproduce section below.
I forked this repository and solved my issue by changing the flags in line 67 of dist/.js from -s to -sf. As per the man page, -f removes an existing link before creating the new one.
```shell
ln -sf "${installFolder}/spark-${sparkVersion}-bin-hadoop${hadoopVersion}${scalaBit}" ${installFolder}/spark;
```
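The difference is easy to reproduce outside the action. Below is a minimal sketch using a temporary directory and regular files (hypothetical paths, not the action's real install folder): the first `ln -s` works on a clean "runner", the second fails with "File exists", and `ln -sf` replaces the stale link. Note that when the link target is a directory, GNU `ln` may also need `-n` (`--no-dereference`) to avoid descending into the old link, but `-f` alone is what fixed this case.

```shell
tmp=$(mktemp -d)
touch "$tmp/spark-3.3.1" "$tmp/spark-3.4.0"   # stand-ins for the install dirs

# Fresh runner: no link exists yet, plain -s succeeds.
ln -s "$tmp/spark-3.3.1" "$tmp/spark"

# Reused runner: the link is still there from the previous job,
# so plain -s fails with "File exists" and a nonzero exit code.
ln -s "$tmp/spark-3.4.0" "$tmp/spark" || echo "ln -s failed on reused runner"

# -f removes the existing link first, so the command succeeds.
ln -sf "$tmp/spark-3.4.0" "$tmp/spark"
readlink "$tmp/spark"   # now points at the new installation
```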
If you're on board, I'd create a PR with this change.
Which version of the action are you using?
v1
Environment
Self-hosted, Linux
Spark Versions
To Reproduce
Steps to reproduce the behavior:
Run `pip3 install pyspark pytest` as a command

Run/Repo Url
My fork: https://github.com/johnnylarner/setup-spark