actions / setup-python

Set up your GitHub Actions workflow with a specific version of Python

Post Run fails when using "cache: pip" on v5.2.0 #932

Closed clovisp closed 1 month ago

clovisp commented 2 months ago

Description: The latest release makes the Post Run step of actions/setup-python fail systematically with the following error message:

Post job cleanup.
Error: Cache folder path is retrieved for pip but doesn't exist on disk: /home/runner/.cache/pip

We use the following configuration in our workflows:

      - name: Setup Python 3.11
        uses: actions/setup-python@v5
        with:
          python-version: 3.11
          architecture: x64
          cache: "pip"

Pinning to the previous version, actions/setup-python@v5.1.1, resolves the issue.

Action version: 5.2.0

Platform:

Runner type:

Tools version: Python 3.11

Repro steps:
Running any workflow that contains the following step:

      - name: Setup Python 3.11
        uses: actions/setup-python@v5
        with:
          python-version: 3.11
          architecture: x64
          cache: "pip"

Expected behavior: The post run succeeds

Actual behavior: The post run fails:

petemounce commented 2 months ago

I see this within my own workflows, exactly as reported. I used the same pin-to-5.1.1 workaround (though removing the cache: pip setting worked too). Related to https://github.com/actions/setup-python/issues/436?

mahabaleshwars commented 2 months ago

Hello @clovisp, thank you for creating this issue. We will investigate it and provide feedback as soon as we have updates.

aparnajyothi-y commented 2 months ago

Hello @clovisp, thank you again for raising this. This is expected behavior: you are using cache: pip without installing any dependencies. We were able to reproduce the failure on versions 5.2.0, 5.1.1, and 4 when no pip install step is present, because pip has not cached any packages yet, so the cache directory does not exist.

The setup-python action sets up a Python environment, but it does not create a pip cache directory by default. The pip cache directory is created when pip installs a package and needs to cache it.
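
As a quick illustration, a hypothetical diagnostic step (assuming a Linux runner, where pip's cache defaults to /home/runner/.cache/pip) shows that pip reports its cache location even before the directory exists:

      - name: Inspect pip cache state
        run: |
          # pip prints where its cache would live, even if it doesn't exist yet
          python -m pip cache dir
          # the directory itself only appears once pip has cached a download
          ls /home/runner/.cache/pip || echo "cache directory not created yet"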

We tested the same scenario with a pip installation step added, and the pip cache was generated at /home/runner/.cache/pip, caching the installed packages.

To fix the issue, please consider one of the following options:

1. Install dependencies: make sure the workflow actually installs the required dependencies (for example, from a committed requirements.txt).
2. Do not use cache: pip: if there are no dependencies to cache, remove the cache: pip setting from your GitHub Actions workflow.

If the issue still occurs on your end, could you please share the workflow runs for v5.1.1 and v5.2.0 so we can investigate further?

Please find the workflow snippet and screenshots for reference.

testjobv5_2_0:
  name: test cache pip
  runs-on: ubuntu-latest

  steps:
    - uses: actions/checkout@v4
    - uses: actions/setup-python@v5.2.0
      with:
        python-version: 3.11
        architecture: x64
        cache: "pip"

testjobv_5_1_1:
  name: test cache pip for v5.1.1
  runs-on: ubuntu-latest

  steps:
    - uses: actions/checkout@v4
    - uses: actions/setup-python@v5.1.1
      with:
        python-version: 3.11
        architecture: x64
        cache: "pip"

testjobv_4:
  name: test cache pip for v4
  runs-on: ubuntu-latest

  steps:
    - uses: actions/checkout@v4
    - uses: actions/setup-python@v4
      with:
        python-version: 3.11
        architecture: x64
        cache: "pip"
[screenshot: runs of the three test jobs above]

To fix the issue:

steps:
  - uses: actions/checkout@v4
  - uses: actions/setup-python@v5
    with:
      python-version: 3.11
      architecture: x64
      cache: "pip"

  - name: Install dependencies
    run: |
      python -m pip install --upgrade pip
      pip install -r requirements.txt
[screenshot: successful run with the install step added]
kamilkrzyskow commented 2 months ago

Thanks @aparnajyothi-y, without your explanation I would likely not have found the cause, even though in hindsight it was dead obvious.

Here is an example workflow run with the issue:

In short: there is a separate venv cache, managed with a different cache key via actions/cache, as sketched below. From various tests I have done in the past, this approach reduces the displayed workflow execution time, because downloading and unpacking a cached venv directory is faster than reinstalling the wheel files from the pip cache itself. This can also be observed in the workflow above.
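
For readers unfamiliar with the pattern, here is a minimal sketch of such a separate venv cache; the path, key layout, and requirements.txt name are illustrative, not taken from my actual workflow:

    steps:
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - name: Cache virtualenv
        id: venv-cache
        uses: actions/cache@v4
        with:
          path: .venv
          key: venv-${{ runner.os }}-py3.12-${{ hashFiles('requirements.txt') }}
      - name: Install dependencies on cache miss
        if: steps.venv-cache.outputs.cache-hit != 'true'
        run: |
          python -m venv .venv
          .venv/bin/pip install -r requirements.txt

On a cache hit the install step is skipped entirely, which is why restoring the venv is faster than re-resolving wheels from pip's own cache.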

Due to the change in 5.2.0, which added the architecture to the cache key, the actions/setup-python step can't find any previous caches. The arch was added in the middle of the key rather than at the end, so the "hashless" restore key no longer acts as a prefix for the older keys.

Debug from the above workflow:

##[debug]["setup-python-Linux-x64-22.04-Ubuntu-python-3.12.5-pip-f3ce5f0aeb583ae252b72dd2e91cf415d92a0ca6eeade0b7caebd9ed7fdd3943","setup-python-Linux-x64-22.04-Ubuntu-python-3.12.5-pip"]

example of a previous key:

setup-python-Linux-22.04-Ubuntu-python-3.12.5-pip-f3ce5f0aeb583ae252b72dd2e91cf415d92a0ca6eeade0b7caebd9ed7fdd3943
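
Putting the two side by side makes the mismatch visible: the keys diverge at the third segment (x64 vs 22.04), so the new "hashless" restore key can never be a prefix of any pre-5.2.0 key:

new restore key: setup-python-Linux-x64-22.04-Ubuntu-python-3.12.5-pip
old cache key:   setup-python-Linux-22.04-Ubuntu-python-3.12.5-pip-f3ce5f0aeb583ae252b72dd2e91cf415d92a0ca6eeade0b7caebd9ed7fdd3943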

Some final comments:

EDIT: Forgot to mention, my solution was to delete the latest separate venv cache and rerun the failing jobs so that pip install executes again.

EDIT2: Actually, changing the position of the arch inside the cache key would not change the situation, because the "hashless" key would still not share a base prefix with the older cache keys. My mistake, I drew the conclusion too quickly.
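
For anyone needing to do the same cleanup, the caches can also be deleted from the command line, assuming GitHub CLI 2.32 or newer; the key below is a hypothetical example, not my real one:

gh cache list                                   # list the repository's Actions caches
gh cache delete "venv-Linux-py3.12-<old-hash>"  # delete the stale venv cache by key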

petemounce commented 2 months ago

I don't know how common the separate-actions/cache-for-the-venv approach is, but perhaps add it as a question to the issue template?

@kamilkrzyskow we use uv (for speed, versus pip), and we also cache the venv separately (also for speed).

aparnajyothi-y commented 1 month ago

Hello @clovisp, @kamilkrzyskow,

We have updated the release notes to include details about the architecture being added to the cache key name. Consequently, we are closing this issue. Please feel free to reach out if you have any concerns or need further clarification, and we can reopen it.