
Error: Unable to find image 'mcr.microsoft.com/azure-cli:latest' locally #150

Closed · Kaloszer closed this issue 3 months ago

Kaloszer commented 4 months ago

Since this week I've started seeing this issue, which had never happened before. It causes intermittent pipeline failures that can be remediated by re-running the job, but it's annoying :)

It has happened about 6 times this week; today it happened 3 times within a single pipeline matrix execution.


Run azure/CLI@v2
  with:
    azcliversion: latest
    inlineScript: az appconfig kv export --auth-mode login --destination file --endpoint https://endpoint.azconfig.io --path ./Infrastructure/config/dynamicConfig.json --format json --name endpoint --label 'label' --yes
  az appconfig kv export --auth-mode login --destination file --endpoint https://endpoint.azconfig.io --path ./Infrastructure/config/staticConfig.json --format json --name endpoint --label shared --yes

Warning: Unable to fetch all az cli versions, please report it as an issue on https://github.com/Azure/CLI/issues. Output: Ref A: 376D80C1D1A24CD2ABD43CB8ED9D3162 Ref B: DM2EDGE0916 Ref C: 2024-05-10T09:21:56Z
, Error: SyntaxError: Unexpected token 'R', "Ref A: 376"... is not valid JSON
Starting script execution via docker image mcr.microsoft.com/azure-cli:latest
Error: Error: Unable to find image 'mcr.microsoft.com/azure-cli:latest' locally
docker: Error response from daemon: error parsing HTTP 429 response body: invalid character 'R' looking for beginning of value: "Ref A: 6132F28126274E3B86A1BB7C98066774 Ref B: DM2EDGE0618 Ref C: 2024-05-10T09:21:56Z".
See 'docker run --help'.

cleaning up container...
Warning: Error response from daemon: No such container: MICROSOFT_AZURE_CLI_1715332916356_CONTAINER

Error: Unable to find image 'mcr.microsoft.com/azure-cli:latest' locally
docker: Error response from daemon: error parsing HTTP 429 response body: invalid character 'R' looking for beginning of value: "Ref A: 6132F28126274E3B86A1BB7C98066774 Ref B: DM2EDGE0618 Ref C: 2024-05-10T09:21:56Z".
See 'docker run --help'.
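
The fatal step here is the docker pull that azure/cli@v2 performs under the hood, and docker run only pulls when the requested tag is missing locally, so a pre-pull step with a small retry loop can absorb this kind of transient throttling. A minimal sketch (the step name, retry count, and back-off are illustrative, not tested values):

  - name: Pre-pull azure-cli image with retries
    run: |
      # Retry the pull a few times with growing back-off; the HTTP 429
      # responses from MCR are transient, so a later attempt usually succeeds.
      for i in 1 2 3 4 5; do
        if docker pull mcr.microsoft.com/azure-cli:latest; then exit 0; fi
        echo "Pull attempt $i failed, retrying in $((i * 10)) seconds..."
        sleep $((i * 10))
      done
      echo "All pull attempts failed"
      exit 1
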
kantkrishan commented 4 months ago

We are also getting this error intermittently, a couple of times today. Can someone please take a look?

Dariusz-Jalowiec commented 4 months ago

I started seeing this error today in a GitHub Actions workflow using public runners:

Run azure/cli@v2
Warning: Unable to fetch all az cli versions, please report it as an issue on https://github.com/Azure/CLI/issues. Output: Ref A: 6F2FE8BB195A4129B48C917C89A2D77D Ref B: DM2EDGE0518 Ref C: 2024-05-10T13:49:32Z
, Error: SyntaxError: Unexpected token 'R', "Ref A: 6F2"... is not valid JSON
Starting script execution via docker image mcr.microsoft.com/azure-cli:2.59.0
Error: Error: Unable to find image 'mcr.microsoft.com/azure-cli:2.59.0' locally
docker: Error response from daemon: error parsing HTTP 429 response body: invalid character 'R' looking for beginning of value: "Ref A: 951F0CF444E74015BC9FBCE4B610ACF9 Ref B: DM2EDGE0912 Ref C: 2024-05-10T13:49:32Z".
See 'docker run --help'.

cleaning up container...
Warning: Error response from daemon: No such container: MICROSOFT_AZURE_CLI_1715348971894_CONTAINER

Error: Unable to find image 'mcr.microsoft.com/azure-cli:2.59.0' locally
docker: Error response from daemon: error parsing HTTP 429 response body: invalid character 'R' looking for beginning of value: "Ref A: 951F0CF444E74015BC9FBCE4B610ACF9 Ref B: DM2EDGE0912 Ref C: 2024-05-10T13:49:32Z".
See 'docker run --help'.

If I rerun the workflow it mostly works. The strange part is that, within the same workflow, some jobs that also use the Azure CLI might work while others do not.

MoChilia commented 4 months ago

Hi @Kaloszer, @kantkrishan, @Dariusz-Jalowiec, may I know the configuration of your runner? Are you using a GitHub-hosted runner or a self-hosted one?

kantkrishan commented 4 months ago

GitHub-hosted runner for me: runs-on: ubuntu-latest

Kaloszer commented 4 months ago

GitHub-hosted runner for me: runs-on: ubuntu-latest

Same here

MoChilia commented 4 months ago

It appears to be a network problem on the GitHub runners. To check, you can add this step before azure/cli@v2 and see whether it fails:

  - run: |
      curl --location -s https://mcr.microsoft.com/v2/azure-cli/tags/list
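
For reference, a healthy response from that endpoint is a Docker Registry v2 JSON document of the form {"name": "azure-cli", "tags": [...]}; a bare "Ref A: ... Ref B: ... Ref C: ..." body like the ones in the logs above appears to be what the MCR edge returns when it throttles a request with HTTP 429.
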
Kaloszer commented 4 months ago

It appears to be a network problem on the GitHub runners. To check, you can add this step before azure/cli@v2 and see whether it fails:

  - run: |
      curl --location -s https://mcr.microsoft.com/v2/azure-cli/tags/list

Well, we can't fix the DNS on GitHub runners, now can we 🤣? It seems intermittent; let's hope it was just last week.

kantkrishan commented 4 months ago

I am continuing to experience the issue as previously reported. Following your troubleshooting suggestion, we attempted to fetch the tags list for azure-cli using the provided command. Unfortunately, the responses received suggest a possible service error or an unexpected redirection mechanism rather than the anticipated JSON listing of tags.

Examples of the issue encountered:

Attempt 1
Command: curl --location -s https://mcr.microsoft.com/v2/azure-cli/tags/list
Response: Ref A: 326F84314BE141D6BF0EE8BB0324FE2D Ref B: DM2EDGE0512 Ref C: 2024-05-13T09:14:46Z

Attempt 2
Command: curl --location -s https://mcr.microsoft.com/v2/azure-cli/tags/list
Response: Ref A: CDE887B546814BFEBB4A481249AE7212 Ref B: DM2EDGE0512 Ref C: 2024-05-13T09:14:46Z

Error screenshot: [screenshot attached in the original issue]

I have observed that this error frequently occurs when attempting to run approximately 20 jobs in parallel. This leads me to suspect that the issue could be related to rate limiting or another similar constraint.
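
If the throttling really is concurrency-driven, one mitigation is to cap how many matrix jobs run at once with the standard strategy.max-parallel key. A minimal sketch (the cap of 5, the job name, and the matrix values are placeholders, not a tested threshold):

  jobs:
    deploy:
      runs-on: ubuntu-latest
      strategy:
        max-parallel: 5   # hypothetical cap; tune so concurrent pulls stay under the registry's limit
        matrix:
          region: [eastus, westus, northeurope]
      steps:
        - uses: azure/cli@v2
          with:
            azcliversion: latest
            inlineScript: az account show   # placeholder script
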

I am eager to resolve this issue and would appreciate any additional insights or suggestions you might have. I am available to provide further details or engage in additional troubleshooting steps as necessary.

Also, if you could prioritise this enhancement, we could leverage our own Artifactory repository to store this image.

Thank you for your assistance.
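
In the meantime, a private mirror can already be combined with Docker's default pull behaviour: docker run only pulls when the requested tag is missing locally, so pre-pulling the image from a mirror and retagging it under the mcr.microsoft.com name lets the action find it without contacting MCR. A minimal sketch, where the mirror hostname and repository path are hypothetical and azcliversion must be pinned to the same tag (the non-fatal version-list warning may still appear):

  - name: Seed azure-cli image from a private mirror
    run: |
      # Hypothetical mirror location; replace with your own registry.
      docker pull myregistry.example.com/mirrors/azure-cli:2.59.0
      # Retag so the action's docker run finds the image locally and skips the pull.
      docker tag myregistry.example.com/mirrors/azure-cli:2.59.0 mcr.microsoft.com/azure-cli:2.59.0
  - uses: azure/cli@v2
    with:
      azcliversion: 2.59.0
      inlineScript: az account show   # placeholder script
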

kantkrishan commented 4 months ago

@Kaloszer @Dariusz-Jalowiec Are you guys still having this issue?

I ran around 100 jobs in parallel and am not seeing the issue today. It seems something might have changed recently.

Kaloszer commented 4 months ago

@Kaloszer @Dariusz-Jalowiec Are you guys still having this issue?

I ran around 100 jobs in parallel and am not seeing the issue today. It seems something might have changed recently.

Seems okay; my 8 jobs went through with no issues.

MoChilia commented 4 months ago

Hi everyone, let's keep an eye on the situation this week. If the problem persists, we can open an issue with the GitHub Actions runners team to request a fix.

Dariusz-Jalowiec commented 4 months ago

@Kaloszer @Dariusz-Jalowiec Are you guys still having this issue?

I ran around 100 jobs in parallel and am not seeing the issue today. It seems something might have changed recently.

Seems OK for me too.

MoChilia commented 3 months ago

Closing the issue for now, as the problem has not reoccurred.