HodorNV / ALOps


Qst: How to "warm up" Build Agents by pre-downloading weekly container images and artifacts #723

Closed kasperdj closed 4 months ago

kasperdj commented 4 months ago

Most of our pipelines are running against the weekly DK sandboxes. We would like to pre-download both container images weekly and also the artifacts for the use of the Super Compiler.

We have already prepared the PowerShell script for the download and automated it on our 10 build agents, but it seems we need to place the downloaded images and artifacts in specific folders for ALOps to use them, as pipelines are still downloading images and artifacts during execution.

Please outline which folders to place the downloaded images and artifacts in.

Thx. in advance.

kasperdj commented 4 months ago

What is the status on this issue which was created 2 weeks ago?

waldo1001 commented 4 months ago

ALOps caches automatically (it searches by image name): the first build of the week triggers the download, and all subsequent builds use the cached image. So if you schedule an ALOps step (docker create) on your 10 agents, you should get what you're after.

Otherwise, please share the script so we know better what the intention is.
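A rough sketch of what such a warm-up pipeline could look like: a scheduled Azure DevOps pipeline that fans out one job per agent via a matrix, each running an ALOps docker-create step so the weekly image gets cached before real builds start. The agent names, pool name, cron schedule, and the task input names here are assumptions — check the ALOps task documentation for the exact input names of `ALOpsDockerCreate`.

```yaml
# Hypothetical warm-up pipeline (assumed agent names and task inputs)
schedules:
  - cron: "0 3 * * 6"          # e.g. Saturday 03:00 UTC, after the weekly sandbox ships
    displayName: Weekly image warm-up
    branches:
      include: [main]
    always: true               # run even when nothing changed

jobs:
  - job: WarmUp
    strategy:
      matrix:
        agent01: { agentName: 'BUILD-AGENT-01' }   # assumed names;
        agent02: { agentName: 'BUILD-AGENT-02' }   # one entry per build agent
    pool:
      name: Default
      demands:
        - Agent.Name -equals $(agentName)          # pin each job to one agent
    steps:
      - task: ALOpsDockerCreate@1                  # input names may differ per ALOps version
        inputs:
          artifacttype: 'Sandbox'
          artifactcountry: 'dk'
```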

kasperdj commented 4 months ago

We run the scripts via Task Scheduler on each build server, since all build servers are in the same agent pool and I don't know how to schedule a pipeline run on a specific set of build servers.

Here is a copy of the script used for the Container images:

```powershell
# Requires the BcContainerHelper module (for Get-BCArtifactUrl)
Import-Module BcContainerHelper

function Get-ArchiveAndExpand() {
    param (
        $baseUrl,
        $filePath,
        $archiveName,
        [Switch]$suppressInfo
    )
    try {
        # Download into a unique temp folder, then expand into the cache path
        $tempFilePath = "$env:TEMP\$($archiveName)-$([System.Guid]::NewGuid())"
        if (!(Test-Path $tempFilePath)) {
            New-Item -ItemType Directory -Path $tempFilePath | Out-Null
        }

        if (!$suppressInfo) {
            Write-Host -ForegroundColor Green "Downloading Weekly $($archiveName)"
        }
        Invoke-WebRequest -UseBasicParsing -Uri "$($baseUrl)$($archiveName)" -OutFile (Join-Path $tempFilePath "$($archiveName).zip")

        if (!$suppressInfo) {
            Write-Host -ForegroundColor Green "Expand Weekly $($archiveName)"
        }
        Expand-Archive -Path (Join-Path $tempFilePath "$($archiveName).zip") -DestinationPath (Join-Path $filePath $archiveName) -Force
        try {
            # Stamp the folder the same way BcContainerHelper does
            [datetime]::UtcNow.Ticks | Set-Content -Path (Join-Path $filePath "$archiveName/lastused")
        }
        catch {
            # Non-fatal: the 'lastused' stamp is only used for cache cleanup
        }
        Remove-Item -Path $tempFilePath -Force -Recurse
    }
    catch {
        Write-Host -ForegroundColor Red $_
    }
}

function Get-WeeklyBCArtifactCache() {
    param (
        $archiveNames,
        [Switch]$suppressInfo
    )
    $CurrentProgressPreference = $ProgressPreference
    $ProgressPreference = 'SilentlyContinue'
    if (!$suppressInfo) {
        Write-Host -ForegroundColor Green "Getting Weekly Sandbox Url"
    }
    [uri]$artifactUri = Get-BCArtifactUrl -select Weekly -type Sandbox
    # LocalPath looks like /sandbox/<version>/<country>, so segment 1 is the
    # artifact type and segment 2 is the version
    $segments = $artifactUri.LocalPath.Split('/')
    $baseUrl = "$($artifactUri.Scheme)://$($artifactUri.Host)/$($segments.Get(1))/$($segments.Get(2))/"
    $filePath = "C:\bcartifacts.cache\$($segments.Get(1))\$($segments.Get(2))\"

    if (!$suppressInfo) {
        Write-Host
        Write-Host -ForegroundColor Green "Weekly Version: $($segments.Get(2))"
        Write-Host
    }
    $archiveNames | ForEach-Object {
        Get-ArchiveAndExpand -baseUrl $baseUrl -filePath $filePath -archiveName $_
    }
    $ProgressPreference = $CurrentProgressPreference
}
Get-WeeklyBCArtifactCache -archiveNames 'platform', 'w1', 'dk'
```

waldo1001 commented 4 months ago

2 suggestions:

  1. You could use a matrix in your YAML pipeline (Azure DevOps) to run a job on specific agents.
  2. Could you try Download-Artifacts (the default BcContainerHelper cmdlet) to download the artifacts in your script? It uses the default BcContainerHelper naming conventions, so the cached artifacts might be picked up that way as well.

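The second suggestion could replace most of the custom download script with a couple of BcContainerHelper calls, since Download-Artifacts writes to the default cache folder (c:\bcartifacts.cache) using the standard naming conventions. A minimal sketch, assuming the BcContainerHelper module is installed and the 'dk' country artifact is wanted:

```powershell
# Sketch: warm the artifact cache via BcContainerHelper instead of
# hand-rolled Invoke-WebRequest / Expand-Archive calls.
Import-Module BcContainerHelper

# Resolve the current weekly DK sandbox artifact URL
$artifactUrl = Get-BCArtifactUrl -select Weekly -type Sandbox -country 'dk'

# Download the application artifact plus the platform artifact into the
# default cache folder, using BcContainerHelper's own folder layout
Download-Artifacts -artifactUrl $artifactUrl -includePlatform
```

Because this uses the same cache layout that the tooling expects, a later build on the same agent should find the files already in place.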
kasperdj commented 4 months ago

I will engage with our internal IT regarding your 2nd suggestion, thx