JtMotoX opened 1 year ago
Having the same issue on same self-hosted Linux agent with the same task.
I have done some troubleshooting and may have stumbled upon something. I noticed that one of our agents does not get this error, so I started comparing it with the other agents. It was running an older version of Docker. I upgraded Docker to match the version of the other agents and all of a sudden it also started getting the error. I downgraded Docker back to the original version and the error went away.
Here are my findings:
Version | Status |
---|---|
docker-ce-3:20.10.21-3.el8.x86_64 | fine |
docker-ce-3:23.0.1-1.el8.x86_64 | broken |
docker-ce-3:23.0.3-1.el8.x86_64 | broken |
So far I have not found any issues with downgrading; BuildKit still works fine. Hopefully Microsoft will provide an update to this Docker@2 task for the newer Docker versions.
Here is what I did to "fix" the other agents:

```shell
sudo yum remove \
  docker-buildx-plugin \
  docker-ce \
  docker-ce-cli \
  docker-ce-rootless-extras

sudo yum install \
  docker-ce-3:20.10.9-3.el7.x86_64 \
  docker-ce-cli-1:20.10.21-3.el7.x86_64 \
  docker-ce-rootless-extras-20.10.21-3.el7.x86_64
```
Just found another workaround. If you are using a newer version of Docker, make sure you define the BuildKit environment variable in the task. The value doesn't seem to matter, so you can use 0 or 1 to get rid of the error. It will be a pain to set this env var in every task across every project, so if anybody has a solution at a global level, please let me know. I have tried setting the buildkit feature in daemon.json on the host and restarting the service, but the pipeline still throws the warning.
```yaml
- task: Docker@2
  inputs:
    command: 'build'
  env:
    DOCKER_BUILDKIT: 1
```
This method does not seem to work consistently.
So far this is the best workaround I have found. This sets the environment variable globally which gets rid of the warning.
Edit the runsvc.sh file in the agent directory, add the following export, then restart the agent service:

```shell
# insert anything to setup env when running as a service
export DOCKER_BUILDKIT=1
```
This method does not seem to work consistently.
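If you'd rather script that edit than make it by hand, something like the sketch below should work with GNU sed. `AGENT_DIR` is an assumption (point it at your real agent install directory); the scratch-copy fallback is only there so the commands can be tried safely outside an agent machine:

```shell
# AGENT_DIR is an assumption -- set it to your real agent directory.
AGENT_DIR="${AGENT_DIR:-/tmp/agent-demo}"
mkdir -p "$AGENT_DIR"

# Scratch copy so this sketch is safe to try; a real install already
# ships runsvc.sh containing this marker comment.
[ -f "$AGENT_DIR/runsvc.sh" ] || printf '%s\n' \
  '# insert anything to setup env when running as a service' \
  > "$AGENT_DIR/runsvc.sh"

# Append the export right after the marker comment.
sed -i '/# insert anything to setup env when running as a service/a export DOCKER_BUILDKIT=1' \
  "$AGENT_DIR/runsvc.sh"

grep 'DOCKER_BUILDKIT' "$AGENT_DIR/runsvc.sh"
```

Remember to restart the agent service afterwards so the new export takes effect.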
Can confirm having the same issue using a VMSS pool with the Ubuntu Minimal LTS 22.04 image canonical:0001-com-ubuntu-minimal-jammy:minimal-22_04-lts-gen2:latest, using this Docker version:
```
Client:
 Version:           23.0.4
 API version:       1.42
 Go version:        go1.19.8
 Git commit:        f480fb1
 Built:             Fri Apr 14 10:32:03 2023
 OS/Arch:           linux/amd64
 Context:           default

Server: Docker Engine - Community
 Engine:
  Version:          23.0.4
  API version:      1.42 (minimum version 1.12)
  Go version:       go1.19.8
  Git commit:       cbce331
  Built:            Fri Apr 14 10:32:03 2023
  OS/Arch:          linux/amd64
  Experimental:     false
 containerd:
  Version:          1.6.20
  GitCommit:        2806fc1057397dbaeefbea0e4e17bddfbd388f38
 runc:
  Version:          1.1.5
  GitCommit:        v1.1.5-0-gf19387a
 docker-init:
  Version:          0.19.0
  GitCommit:        de40ad0
```
Just to add to the chorus here: my on-prem agent on a Windows VM throws this warning while building a Linux container. Surprised to see this has been a problem for something like 3 years now...
Similar to the ones above, but this sets it for the entire stage/job in the YAML:
```yaml
variables:
  - name: DOCKER_BUILDKIT
    value: 1
```
Another echoing of the chorus, we just started getting this on Azure Pipelines hosted agents in the past 24-48 hours on Docker@2 tasks. However it's not breaking luckily for us anyway, will attempt to look into some of the workarounds if we get time.
Same here, but fortunately workaround #4 seems to work. After setting DOCKER_BUILDKIT as suggested above and trying it twice, the warning didn't show up any longer. Hopefully Azure behaves deterministically here and the workaround keeps working.
Same here. One strange thing I've noticed is that agents seem to have started using BuildKit without any apparent change on our side (ubuntu-latest hosted), and even though all agents in the pool have the same agent version running, one of them looks like it is still using the classic builder instead of BuildKit.
In case anyone doesn't realize it, BuildKit became the default builder starting in Docker 23. BuildKit acts differently than the old builder, sending no output to stdout. That's what the original ticket was about. The fix for the original ticket was to detect when BuildKit was enabled by looking at the `DOCKER_BUILDKIT` environment variable and acting appropriately when it was set. But with Docker 23, BuildKit is enabled by default even if you don't have `DOCKER_BUILDKIT` defined. In that case the task doesn't know you are using BuildKit, so it expects the old builder behavior. This is why the workaround mentioned above of defining `DOCKER_BUILDKIT` works.

We use a self-hosted agent, and when I started using Docker 23 on it, I defined `DOCKER_BUILDKIT` at the agent level so it's always defined for all jobs. So I have not run into this error again.
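The mechanism can be demonstrated without Docker at all. This is only a sketch of the behavior (the file name and stand-in command are illustrative): the task captures the build's stdout into a temp file, and under BuildKit that file stays empty because the progress output goes to stderr:

```shell
# Simulate the task's capture: redirect stdout to a file while the
# "build" (here: a stand-in command) writes its progress to stderr,
# as BuildKit does.
sh -c 'echo "#1 [internal] load build definition" 1>&2' \
  > build_output.txt 2>/dev/null

# The captured file is empty -- exactly the condition that makes
# the task print its warning.
if [ ! -s build_output.txt ]; then
  echo '##[warning]No data was written into the file build_output.txt'
fi
```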
Thanks for sharing this info, I didn't realize it. Great that you highlighted what's going on behind the scenes.
Same issue, I am constantly getting this warning now with the Docker@2 task.
+1
I am still getting this, even on a hosted agent AND with `DOCKER_BUILDKIT` set to `1`.
Same here
Same problem; workaround #4 is no longer working.
Any news on this?
Any known workaround that is still working to get rid of these warnings?
All of our pipelines now have the following, and it works to get rid of the warnings:

```yaml
variables:
  DOCKER_BUILDKIT: 1
```
```yaml
trigger:
resources:
variables:
  dockerRegistryServiceConnection: '484f7284-2d0d-4a5a-9c89-465005269850'
  imageRepository: 'avinash123'
  containerRegistry: 'azure12.azurecr.io'
  dockerfilePath: '$(Build.SourcesDirectory)/tg-redirect-master/Dockerfile'
  tag: '$(Build.BuildId)'
  vmImageName: 'ubuntu-latest'
stages:
```
Any solutions for the above issue?
Adding the following, as suggested above, fixed it for my builds. However, even with this I still get the warning for Export and Tag tasks.
```yaml
variables:
  - name: DOCKER_BUILDKIT
    value: 1
```
Same issue here; the recommended fix does nothing.
The issue seems to be for containers that write nothing to stdout, which is valid. I get this when running trivy in a container in a pipeline, since with `-o` it writes nothing to stdout.
Below is a test program and pipeline. The step that only writes to stderr gets the warning.
```csharp
if (args.Length == 0 || args[0] == "stdout" || args[0] == "both")
    Console.WriteLine("Hello from stdout!");
if (args.Length > 0 && (args[0] == "stderr" || args[0] == "both"))
    Console.Error.WriteLine("Hello from stderr!");
```
```yaml
trigger: none
pr: none

jobs:
  - ${{ each buildKit in split('0,1', ',') }}:
      - job: test_${{ buildKit }}
        displayName: 'Test with BUILDKIT=${{ buildKit }}'
        variables:
          - ${{ if buildKit }}:
              - name: DOCKER_BUILDKIT
                value: 1
        steps:
          - task: Docker@2
            displayName: Build ${{ parameters.repository }} Image
            inputs:
              repository: stdouttest
              command: build
              Dockerfile: 'tools/stdoutTest/Dockerfile'
              buildContext: 'tools/stdoutTest'
              tags: latest
          - task: Docker@2
            displayName: 'Stdout only'
            inputs:
              command: 'run'
              arguments: --rm stdouttest stdout
          - task: Docker@2
            displayName: 'Stderr only'
            inputs:
              command: 'run'
              arguments: --rm stdouttest stderr # 👈 produces the warning
          - task: Docker@2
            displayName: 'Both'
            inputs:
              command: 'run'
              arguments: --rm stdouttest both
```
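The three cases can also be approximated with plain shell, no image build required. This is just a sketch mirroring the test program; only the stderr-only run leaves its captured stdout empty, which is the condition behind the warning:

```shell
# stdout-only, stderr-only, and both -- mirroring the test program above,
# with stdout captured to a file the way the task captures build output.
sh -c 'echo "Hello from stdout!"'      > stdout_only.txt 2>/dev/null
sh -c 'echo "Hello from stderr!" 1>&2' > stderr_only.txt 2>/dev/null
sh -c 'echo "Hello from stdout!"; echo "Hello from stderr!" 1>&2' \
  > both.txt 2>/dev/null

# stderr_only.txt is the empty one.
wc -c stdout_only.txt stderr_only.txt both.txt
```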
This workaround works for me:

```yaml
- task: Docker@2
  inputs:
    containerRegistry: ${{ parameters.registry }}
    command: 'login'
```

Then use a script task to run the `docker` command by hand:

```yaml
- pwsh: |
    Set-StrictMode -Version Latest
    "##[command]docker run --rm -q ..."
    docker run --rm -q ...
```

I do not get any errant warnings doing `docker run` that way.
Task name
Docker@2
Task version
2.214.0
Environment type (Please select at least one environment where you face this issue)
Azure DevOps Server type
dev.azure.com (formerly visualstudio.com)
Azure DevOps Server Version (if applicable)
No response
Operation system
Debian 11
Issue Description
I am getting this warning:
```
##[warning]No data was written into the file /home/vsts/work/_temp/task_outputs/build_****.txt
```
This was originally brought up in https://github.com/microsoft/azure-pipelines-tasks/issues/12470 but the issue was closed without it being resolved. Many users (including myself) are still facing this issue.
Task log
Additional info