Closed · DreadPirateShawn closed this issue 3 years ago
Interesting, one possible workaround seems to be resetting the Docker data, e.g.
sudo systemctl stop docker
sudo rm -rf /var/lib/docker
Before the reset, the image still appeared fine and ran fine -- but somehow wiping Docker data and reloading the image allowed Trivy to also scan it successfully.
We're facing the same issue for many repos now. Built from scratch, doesn't matter 😞
Facing same issue
Downgrading to 0.14.0 doesn't help; the issue still reproduces.
Doesn't the --timeout option fix the issue?
I forgot to report back but it did help, at least in our case!
This has started happening to us in the past week, since we enabled docker buildkit, and doubling our timeout from 5 to 10 minutes has not fixed it:
$ ./trivy \ # collapsed multi-line command
2021-01-18T20:29:20.751Z DEBUG Severities: HIGH,CRITICAL
2021-01-18T20:29:20.794Z DEBUG cache dir: /builds/.trivycache/
2021-01-18T20:29:20.803Z DEBUG There is no valid metadata file: unable to open a file: open /builds/.trivycache/db/metadata.json: no such file or directory
2021-01-18T20:29:20.803Z INFO Need to update DB
2021-01-18T20:29:20.803Z INFO Downloading DB...
2021-01-18T20:29:20.803Z DEBUG no metadata file
2021-01-18T20:29:21.233Z DEBUG release name: v1-2021011812
2021-01-18T20:29:21.233Z DEBUG asset name: trivy-light-offline.db.tgz
2021-01-18T20:29:21.234Z DEBUG file name doesn't match
2021-01-18T20:29:21.234Z DEBUG asset name: trivy-light.db.gz
2021-01-18T20:29:21.234Z DEBUG file name doesn't match
2021-01-18T20:29:21.234Z DEBUG asset name: trivy-offline.db.tgz
2021-01-18T20:29:21.234Z DEBUG file name doesn't match
2021-01-18T20:29:21.234Z DEBUG asset name: trivy.db.gz
2021-01-18T20:29:21.253Z DEBUG asset URL: https://github-production-release-asset-2e65be.s3.amazonaws.com/216830441/d2af6100-598a-11eb-8d31-4e86105d570d?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20210118%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20210118T202814Z&X-Amz-Expires=300&X-Amz-Signature=7089ad825c43e0f969c9f919e02b610b0accfbd3d904ef4b0449685803dadaed&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=216830441&response-content-disposition=attachment%3B%20filename%3Dtrivy.db.gz&response-content-type=application%2Foctet-stream
2021-01-18T20:29:22.327Z DEBUG Updating database metadata...
2021-01-18T20:29:22.328Z DEBUG DB Schema: 1, Type: 1, UpdatedAt: 2021-01-18 12:40:05.733050483 +0000 UTC, NextUpdate: 2021-01-19 00:40:05.733050083 +0000 UTC, DownloadedAt: 2021-01-18 20:29:22.328532436 +0000 UTC
2021-01-18T20:29:22.332Z DEBUG Vulnerability type: [os library]
2021-01-18T20:39:22.534Z FATAL error in image scan:
github.com/aquasecurity/trivy/internal/artifact.run
/home/circleci/project/internal/artifact/run.go:88
- failed analysis:
github.com/aquasecurity/trivy/pkg/scanner.Scanner.ScanArtifact
/home/circleci/project/pkg/scanner/scan.go:97
- analyze error:
github.com/aquasecurity/fanal/artifact/image.Artifact.Inspect
/go/pkg/mod/github.com/aquasecurity/fanal@v0.0.0-20201218050947-981a0510f9cb/artifact/image/image.go:50
- timeout:
github.com/aquasecurity/fanal/artifact/image.Artifact.inspect
/go/pkg/mod/github.com/aquasecurity/fanal@v0.0.0-20201218050947-981a0510f9cb/artifact/image/image.go:90
- context deadline exceeded
Disabling buildkit does allow it to pass... but that is not ideal
It works in my environment even with BuildKit. It looks like analyzing each layer takes a while. What about 15 or 20 minutes? Also, how large is your image?
@knqyf263 Thanks for the response. I cannot reproduce it locally; it only occurs on GitLab, so I have disabled buildkit for now.
For what it's worth, this tends to reproduce differently over time.
I'm running scans on about 100 images. The first time I do this, on a clean slate with no images present locally, they all work. 100 "docker pull", 100 "trivy image" scans, all scans function as expected.
Then a week later, I update my image inventory -- some changes, not many -- but when I attempt to re-run the same scan, images that worked a week ago now return the error reported above.
Image sizes vary from, let's say, 300 MB to 1.2 GB, ballpark? And many layers are shared between images, to some extent.
It depends on various loads such as CPU and memory. When you don't have the images locally, it also depends on the network situation. Please let me know if you increase the timeout and still face the same issue.
Increased the default timeout. https://github.com/aquasecurity/trivy/pull/842
Just in case Google brings someone here, here's how I solved it: call Trivy with the --timeout argument, e.g. --timeout 10m for 10 minutes (currently the default is 5 minutes).
I'm facing the same issue now. The --timeout flag doesn't do anything when you are using Trivy in client/server mode: on the server you can't define a timeout, and when you define it on the client, it is just ignored.
Also, I don't think it's a timeout issue: when I try to scan the same image several times, it works. It's only the first time that it always fails. That's very strange behavior.
@Alexc0007 I had the same issue, but even though I'm also using client/server mode, the --timeout 10m flag did have an impact on my side (v0.28.1): the scan returned successfully after 8 minutes, while it terminated after 5 minutes on previous runs without the flag. The fact that it works after several runs with the same settings may be related to caching behaviour, since Trivy seems to cache results for each Docker layer individually. If you have, say, 2 layers, each of which takes 4 minutes of scan time, the run potentially fails the first time after exceeding the 5m timeout, but may succeed on a subsequent run, since layer #1 has already been successfully cached on the first run.
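That per-layer caching hypothesis is easy to model with made-up numbers: two layers at 4 minutes each against a 5-minute timeout fail on the first run, but the first layer's cached result lets a retry finish in time. A toy simulation (hypothetical costs; not Trivy's actual scheduler):

```go
package main

import "fmt"

const timeoutMin = 5 // assumed timeout budget, in minutes

// simulateRun scans layers in order; each finished layer is cached and
// skipped on later runs. The run aborts once the cumulative time exceeds
// the timeout, but layers finished before the abort stay cached.
func simulateRun(layerCost []int, cache map[int]bool) bool {
	elapsed := 0
	for i, cost := range layerCost {
		if cache[i] {
			continue // cached layer: negligible cost
		}
		elapsed += cost
		if elapsed > timeoutMin {
			return false // "context deadline exceeded"
		}
		cache[i] = true // this layer's result survives a later failure
	}
	return true
}

func main() {
	layers := []int{4, 4} // minutes per layer (hypothetical)
	cache := map[int]bool{}
	fmt.Println(simulateRun(layers, cache)) // false: 4+4 exceeds 5
	fmt.Println(simulateRun(layers, cache)) // true: layer 0 already cached
}
```

Under these assumptions, the "fails first, works on retry" pattern reported above is exactly what you'd expect.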
2022-06-21T13:02:20.182+0530 DEBUG Severities: UNKNOWN,LOW,MEDIUM,HIGH,CRITICAL
2022-06-21T13:02:20.186+0530 DEBUG cache dir: /var/lib/jenkins/.cache/trivy
2022-06-21T13:02:20.186+0530 DEBUG DB update was skipped because DB was downloaded during the last hour
2022-06-21T13:02:20.186+0530 DEBUG DB Schema: 1, Type: 1, UpdatedAt: 2022-06-21 00:51:00.975622999 +0000 UTC, NextUpdate: 2022-06-21 06:51:00.975622199 +0000 UTC, DownloadedAt: 2022-06-21 06:45:46.635277088 +0000 UTC
2022-06-21T13:02:20.186+0530 DEBUG Vulnerability type: [os library]
2022-06-21T13:02:20.191+0530 DEBUG Image ID: sha256:9b0a3304f96138063d896726ff847f37899a3dd78066723ea95dbffa6e47f893
2022-06-21T13:02:20.191+0530 DEBUG Diff IDs: [sha256:ad6562704f3759fb50f0d3de5f80a38f65a85e709b77fd24491253990f30b6be sha256:0ebab6089672f2a83ce623bfcdaecd0038f10931862ca08e3675321005937ec5 sha256:e0b4d35690c20fa56f6246d562be0c33b3ca9558f7a8bd8b916ead3a01ac9fe9 sha256:ab8cf9add6ab0bda8d9d4c7b87ec322ecd65d6e57604dc3d1aa75dc8ed584e5b]
2022-06-21T13:02:20.193+0530 DEBUG Missing image ID: sha256:9b0a3304f96138063d896726ff847f37899a3dd78066723ea95dbffa6e47f893
2022-06-21T13:02:20.194+0530 DEBUG Missing diff ID: sha256:ab8cf9add6ab0bda8d9d4c7b87ec322ecd65d6e57604dc3d1aa75dc8ed584e5b
2022-06-21T13:02:22.592+0530 DEBUG Analysis error: unable to parse etc/ca-certificates/update.d/docker-openjdk: failed to parse etc/ca-certificates/update.d/docker-openjdk: unrecognized executable format
2022-06-21T13:02:22.595+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.compiler/ASSEMBLY_EXCEPTION: failed to parse usr/local/openjdk-11/legal/java.compiler/ASSEMBLY_EXCEPTION: EOF
2022-06-21T13:02:22.596+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.compiler/ADDITIONAL_LICENSE_INFO: failed to parse usr/local/openjdk-11/legal/java.compiler/ADDITIONAL_LICENSE_INFO: EOF
2022-06-21T13:02:22.596+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.datatransfer/ADDITIONAL_LICENSE_INFO: failed to parse usr/local/openjdk-11/legal/java.datatransfer/ADDITIONAL_LICENSE_INFO: EOF
2022-06-21T13:02:22.596+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.compiler/LICENSE: failed to parse usr/local/openjdk-11/legal/java.compiler/LICENSE: EOF
2022-06-21T13:02:22.596+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.datatransfer/LICENSE: failed to parse usr/local/openjdk-11/legal/java.datatransfer/LICENSE: EOF
2022-06-21T13:02:22.596+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.desktop/ADDITIONAL_LICENSE_INFO: failed to parse usr/local/openjdk-11/legal/java.desktop/ADDITIONAL_LICENSE_INFO: EOF
2022-06-21T13:02:22.596+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.datatransfer/ASSEMBLY_EXCEPTION: failed to parse usr/local/openjdk-11/legal/java.datatransfer/ASSEMBLY_EXCEPTION: EOF
2022-06-21T13:02:22.596+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.desktop/ASSEMBLY_EXCEPTION: failed to parse usr/local/openjdk-11/legal/java.desktop/ASSEMBLY_EXCEPTION: EOF
2022-06-21T13:02:22.596+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.desktop/LICENSE: failed to parse usr/local/openjdk-11/legal/java.desktop/LICENSE: EOF
2022-06-21T13:02:22.597+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.logging/ADDITIONAL_LICENSE_INFO: failed to parse usr/local/openjdk-11/legal/java.logging/ADDITIONAL_LICENSE_INFO: EOF
2022-06-21T13:02:22.597+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.logging/ASSEMBLY_EXCEPTION: failed to parse usr/local/openjdk-11/legal/java.logging/ASSEMBLY_EXCEPTION: EOF
2022-06-21T13:02:22.597+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.instrument/ASSEMBLY_EXCEPTION: failed to parse usr/local/openjdk-11/legal/java.instrument/ASSEMBLY_EXCEPTION: EOF
2022-06-21T13:02:22.597+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.instrument/ADDITIONAL_LICENSE_INFO: failed to parse usr/local/openjdk-11/legal/java.instrument/ADDITIONAL_LICENSE_INFO: EOF
2022-06-21T13:02:22.597+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.management/ADDITIONAL_LICENSE_INFO: failed to parse usr/local/openjdk-11/legal/java.management/ADDITIONAL_LICENSE_INFO: EOF
2022-06-21T13:02:22.597+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.instrument/LICENSE: failed to parse usr/local/openjdk-11/legal/java.instrument/LICENSE: EOF
2022-06-21T13:02:22.598+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.logging/LICENSE: failed to parse usr/local/openjdk-11/legal/java.logging/LICENSE: EOF
2022-06-21T13:02:22.598+0530 DEBUG Analysis error: unable to parse usr/local/openjdk-11/legal/java.management/ASSEMBLY_EXCEPTION: failed to parse usr/local/openjdk-11/legal/java.management/ASSEMBLY_EXCEPTION: EOF
VERSION: 0.18.3
I'm struggling with the same issue. Running Trivy with --debug on a Debian-based image yields:
2024-01-19T09:55:25.922Z DEBUG Base Layers: [sha256:aa904f36746c93c02f6a8274517544fad89782f989cc95e46d3cd8b4977dbdf8]
2024-01-19T09:55:25.922Z DEBUG Missing image ID in cache: sha256:cf0389f5baf0c917b9aad2921c01b364f3f01c1d8db7a7cf01fce67e424a8e6b
2024-01-19T09:55:25.922Z DEBUG Missing diff ID in cache: sha256:5bb1de08f5af5c52be6cfcc1f04a43e58dae316c7568248e58c5aa3cdcb93f27
2024-01-19T09:55:25.922Z DEBUG Missing diff ID in cache: sha256:a876dfc51caafcf3b377e201add8b706b82ffa79064901443c48feebb2a6aeb7
2024-01-19T09:55:25.922Z DEBUG Missing diff ID in cache: sha256:aa904f36746c93c02f6a8274517544fad89782f989cc95e46d3cd8b4977dbdf8
2024-01-19T09:55:25.922Z DEBUG Missing diff ID in cache: sha256:0dfa23fffa4172cf4a9768b4c783b56081013c5ad0a7391070c4d7f55e93ab61
2024-01-19T09:55:25.922Z DEBUG Missing diff ID in cache: sha256:11f55509a8af38f0b1151af8f220f3db1d0c8c3792ac9b67b59c88b65864c58b
2024-01-19T09:55:26.146Z DEBUG Skipping directory: dev
2024-01-19T09:55:26.148Z DEBUG Missing diff ID in cache: sha256:70dab0205baf825509d536d32386cccbd625a9078ac5d8dc96dfba52d0788018
2024-01-19T09:55:26.548Z DEBUG Skipping directory: proc
2024-01-19T09:55:26.548Z DEBUG Skipping directory: sys
2024-01-19T09:55:30.549Z DEBUG Missing diff ID in cache: sha256:8b9310b8d9d1ee4bc5d0e3a43baa5c16dab031cc09753eece44b3a4ceecb18cb
2024-01-19T09:55:31.652Z DEBUG Missing diff ID in cache: sha256:830c570a8997b9ff2891d4618564fee176f4692359ea912b187e5b74b07b7e16
2024-01-19T09:55:32.246Z DEBUG Missing diff ID in cache: sha256:a3a5aea6eaf41e9c56e753bdce98768630541c613383e454bb4479b47fe234bf
2024-01-19T09:55:32.845Z DEBUG Missing diff ID in cache: sha256:2c6f6a723617e0d94a394ce2f635ecc3c8b8f646ff3f4c417fd4886f936d3cbf
2024-01-19T09:56:50.249Z DEBUG Missing diff ID in cache: sha256:55b1608c3799ca9b3143f21d7025dd67130cab2e44a1fd34d4fac297f309d44b
2024-01-19T09:57:12.247Z DEBUG Unable to parse "var/lib/dpkg/available" file: file open error: open var/lib/dpkg/available: file does not exist
2024-01-19T09:57:13.445Z DEBUG Missing diff ID in cache: sha256:1ca4912ebaf51b91cfc757049654f2f6645be7c0a20237aeb585dcb8220842f4
2024-01-19T09:59:48.047Z DEBUG Unable to parse "var/lib/dpkg/available" file: file open error: open var/lib/dpkg/available: file does not exist
The "Unable to parse" messages seem similar to what @adapasuresh has been seeing. Interestingly, the file Trivy complains about is actually present in all image layers! (I based my Dockerfile on buildpack-deps:bookworm and double-checked.) Something is definitely off here…
Description
Scanning some images consistently results in timeout and "context deadline exceeded" failure, seemingly with no pattern.
That is -- for a given image docker.prod.skytap.com:5000/skytap/continuous_queries:*, scanning docker.prod.skytap.com:5000/skytap/continuous_queries:b835ff37ac1b8304d57060e12b917561387dd2f0 results in timeout / context deadline exceeded, but scanning the prior tag (barely any difference), docker.prod.skytap.com:5000/skytap/continuous_queries:4f9ee82b3f59792480212d18a4ea283b62c87879, will succeed just fine. Both images have been docker pulled beforehand.
What did you expect to happen?
Trivy should successfully scan the image, or give a clearer actionable reason for failure.
What happened instead?
Trivy timed out and reported "context deadline exceeded", with seemingly no rhyme or reason.
Output of run with -debug:
Output of trivy -v:
Additional details (base image name, container registry info...):
Note that other "context deadline exceeded" tickets exist, but none of them match the specific details of this error.