goharbor / harbor

An open source trusted cloud native registry project that stores, signs, and scans content.
https://goharbor.io
Apache License 2.0

Inconsistent Behavior When Triggering Trivy Scan via Harbor API #20952

Open kon-foo opened 2 weeks ago

kon-foo commented 2 weeks ago

Expected behavior and actual behavior: Expected behavior: When sending a request to initialize a Trivy scan of an artifact through the Harbor API, I expect the scan to consistently either succeed or fail, as the configuration of the artifact and Trivy scanner do not change between requests.

Actual behavior: Some attempts succeed, while others return a 400 Bad Request error with the following message:

```
The configured scanner Trivy does not support scanning artifact with mime type application/vnd.docker.distribution.manifest.v2+json
```

While the inconsistency might stem from our configuration or environment, the error message itself cannot be correct: the mime type does not change between requests. Screenshots showing both a successful and a failed request are attached to this issue.

Steps to reproduce the problem:

  1. Set up the Harbor registry (v2.10.2) and Trivy (goharbor/trivy-adapter-photon:v2.10.2) in a Kubernetes cluster.
  2. Attempt to initialize a scan of an artifact via the Harbor API by sending a POST request to the /scan endpoint.
  3. Observe the inconsistent behavior.
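For reference, the request in step 2 can be sketched as follows. This is a minimal sketch using only the standard library; the base URL, project, repository, reference, and credentials are placeholders, and nested repository names may need additional URL encoding per Harbor's API conventions.

```python
import base64
import urllib.parse
import urllib.request

def scan_request(base_url, project, repository, reference, user, password):
    # Harbor v2 API scan endpoint:
    # POST /api/v2.0/projects/{project}/repositories/{repository}/artifacts/{reference}/scan
    url = (f"{base_url}/api/v2.0/projects/{project}"
           f"/repositories/{urllib.parse.quote(repository, safe='')}"
           f"/artifacts/{reference}/scan")
    req = urllib.request.Request(url, method="POST")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req  # pass to urllib.request.urlopen() to send
```

A 202 response means the scan was accepted; the inconsistent 400 described above comes back from this same request without any change to the artifact or scanner configuration.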

Versions:

Additional context:

400 Response

harborapi_scan_400_blurred

202 Response

harborapi_scan_202_blured

reasonerjt commented 1 week ago

@kon-foo Could you please reproduce the issue (both success and failure in scan) and collect the logs of nginx, harbor-core, harbor-jobservice and trivy-adapter pods?

Could you please also let me know how Harbor was deployed in your env? What makes you feel it may be an issue with your configuration and environment?

kon-foo commented 1 week ago

@reasonerjt Thanks for looking into this. Harbor was deployed using this Helm chart. These are the images in use:

| Component | Image |
| --- | --- |
| harbor-core | goharbor/harbor-core:v2.10.2 |
| harbor-database | goharbor/harbor-db:v2.10.2 |
| harbor-jobservice | goharbor/harbor-jobservice:v2.10.2 |
| harbor-portal | goharbor/harbor-portal:v2.10.2 |
| harbor-redis | goharbor/redis-photon:v2.10.2 |
| harbor-registry | goharbor/registry-photon:v2.10.2 & goharbor/harbor-registryctl:v2.10.2 |
| harbor-trivy | goharbor/trivy-adapter-photon:v2.10.2 |

Here are the logs:

This time I actually had to hit the API ~40 times before getting a 202. Core fails to ping the scanner 39 times:

```
2024-10-01T05:34:15Z [ERROR] [/controller/scanner/base_controller.go:299][error="v1 client: get metadata: Get "http://release-registry-harbor-trivy:8080/api/v1/metadata": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" requestID="27fa31aade1ae4e2a9e422a198fd0544"]: failed to ping scanner
2024-10-01T05:34:15Z [ERROR] [/controller/scanner/base_controller.go:265]: api controller: get project scanner: scanner controller: ping: v1 client: get metadata: Get "http://release-registry-harbor-trivy:8080/api/v1/metadata": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
```
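Until the root cause is found, a client-side workaround is to retry on this spurious 400. A hedged sketch, where `do_post` is a placeholder for whatever issues the actual scan request and the 400/202 status codes match the responses shown in this issue:

```python
import time

def post_with_retry(do_post, attempts=5, delay=2.0):
    """Call do_post() (which returns an HTTP status code) until it succeeds.

    Retries only on 400, since in this issue the 400 appears to mask a
    transient scanner-ping timeout rather than a real mime-type mismatch.
    """
    status = None
    for i in range(attempts):
        status = do_post()
        if status == 202:            # scan accepted
            return status
        if status == 400 and i < attempts - 1:
            time.sleep(delay)        # give the scanner ping time to recover
            continue
        return status                # any other status: fail fast
    return status
```

This obviously only papers over the problem; the underlying ping timeout still needs diagnosing.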

Before finally succeeding:

```
2024-10-01T05:35:45Z [INFO] [/server/middleware/security/robot.go:71][requestID="9553b0d0-4924-4218-93fc-e4d8891358f3"]: a robot security context generated for request GET /service/token
2024-10-01T05:35:53Z [INFO] [/pkg/task/dao/execution.go:471]: scanned out 1 executions with outdate status, refresh status to db
2024-10-01T05:35:53Z [INFO] [/pkg/task/dao/execution.go:512]: refresh outdate execution status done, 1 succeed, 0 failed
```
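To separate a networking problem from a Harbor bug, the failing ping can be reproduced directly from a pod inside the cluster. A sketch, assuming the default service URL taken from the logs above and a short client timeout:

```python
import urllib.error
import urllib.request

def ping_scanner(url="http://release-registry-harbor-trivy:8080/api/v1/metadata",
                 timeout=5.0):
    """Return the HTTP status, or None if the adapter did not answer in time."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status  # 200 means the adapter responded
    except (urllib.error.URLError, TimeoutError):
        return None  # a timeout here points at networking, not mime types
```

If this intermittently returns None from inside the cluster, the problem is between core and the trivy-adapter service, not in the artifact's mime type.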

> What makes you feel it may be an issue with your configuration and environment?

I only added that to emphasize that even if this does stem from our configuration or environment, I would still consider it unwanted behavior: the mime type is not the problem, and the error message is misleading. I wasn't the one who deployed Harbor in our cluster, and I'm not aware of any unusual configuration, but the failing scanner pings make me suspect a networking or permissions misconfiguration.

Thanks for your help and let me know if you need further information.