Hi team: we believe our PostgreSQL database artifact contents are out of sync with what's actually in the S3 bucket. Our S3 bucket size is 143TB. From our users' standpoint, the Harbor UI, Swagger, and docker push/pull are all functioning as expected, but there's no way we have 143TB of active projects/repos/artifacts.
Is there a tool or method available that we can use to identify disconnects between the S3 content and the database content? If no such tool or method exists, can you suggest how you would go about finding the disconnects?
Does the S3 bucket contain image scan results? If so, that could explain our growth.
What does the 'Artifacts_trash' table contain? Are its records input to any Harbor process or job?
In the 'blob' table, we have ~6,600 records with status 'delete'. What are those records? Are they input to any Harbor process or job?
This is rather unusual, but I think it can happen, for example, when GC fails to actually delete the files from storage.
No such tool exists; IMO it would have to be created so that it iterates over both Harbor (the DB) and S3 and finds the layers, blobs, and manifests that are on S3 but not in Harbor.
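A minimal sketch of such a tool, under a few assumptions not confirmed in this thread: that the registry lays blobs out under keys like `docker/registry/v2/blobs/<algo>/<xx>/<digest-hex>/data`, and that Harbor's `blob` table has a `digest` column holding values like `sha256:<hex>`. The bucket name, DB connection string, and table/column names below are placeholders you would adapt to your deployment.

```python
def digest_from_key(key):
    """Extract a digest like "sha256:<hex>" from a registry blob key.

    Expects keys shaped like ".../blobs/sha256/ab/abcdef.../data" (an
    assumption about the storage layout); returns None for other keys.
    """
    parts = key.split("/")
    if len(parts) >= 4 and parts[-1] == "data":
        return f"{parts[-4]}:{parts[-2]}"  # "<algo>:<digest-hex>"
    return None


def orphaned_digests(s3_keys, db_digests):
    """Digests present on S3 but unknown to the Harbor database."""
    on_s3 = {d for d in (digest_from_key(k) for k in s3_keys) if d}
    return sorted(on_s3 - set(db_digests))


if __name__ == "__main__":
    # Hypothetical wiring: list the bucket with boto3 and query the DB
    # with psycopg2. Names and credentials here are illustrative only.
    import boto3
    import psycopg2

    s3 = boto3.client("s3")
    pages = s3.get_paginator("list_objects_v2").paginate(
        Bucket="my-harbor-bucket", Prefix="docker/registry/v2/blobs/"
    )
    keys = (obj["Key"] for page in pages for obj in page.get("Contents", []))

    conn = psycopg2.connect("dbname=registry user=harbor")
    with conn.cursor() as cur:
        cur.execute("SELECT digest FROM blob")
        db_digests = [row[0] for row in cur.fetchall()]

    for digest in orphaned_digests(keys, db_digests):
        print(digest)
```

Anything this prints is a candidate orphan; before deleting, you would still want to cross-check manifests and layer references, since a blob absent from the `blob` table might still be referenced elsewhere.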
I am not sure; we had some functionality that stored data in S3, but you would see it in the bucket as a top-level prefix next to docker.
Thanks.