Closed brikendr closed 4 days ago
This question is specifically directed towards the artifactCleanup
plugin.
The above image showcases the bug this issue is trying to address. Conan artifacts need to be treated as a whole when picked up by the cleanup plugin, not file-by-file; otherwise the whole artifact gets corrupted.
The `0.2.15` artifact is corrupted, since the `index.json` and `.timestamp` files have been removed by the cleanup plugin. The `0.2.16` artifact shows how the whole Conan artifact should look. I guess that if the artifactCleanup plugin does not support checking the package as a whole before deleting (in this case, checking for the `index.json` file at the root path of the artifact), then we cannot use it to clean up Conan artifacts.
Looking forward to a clarification on this.
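The whole-package check described above could be sketched roughly like this. This is a hypothetical helper, not plugin code; the on-disk layout (an `index.json` at the root of each Conan package revision) is assumed from the description in this issue:

```python
from pathlib import Path
from typing import Optional, Set

def find_package_root(file_path: Path) -> Optional[Path]:
    """Walk up from a file to the nearest directory containing index.json.

    In the layout described above, index.json sits at the root of each
    Conan package revision, so that directory is the unit that must be
    kept or deleted as a whole. Returns None if no index.json is found.
    """
    for parent in file_path.parents:
        if (parent / "index.json").exists():
            return parent
    return None

def safe_to_delete(file_path: Path, deletable_roots: Set[Path]) -> bool:
    """Allow deletion only when the file's whole package root was
    independently marked deletable as a unit."""
    root = find_package_root(file_path)
    if root is None:
        # Not inside a recognised Conan package: be conservative, keep it.
        return False
    return root in deletable_roots
```

With a check like this, `index.json` and `.timestamp` could never be removed on their own while the rest of the package survives.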
Yes, this is a problem the tool generally has for multifile type packages (like Docker). Potential solutions here would be to not use it in those packages, introduce package-type intelligence, or at the very least document the packages it works well for (and those it does not).
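The "package-type intelligence" idea could start as simply as a lookup table of metadata files that must never be deleted on download-count grounds. The patterns below are illustrative guesses drawn from this thread, not an authoritative list:

```python
import fnmatch

# Illustrative only: metadata files per repository type that have a zero
# download count by design and should be excluded from count-based cleanup.
PROTECTED_PATTERNS = {
    "conan":  ["index.json", ".timestamp"],
    "npm":    ["package.json"],
    "docker": ["manifest.json"],
}

def is_protected(repo_type: str, filename: str) -> bool:
    """Return True if the file is package metadata for this repo type."""
    return any(fnmatch.fnmatch(filename, pat)
               for pat in PROTECTED_PATTERNS.get(repo_type, []))
```

Even this crude allow-list would have prevented the corruption shown above, at the cost of maintaining the table per package type.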
We also have severe problems getting our repository sizes under control, and recently realised that the cleanup plugin is corrupting Conan repositories (similar to what we discovered two years ago with Docker images, when we had to recover a lot of corrupted images).
In my opinion, Artifactory should come up with a built-in mechanism for cleaning up artifacts (like Nexus does) and ensure that the plugin works on supported repositories only, without causing any damage on unsupported repository types.
I looked a bit deeper into the other repository types we use, and I believe the cleanup plugin will also corrupt the following repository types:

- npm: the `.npm` folder contains a `package.json` for each package. This file includes all the revisions of an npm package published to the repository. It also has a download count of zero and would be deleted by the plugin.
- CocoaPods: in the `.specs` folder in the repository root, each package has a `.podspec` file which contains metadata about the package. The download count of this file is zero even if the corresponding `.tar.gz` file was already downloaded by the `pod` CLI tool.
- Conda: `current_repodata.json` and `repodata.json` were downloaded, but their download count does not match the download count of the artifacts themselves. So it might happen that these files also end up with a download count of zero and could potentially be removed by the cleanup plugin.
- Debian: `Packages` was not downloaded, but the other files `Packages.bz2` and `Packages.gz` were. Not sure whether this could cause any issues on a Debian repository.

This one seems to be safe:

- PyPI: the `.pypi` folder inside the repository root contains a `simple.html` file which gets updated whenever a new version or package is deployed.

Hi,
Thank you for your inquiry. Seems like there is a workaround already present in #47 and #52.
If you still encounter issues with the cleanup plugin, we recommend moving to our new product feature designed to handle cleanup tasks more effectively. This feature has reached General Availability (GA) and offers enhanced capabilities for maintaining and managing artifacts.
You can find more information and guidance on using this new feature in our Cleanup policies.
We’ll close this for now, but if you have any further questions, feel free to reopen it!
Thanks
Hi,
I was just wondering how this plugin handles Conan repository artifacts. It seems that sometimes the plugin deletes a metadata file called `index.json` on a specific uploaded Conan artifact. When this file is deleted, Conan is not able to download the artifact any longer!
My question would be: how do you determine whether a Conan artifact directory should be deleted or not?
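One way to answer that question is to decide at the package-directory level, using the most recent download time of any file inside it, so that metadata files with zero downloads cannot trigger deletion on their own. A hypothetical sketch of that rule, not the plugin's actual logic:

```python
import time
from typing import Iterable, Optional

def package_is_stale(last_downloaded: Iterable[Optional[float]],
                     max_age_seconds: float,
                     now: Optional[float] = None) -> bool:
    """Treat a Conan package directory as one unit: it is stale only if
    every file that was ever downloaded is older than the cutoff.

    Files that were never downloaded (None) are ignored rather than
    counted as stale, so metadata like index.json cannot cause the
    package to be deleted on its own.
    """
    now = time.time() if now is None else now
    times = [t for t in last_downloaded if t is not None]
    if not times:
        return False  # nothing was ever downloaded: keep, to be safe
    return now - max(times) > max_age_seconds
```

The key design choice is that the decision is made once per package directory, never per file, which is exactly the guarantee the thread above says the plugin lacks.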