conan-io / conan-package-tools

Conan Package Tools. Helps with massive package creation and CI integration (Travis CI, Appveyor...)
MIT License

Execute post-processing callbacks after the packaging #235

Open matlo607 opened 6 years ago

matlo607 commented 6 years ago

Today in my company, we use Conan to build and generate packages. However, we don't consume them in production with Conan: we download them from a repository manager such as Sonatype Nexus or Artifactory, and we store these packages following an internal layout convention.

I frequently face the need to repackage Conan's artifacts into our internal formats once they are built, in order to upload them to our repository manager for consumption.

You already offer the possibility to upload recipes and artifacts to multiple Conan servers. The idea would be to give the user a way to register callbacks that run after the artifacts are built, bridging the gap between Conan artifacts and an abstract target format consumed by various package managers.

The goal is to avoid the manual repackaging steps. The idea could be extended to creating rpm or deb archives, etc., and uploading them to a mirror via ssh, ftp, ...

Three kinds of plugins seem to emerge from the use cases I have seen so far: archivers (repackage the built artifacts into another format: zip, tar.gz, rpm, ...), uploaders (push the results to external servers: Nexus, FTP, ...), and Conan remote uploaders (upload recipes and packages to one or more Conan servers).

Example:

#!/usr/bin/env python

from cpt.packager import ConanMultiPackager
from cpt.plugins import archiver, uploader

class MyNexusServer(uploader.NexusPlugin):

    host = "..."
    port = "..."
    repo_id = "..."
    repo_name = "..."

    def upload_info(self, pkg_info, settings, options):
        # compute the Maven-style coordinates for this package/configuration
        self.version = pkg_info.version
        self.artifactId = "{}-{}_{}".format(pkg_info.name, settings.os, settings.arch)
        self.groupId = "..."
        self.classifier = settings.compiler

class MyFtpServer(uploader.FtpPlugin):
    host = "..."
    login = "..."
    password = "..."

if __name__ == '__main__':
    builder = ConanMultiPackager(build_policy="missing", username="mlongo", channel="testing")
    builder.add(settings={"arch": "x86_64", "build_type": "Debug"})
    builder.add(settings={"arch": "x86_64", "build_type": "Release"})
    builder.run()
    callbacks1 = {
        "archivers": [
            archiver.ZipPlugin(), # use the default zip archiver with a default naming
            archiver.TarGzPlugin()
        ],
        "uploaders": [
            MyNexusServer(),
            MyFtpServer()
        ]
    }
    # recursive: apply these callbacks to the package and its dependencies
    builder.post(callbacks=callbacks1, recursive=True)
    callbacks2 = {
        "archivers": [
            archiver.RpmPlugin()  # create an rpm package with a default naming
        ],
        "uploaders": [
            MyFtpServer()
        ]
    }
    builder.post(callbacks=callbacks2, recursive=False)

    # we could maybe merge the existing multi conan server uploading with this
    callbacks3 = {
        "uploaders": [
            # by default, without argument provided, upload to all conan servers
            uploader.ConanRemotes(servers=["conan-server-1", "conan-server-2"])
        ]
    }
    builder.post(callbacks=callbacks3, recursive=True)

Today, what is your advice for deploying libraries built with Conan to production? What do you think about the above proposal, which attempts to solve the mentioned issues?

lasote commented 6 years ago

What do you think about the above proposal, which attempts to solve the mentioned issues?

I think your proposal is out of the scope of CPT. The good news is that you shouldn't need to implement anything in CPT to achieve your goals.

Today, what is your advice for deploying libraries built with Conan to production?

My recommendation is to manage the Conan packages and upload them to a Conan server (Artifactory, for example). Then, if you have to generate other file formats, a following CI stage can use the deploy feature or the conan imports command to import all the needed libraries/executables into a local directory and package them with another tool.
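
For illustration, a minimal sketch of that imports-based stage, assuming a Conan 1.x client; the reference mylib/1.0@mlongo/testing and the staging paths are hypothetical placeholders, not part of CPT:

#!/usr/bin/env python
# conanfile.py consumed only by the repackaging CI stage.
from conans import ConanFile

class RepackageConan(ConanFile):
    settings = "os", "arch", "compiler", "build_type"
    requires = "mylib/1.0@mlongo/testing"  # hypothetical package reference

    def imports(self):
        # Copy the dependencies' binaries and headers into a local
        # "staging" tree that external tooling can archive afterwards.
        self.copy("*.dll", dst="staging/bin", src="bin")
        self.copy("*.so*", dst="staging/lib", src="lib")
        self.copy("*.a", dst="staging/lib", src="lib")
        self.copy("*.h", dst="staging/include", src="include")

Running conan install . --build=missing executes imports() and fills ./staging; from there, the CI scripts can produce a zip, rpm, or deb out of that tree and push it to Nexus or an FTP mirror, keeping the repackaging logic entirely outside of CPT.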