
[feature] Provide multiple build() methods #9082

Open tiolan opened 3 years ago

tiolan commented 3 years ago

Assume one uses a cross-compilation toolchain from within Conan, and further that two different types of binaries (say native and cross) need to be built during that build. Example: protoc built for amd64 so it can be executed, and Protobuf built entirely for aarch64 for linking against another aarch64 executable. When the build generator (e.g. CMake or a special custom one) uses a cache, this can lead to trouble.

Proposal: It would be nice to have something like a build_native() method one can optionally override, in parallel to the already existing build(). This build_native() would need a dedicated self.build_native_folder, so that each build type is strictly separated. In the packaging phase one could then copy either from self.build_folder or from self.build_native_folder. One could also generalize this to N different build() methods, in case more than two independent builds are needed.
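For illustration, a rough sketch of what such an interface could look like; everything here is hypothetical, since build_native() and self.build_native_folder do not exist in Conan today:

def build(self):
    # cross build as today, e.g. the aarch64 libraries, in self.build_folder
    ...

def build_native(self):
    # hypothetical extra build method: native build, e.g. protoc for amd64,
    # in a dedicated self.build_native_folder
    ...

def package(self):
    # pick artifacts from the folder that matches each build type
    self.copy("*.a", src=self.build_folder, dst="lib")
    self.copy("protoc", src=self.build_native_folder, dst="bin")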

Known and working solutions for now:

1. In build(), first build the native binary, then use tools to delete the cache and copy the binary into the target directory, then build the cross binary (maybe the existing CMake wrapper already offers deleting the cache?).
2. In build(), create directories using tools for each required build and run the cross and native builds in those dedicated directories (but they are located in self.build_folder).

memsharded commented 3 years ago

Hi @tiolan

Some quick feedback:

> In build(), first build the native binary, then use tools to delete the cache and copy the binary into the target directory, then build the cross binary (maybe the existing CMake wrapper already offers deleting the cache?).

Don't do that. Don't handle the cache directly from recipes; that might break at any time and is undocumented, unexpected behavior. You can do whatever you want within the build_folder, but not with any folder outside of it.

> In build(), create directories using tools for each required build and run the cross and native builds in those dedicated directories (but they are located in self.build_folder).

I am not sure I understood what you are trying to achieve. To implement different builds, it is typically done with something like this:


def build(self):
    if some_cross_build_condition:
        self._build_cross()
    else:
        self._build_native()

def _build_native(self):
    # do some native build, e.g. calling CMake
    pass

def _build_cross(self):
    # do some cross build, e.g. calling autotools
    pass

There is nothing special to do with the folders: Conan automatically models and manages this, so every build will have its own package_id, will do a clean build, and will create an independent package. Maybe I am missing something?
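One possible way to express that condition (an assumption for illustration, not part of the original comment) is to dispatch on the recipe's own settings, which are already part of the package_id:

def build(self):
    # settings are part of the package_id, so each configuration
    # gets its own clean build and its own binary package
    if self.settings.arch in ("armv8", "armv7"):
        self._build_cross()
    else:
        self._build_native()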

tiolan commented 3 years ago

Ah, I need to be more precise. It is not about creating one package for native and another package for cross from the same recipe, based on a condition. It is about creating one package that contains e.g. a code generator built for amd64, but libraries built for aarch64. The code generator itself is built from source using the native toolchain and needs to be executed in order to generate source for the libraries. The libraries are built from (partly generated) source, using a cross toolchain.

The reason you need this is that when you use a cross-compiler, the code generator needs to be executed in the native environment (e.g. amd64), but the libraries are required for the cross environment (e.g. aarch64).

> Don't do that. Don't handle the cache directly from recipes; that might break at any time and is undocumented, unexpected behavior. You can do whatever you want within the build_folder, but not with any folder outside of it.

Yes, I meant deleting the cache inside the existing build folder, not outside it, and then doing another build inside the same build folder.

So to sum up, I want to execute build() twice (or, in general, more often, with different names) using different toolchains and separate build folders, but during one single conan create invocation for one single package.
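For reference, a rough sketch of that idea under assumptions (the toolchain file names, target names and generator invocation are made up), keeping each build in its own subdirectory of self.build_folder and driving CMake directly through self.run():

import os

def build(self):
    native_dir = os.path.join(self.build_folder, "native")
    cross_dir = os.path.join(self.build_folder, "cross")

    # 1) build the code generator with the native toolchain
    self.run("cmake -S %s -B %s -DCMAKE_TOOLCHAIN_FILE=native.cmake"
             % (self.source_folder, native_dir))
    self.run("cmake --build %s --target protoc" % native_dir)

    # 2) run the generator to produce sources for the libraries
    self.run("%s --cpp_out=%s my.proto"
             % (os.path.join(native_dir, "protoc"), self.source_folder))

    # 3) build the libraries with the cross toolchain
    self.run("cmake -S %s -B %s -DCMAKE_TOOLCHAIN_FILE=aarch64.cmake"
             % (self.source_folder, cross_dir))
    self.run("cmake --build %s" % cross_dir)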

memsharded commented 3 years ago

> It is about creating one package that contains e.g. a code generator built for amd64, but libraries built for aarch64.

Now I see what you mean.

This is a case that is not supported by Conan in that way, and it will not be. Binary packages are built for one configuration (compiler, architecture, etc.); having a package that contains binaries for different architectures would change too many things and break the core Conan model.

That doesn't mean that it is not possible to support this case. Something similar is already being done with packages like protobuf, which can be used as a build_require in the build context (let's say Windows, because we are cross compiling), and exactly the same protobuf package can be used as a regular require in the host context (let's say we are cross building from Windows to Linux; in that case the build context is Windows and the host context is Linux). The key points are:

- the same recipe produces one binary package per configuration, so one binary for the build architecture and another for the host architecture;
- the build and host profiles decide which of those binaries is resolved in each context, so a consumer can get the native tool (e.g. protoc) from the build context and the cross-built libraries from the host context.

This model is already proven to work and scale much better, while maintaining a good representation and modeling of the binaries and binary compatibility. I recommend trying it.
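For illustration, a sketch of that two-context approach (the reference, versions and profile names are just examples): the same protobuf reference is declared both as a build requirement, resolved with the build profile and providing a native protoc, and as a regular requirement, resolved with the host profile and providing the cross-built libraries.

from conans import ConanFile

class MyLibConan(ConanFile):
    name = "mylib"
    version = "1.0"
    settings = "os", "arch", "compiler", "build_type"
    requires = "protobuf/3.17.1"        # host context: aarch64 libraries to link
    build_requires = "protobuf/3.17.1"  # build context: amd64 protoc to execute

# consumed with two profiles, e.g.:
#   conan create . --profile:host=aarch64 --profile:build=default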

tiolan commented 3 years ago

Yes, I actually know the concept followed for protobuf and also for grpc.

In my opinion it has disadvantages. My understanding (and please correct me) is that you can then provide a build profile as well as a target profile when you invoke conan build.

In order to use conan build, you have to create a conanfile.py and describe how your application is to be built, not only how the packages are built. That forces you to build your application itself using Conan. IMHO a package manager should not force you to also build the application using the package manager, but should only provide packages.

You can of course run conan install and then your custom build engine like cmake --build, but then that build does not know anything about build dependencies and build profiles (as far as I understand).

You can work around this by creating a "super cross package" that is generated by copying the native binaries from the native package and the cross binaries from the cross package.

All in all, a very complex process for an easy task, as shown e.g. by the complexity of the protobuf and grpc recipes available in ConanCenter.

Anyhow, thank you for the good discussion, and feel free to close the request.

memsharded commented 3 years ago

> In order to use conan build, you have to create a conanfile.py and describe how your application is to be built, not only how the packages are built. That forces you to build your application itself using Conan. IMHO a package manager should not force you to also build the application using the package manager, but should only provide packages.

Not necessarily: you can have packages without a build() method at all, build outside of Conan if that is what you want, and use conan export-pkg to create the packages. Still, the model is sound: you have one package binary for one architecture and another package binary for the other architecture. You are not forced to use Conan to do the actual build; you can use it with much simpler recipes just to package things (there are many users out there using this flow).
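A minimal sketch of that "package only" flow, under assumptions about the folder layout and the package reference: the binaries are produced outside Conan and the recipe merely packages them, once per configuration/profile, via conan export-pkg.

from conans import ConanFile

class PrebuiltConan(ConanFile):
    name = "mylib"
    version = "1.0"
    settings = "os", "arch", "compiler", "build_type"

    def package(self):
        # pick up artifacts built outside of Conan (hypothetical layout)
        self.copy("*.h", dst="include", src="prebuilt/include")
        self.copy("*.a", dst="lib", src="prebuilt/lib")

    def package_info(self):
        self.cpp_info.libs = ["mylib"]

# one invocation per configuration, e.g.:
#   conan export-pkg . user/channel -pr=aarch64
#   conan export-pkg . user/channel -pr=default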

Still, the way to consume those binaries is much better modeled if you can clearly specify that you want one architecture in the build context and the other architecture in the host context. This might have advantages for deployment as well, and it scales better across multiple platforms and configurations.