conan-io / cmake-conan

CMake wrapper for conan C and C++ package manager
MIT License

How can a toolchain be downloaded and used to build other dependencies of a CMake project? #660

Open strimble25 opened 1 month ago

strimble25 commented 1 month ago

I am working on a CMake project with multiple target configurations and have successfully migrated all of the dependency management to Conan 2.0 using conan_provider.cmake, except for the toolchain(s) and associated sysroot(s).

See below the existing setup that works: running cmake --preset native has cmake-conan correctly fetch and build protobuf with the native gcc, and running cmake --preset target builds it with the custom toolchain. The problem with this setup is that the toolchain and sysroot sit at fixed locations, but they are being actively developed and should be managed as dependencies of the project. Given that the toolchain and sysroot have been made available as packages on a private Conan repository, is it possible to adjust the setup below to properly include the toolchain and sysroot in the build before all other dependencies are built? To be clear, the toolchain and sysroot do not need to be built themselves; they simply need to be downloaded and made available in some way, such as via find_package/find_program or environment variables.

I tried adding tool_requires = "custom_toolchain/1.0.0" to conanfile.py and then find_program(TOOLCHAIN_GCC toolchain-gcc REQUIRED) to custom_toolchain.cmake (sketched after the files below), but this runs into a chicken-and-egg scenario: CMake is trying to find a compiler that Conan has not yet made available, while cmake-conan needs the compiler to be set in order to detect the host settings correctly before fetching the dependencies.

conanfile.py:

from conan import ConanFile
from conan.tools.cmake import CMakeDeps

class MyRecipe(ConanFile):
  settings = "os", "compiler", "build_type", "arch"
  requires = ["protobuf/5.27.0"]

  def generate(self):
    deps = CMakeDeps(self)
    deps.generate()

CMakePresets.json:

...
"configurePresets":
[
  {
    "name": "native"
    "cacheVariables":
    {
      "CMAKE_PROJECT_TOP_LEVEL_INCLUDES": "conan_provider.cmake"
    }
  },
  {
    "name": "target"
    "cacheVariables":
    {
      "CMAKE_TOOLCHAIN_FILE": "custom_toolchain.cmake",
      "CMAKE_PROJECT_TOP_LEVEL_INCLUDES": "conan_provider.cmake"
    }
  }
]
...

custom_toolchain.cmake:

...
set(CMAKE_C_COMPILER   /path/to/toolchain/toolchain-gcc)
set(CMAKE_SYSROOT      /path/to/sysroot/)
...
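
For reference, the attempted (non-working) change mentioned above looked roughly like this; it is only a sketch of what I tried, with custom_toolchain/1.0.0 being the package on our private remote and the rest of the recipe unchanged:

conanfile.py (attempted change):

from conan import ConanFile
from conan.tools.cmake import CMakeDeps

class MyRecipe(ConanFile):
  settings = "os", "compiler", "build_type", "arch"
  requires = ["protobuf/5.27.0"]
  # Attempted addition: pull the toolchain in as a build tool dependency
  tool_requires = ["custom_toolchain/1.0.0"]

  def generate(self):
    deps = CMakeDeps(self)
    deps.generate()

plus the find_program(TOOLCHAIN_GCC toolchain-gcc REQUIRED) call in custom_toolchain.cmake, which is where the chicken-and-egg problem appears.
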
memsharded commented 1 month ago

Hi @strimble25

Thanks for your question.

This cmake-conan integration, implemented as a CMake dependency provider, is targeted at the main use case for which CMake dependency providers were designed: regular library dependencies.

The problem with tool_requires packages, which seems to be what you want to use for your toolchains, is that they are not injected via find_package(); very often they rely on environment variables instead. The conan install that is executed as a result of the first find_package() call is therefore mostly unable to inject any environment that might be needed even earlier, before that first find_package() is reached.
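
To illustrate, here is a hypothetical sketch of how a toolchain tool_requires recipe typically publishes its compiler (the names and paths are assumptions, not necessarily how your custom_toolchain recipe looks): the information is exposed as build environment variables, which consumers receive through the generated conanbuild environment script, not through anything that a find_package() call could locate once CMake is already configuring.

import os
from conan import ConanFile

class CustomToolchainConan(ConanFile):
  name = "custom_toolchain"
  version = "1.0.0"

  def package_info(self):
    # The packaged cross-compiler is exposed through build environment
    # variables, not through a CMake config file that find_package() could find.
    bindir = os.path.join(self.package_folder, "bin")
    self.buildenv_info.append_path("PATH", bindir)
    self.buildenv_info.define("CC", os.path.join(bindir, "toolchain-gcc"))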

This is probably a duplicate of https://github.com/conan-io/cmake-conan/issues/635; please check it, and consider closing this one as a duplicate and tracking the other ticket. We will try to be creative and investigate a bit more whether there is something else that could be done, but it doesn't seem very likely.

For cases like this, which go a bit beyond what CMake-driven flows can achieve, we recommend the more "canonical" flow of conan install + cmake .... We know it is an extra step and requires a little more effort, but in many cases it is well worth it when it also improves flows such as configuring toolchains and other tool_requires.
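
As an example, a minimal sketch of that flow could look like the following. The profile name and output folder are placeholders, and it assumes the recipe's generate() also adds CMakeToolchain so that conan_toolchain.cmake is produced alongside the CMakeDeps files and the conanbuild environment script:

conan install . --output-folder=build --build=missing -pr:h=target-profile -pr:b=default
cmake -S . -B build -DCMAKE_TOOLCHAIN_FILE=build/conan_toolchain.cmake -DCMAKE_BUILD_TYPE=Release
cmake --build build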