Closed: forry closed this 4 months ago.
Hi @forry
Quick comments first:
`--channel prebuild`: please avoid that. Do not use channels to denote package maturity anymore. This should be made either part of the version (as with prereleases or build metadata) or handled with different server repositories that store the packages depending on the stage.

> Surprisingly, it is building a dependency graph even though the recipe states it is an application.
This is expected and by design. To compute the `package_id` of the package it is necessary to compute the dependency graph, because the `package_id` depends on the dependencies' versions and other configuration. It doesn't matter that it is an application; on the contrary, being an application means that it must be rebuilt whenever its static library dependencies change, even if they change only their revision.
> But I would expect that it wouldn't matter since the differences are in parts deleted in `package_id()`
`package_id()` is evaluated last, when all dependencies have been resolved, because, as said, it is a function of many factors; current settings, options, etc. can also affect the versions of the dependencies, so it would be a chicken-and-egg problem. It is necessary to first compute the full dependency graph, and then compute the `package_id` for the binary.
> Ok, then I tried to add `--skip-binaries`, which then exploded on the error in boost. The recipe is the same as in conancenter (downloaded from git today) but I re-exported it under a different user/channel for further usage.
It looks like there is some potential there to improve the UX: capture the error and show a better error message instead of a trace. Or maybe it is that you used the `-vvv` command line argument?
The error is expected as long as you are trying to avoid the binaries, but it is not possible to avoid them in this case: they are needed in the `generate()` step, in the same way they were needed and used when cmake-conan called `conan install`, which in turn called that `generate()` method.
So I am not sure what the main issue is or what I could be missing. If the binary was built before, all the dependencies should already be in the cache, so there is no need to skip binaries or anything like that; just let `conan export-pkg` do its thing and compute the `package_id`?
Maybe I don't understand Conan at its core. But what I want to achieve here are two things. Let's start from the consumer side: I want to download a package that has the exe file in the bin folder and copy it somewhere (or maybe run it; there shouldn't be a difference). Let's say that I have the exe beforehand as a prebuilt binary. In this case, I assume a simple conanfile.py with the `package()` method, and only `os` and `arch` as settings, would be sufficient (as the tutorial "Package prebuilt binaries" says).
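A minimal recipe along those lines could be sketched as follows (the names, version and folder layout are illustrative, following the pattern of the linked tutorial, not taken from the actual project):

```python
import os

from conan import ConanFile
from conan.tools.files import copy


class MyAppConan(ConanFile):
    name = "myapp"           # hypothetical package name
    version = "1.0.9"
    # Consumers of the prebuilt exe only care about these two settings
    settings = "os", "arch"

    def package(self):
        # Copy the prebuilt exe from the local build output into the package.
        # The "bin" subfolder is an assumption about where the exe sits.
        copy(self, "*.exe",
             src=os.path.join(self.build_folder, "bin"),
             dst=os.path.join(self.package_folder, "bin"))
```

Packaged from the folder containing the exe with something like `conan export-pkg . --version=1.0.9`.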
Then comes the second part, the producer side. Here I need to produce said binary. Normally, I could use just a simple conanfile.txt with the build-time dependencies (if that is possible), or I could use a conanfile.py with the CMakeDeps generator and requirements, as I did. And since producing the exe, with cmake driving my build, is my only goal, I might not care about the package_id and the rest, correct?
As a side note, since `build_requires` is deprecated, I tried `tool_requires`, but that does not generate cmake config files for `find_package` :(.
My problem is when I try to put those parts together and have one conanfile for both ends. I thought that the `export-pkg` command is exactly for when you already have the result at your disposal and you just want Conan to put it into the local cache as a binary package under whatever profile you tell it, and that's it, no questions asked. I was mistaken.
I'm trying to understand how it works and what is wrong with my thought process.
So first, why would `export-pkg` want to build something, or is there a command that I'm missing that tells Conan: don't question anything, just take these artifacts into the cache under this profile?

Would calling `export-pkg` with the same profile as the cmake does in the build context help (I suppose so)? But why do I need it if all I care about for now is just copying the thing to the cache, where the consumers only care about os and arch? How do I tell that to Conan?

I suppose that the compiler information for a standalone app might be useful to a consumer if the runtime is dynamic and they put many of those applications together to have only one version of the runtime. But I don't care (the runtime is statically linked into the app); how do I tell that to Conan while keeping the possibility to install the dependencies for the build at the same time?
> It looks like there is some potential there to improve the UX: capture the error and show a better error message instead of a trace. Or maybe it is that you used the `-vvv` command line argument?
No, as said, I only added the `--skip-binaries`, so the whole command was:

```
conan export-pkg . --user lexocad-external --channel prebuild --version 1.0.9 --skip-binaries
```
I thought that this would prevent it from wanting those binaries. I have no problem with Conan building the dependency graph (well, maybe a little). My problem is that it does not realize that, for exporting the package, it does not need those dependencies' binaries (even more so when I thought I told it that with `--skip-binaries`).
> So first, why would export-pkg want to build something, or is there a command that I'm missing that tells Conan: don't question anything, just take these artifacts into the cache under this profile?
The `conan export-pkg` command per se is agnostic. It can perfectly take some precompiled binaries and package them, without a build system at all, with just a very simple conanfile.py containing barely a `package()` method. It is the fact that you are adding `requires` and generators such as `CMakeDeps` to that conanfile.py that makes it a conanfile.py with dependencies. And if it has dependencies, those dependencies affect the `package_id` and they need to be computed. The conanfile.py in the documentation at https://docs.conan.io/2/tutorial/creating_packages/other_types_of_packages/package_prebuilt_binaries.html#packaging-already-pre-built-binaries doesn't have any `requires` or build system at all.
> Is Conan not the right tool for this job, and am I abusing it?
It seems that you are doing a lot of hops and extra steps that Conan can implement in just one single `conan create .` command. No need to use cmake-conan, no need to build locally and then export-pkg, no need to take care of all of that. Conan designed the `conan create` command for this use case. The abuse happens when you use the "local flow", which is intended for developers working on the consumer code as consumers of dependencies, together with `export-pkg`, which is intended to package pre-compiled binaries, and then expect those pre-compiled binaries to somehow be independent of the dependencies. If the dependencies are declared in the conanfile, they are part of the binary, and Conan will model them as part of the packaged binary.
Those two (local flow + export-pkg) can work together without many issues; this is well tested. You can do the local flow + export-pkg, but the `export-pkg` command will indeed compute the dependency graph in order to compute the right `package_id`, because it must produce exactly the same `package_id` that would be computed when doing a `conan create`.
So my recommendation would be to simplify all of that and use `conan create`, or just not be concerned about `export-pkg` needing the dependencies: the dependencies were needed in the first place to build the binary that you just built, so it is not an issue if `export-pkg` needs them.
> Should I separate the recipes into two as mentioned?
I am still not sure what problem you are trying to solve. Both `conan create .` and the local flow + export-pkg work fine with one single conanfile.py. Yes, if you try `--skip-binaries` it will fail, but if you don't, it works without issues, doesn't it?
> Would calling export-pkg with the same profile as the cmake does in the build context help (I suppose so)? But why do I need it if all I care about for now is just copying the thing to the cache, where the consumers only care about os and arch? How do I tell that to Conan?
You can make the application independent of the versions of the dependencies with different mechanisms, mostly the "modes": https://docs.conan.io/2/reference/binary_model/dependencies.html. These can be defined globally in `conf`, or in the recipes' `package_id()` for their dependencies. Still, the dependency graph will be computed first, so the dependencies must be there. As commented above, it shouldn't be an issue; the dependencies were already there in the first place to build the binary.
> I thought that this would prevent it from wanting those binaries. I have no problem with Conan building the dependency graph (well, maybe a little). My problem is that it does not realize that, for exporting the package, it does not need those dependencies' binaries (even more so when I thought I told it that with `--skip-binaries`).
I think you are missing one of the important points of Conan. If a dependency of the application changes, let's say a static library that gets an important bug fix, and the application is not rebuilt, then the application that linked it statically will still contain the bug. Conan's `package_id` computation makes sure that an application knows it must rebuild itself if some of its static library dependencies changed (but it can also be smart enough to decide that the application doesn't need to be rebuilt, for example when the dependencies are shared libraries instead). So the `package_id` model is critical in this respect, and the effect of dependencies on the `package_id` cannot and shouldn't be discarded without care.
A different story is if you want to do full "vendoring", which completely isolates the dependencies. This is possible since 2.4, see https://docs.conan.io/2/devops/vendoring.html, and it provides full isolation of the dependencies: they don't even need to exist once the binary is built, and they will never be part of the `package_id`. This feature can be very useful in a variety of scenarios, but it shouldn't be applied without care, because, for example, it removes the `package_id` features above. Conan will no longer be able to know that an application binary became obsolete because some of its dependencies got fixes, and it will be the users' full responsibility to manually trigger the rebuild of the vendoring package, most likely under a new version of that vendoring package.
I have been checking the vendoring packages feature, and at the moment it doesn't seem to support the `export-pkg` flow, but this is something that can be checked; it might make sense to try to expand it if possible.
So to summarize:

- You can use the `vendoring packages` feature to achieve this, though this might need some checks from our side to see if it can work with `export-pkg` and not only `create`.
- You can do `conan create .` with the conanfile.py and be done, with no need to go through all this process.
- It is not really a problem that `export-pkg` uses dependencies, because those dependencies had already been installed in the first place by the previous `conan install` call.

Update: I have tried the `vendoring packages` feature with `export-pkg`:

- It works: the resulting package has a `package_id` independent of the dependencies, so it can be used later without needing the dependencies at all.
- At `conan export-pkg` time it works without problems, but it still needs the dependencies installed locally (they should have been installed before by `conan install` or `conan build` anyway).

There is another reason that I forgot above: the `package()` method can use tools provided, for example, by `tool_requires`, such as `CMake`, because the `package()` method may contain something like `cmake.install()` and depend on using the CMake executable from a Conan package. The `package()` method can also act as a "repackager" and copy files from its dependencies into its own final package. These are valid use cases that have been reported and requested by other Conan users before.

This is an extra reason for `conan export-pkg` to compute the dependencies and even use them during the `conan export-pkg` command.
> It seems that you are doing a lot of hops and extra steps that Conan can implement in just one single `conan create .` command.
The `conan create`, to my knowledge, copies/exports the recipe to the local cache and then tries to build it in the cache, starting with the source download (and then it leaves the mess there). We are using private GitLab and its CI for building those packages; when the job begins, the private repository is already present in the job's workspace (I don't need to supply keys or other auth for that repo in the job runner). I suppose that Conan couldn't get the sources out of the box, since git would need to authenticate, so making it all part of Conan is troublesome (I can't do it myself). Also, the developers of those packages work with the local flow, using mainly cmake, with Conan silently in the back getting dependencies for them. Configuring the CI scripts the same way the devs work (almost; they're using a GUI) was simplest. Also, there are other troubles combining this approach with `conan create`, as already discussed in conan-cmake issue 647.
All of these company-internal packages are consumed only as prebuilt binaries, but someone is developing them (locally) and also wants to use Conan for getting their dependencies. If those dependencies change, e.g. for a security patch, we would want that developer to make a new version of the package. So the vendoring seems to be just what we need for some of them. I'll give it a try.
Since I was trying Conan before even v1, I periodically return to it after a long time; I'm just trying to fill in the blanks. So humor me for a second: if you need the dependencies to compute the package id, why do you need to have the respective binaries (potentially huge) in the local cache when you know that you won't be building anything? Why don't you just settle for the recipe? What if the `export-pkg` and upload are, for some reason, running on a different machine than the build (and the artifacts are just copied there through some CI mechanism)?
> Yes, if you try `--skip-binaries` it will fail, but if you don't, it works without issues, doesn't it?
Without `--skip-binaries` it won't work unless I give it the exact profile that cmake-conan generates (as said in the OP); now I know why, thanks to you. But I suppose that the vendoring will be very useful for us, since the dependency graph building isn't working the way I would like: I'd want Conan to track only those dependencies that could affect the consumer. Propagating static and header-only deps beyond build time, for a standalone application package with a static runtime, seems very weird; am I missing something?
Thanks!
I tried adding `vendor = True` to the above-mentioned package and it still fails on trying to find a pdfium static lib. Same as the first error in the OP.
> The conan create, to my knowledge, copies/exports the recipe to the local cache and then tries to build it in the cache starting with the source download (and then it leaves the mess there).
Not necessarily. For packaging your own code, an `exports_sources` typically takes care of that: no `source()` method necessary at all, no git clone or anything like that. When a recipe is in the same repo as the source, a simple `exports_sources = "src/*", "..."` is usually enough.
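A sketch of such a recipe living next to its sources (project layout and names are assumptions; the `CMake` helper usage follows the standard Conan 2 tooling):

```python
from conan import ConanFile
from conan.tools.cmake import CMake, cmake_layout


class MyAppConan(ConanFile):
    name = "myapp"   # hypothetical
    version = "1.0.9"
    settings = "os", "compiler", "build_type", "arch"
    # The recipe sits in the same repo as the code, so the sources are
    # exported with it: no source() method, no git clone needed.
    exports_sources = "CMakeLists.txt", "src/*"
    generators = "CMakeDeps", "CMakeToolchain"

    def layout(self):
        cmake_layout(self)

    def build(self):
        cmake = CMake(self)
        cmake.configure()
        cmake.build()

    def package(self):
        # Reuses the project's own install() rules to populate the package
        cmake = CMake(self)
        cmake.install()
```

With this, a single `conan create .` builds and packages in one step.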
> Also, the developers of those packages work with the local flow, using mainly cmake, with Conan silently in the back getting dependencies for them. Configuring the CI scripts the same way the devs work (almost; they're using a GUI) was simplest.
If the approach works for you, fine. Just note that the `create`-based flow with exported sources has the advantage that everything is fully reproducible and buildable from source. For a new compiler version, architecture, etc., just `conan install --build=missing` and Conan can build those binaries from source without having to recreate them one by one.
> So humor me for a second: if you need the dependencies to compute the package id, why do you need to have the respective binaries (potentially huge) in the local cache when you know that you won't be building anything? Why don't you just settle for the recipe? What if the export-pkg and upload are, for some reason, running on a different machine than the build (and the artifacts are just copied there through some CI mechanism)?
For 2 different reasons:

- Usage of tools from `tool_requires` in the `package()` method. This happens, for example, when the `package()` method contains a `cmake.install()` and `cmake/version` is a `tool_requires`.
- Repackaging of artifacts from dependencies in the `package()` method. E.g. a `package()` method that needs to collect the `license.txt` files from its dependencies for compliance, and repackage them.

So the dependencies' binaries must be installed and available in the cache for a fully functional `package()`.
> Without `--skip-binaries` it won't work unless I give it the exact profile that cmake-conan generates (as said in the OP); now I know why, thanks to you. But I suppose that the vendoring will be very useful for us, since the dependency graph building isn't working the way I would like: I'd want Conan to track only those dependencies that could affect the consumer. Propagating static and header-only deps beyond build time, for a standalone application package with a static runtime, seems very weird; am I missing something?
This seems a different issue. Conan does not propagate headers or libraries at all for `application` packages that are used as `tool_requires`. If they are used as regular `requires`, then yes, because those are intended for `library` packages. Also, I am not sure I understand: I was not talking about the "static runtime", which typically refers to the compiler/system runtime. I was talking about the application linking against your own static libraries (a very common approach); both the application and the static libraries could easily be using the "dynamic runtime" of the system.
> I tried adding `vendor = True` to the above-mentioned package and it still fails on trying to find a pdfium static lib. Same as the first error in the OP.
Then, for the `vendor = True` package, the construction of the package (that is, the `install`, `build` and `export-pkg` commands) still needs to be aligned in the configuration of the dependencies. It is the usage of the vendoring package that becomes independent of the dependencies. The install/build + export-pkg flow should still be consistent in its definition: you cannot build with one configuration and then define a different one at the time of export-pkg.
> Not necessarily. For packaging your own code, an `exports_sources` typically takes care of that.
In our case, the sources are either in a public repo, or the consumer shouldn't build the dependency himself and should only rely on the prebuilt binaries.
> This seems a different issue. Conan does not propagate headers or libraries at all for `application` packages that are used as `tool_requires`. If they are used as regular `requires`, then yes, because those are intended for `library` packages. Also, I am not sure I understand: I was not talking about the "static runtime", which typically refers to the compiler/system runtime. I was talking about the application linking against your own static libraries (a very common approach); both the application and the static libraries could easily be using the "dynamic runtime" of the system.
In my case, I have an app with the static runtime, linked with one static and one header-only lib. That's why I'm so confused about why I would need those for exporting a single .exe. I didn't think about the license.txt use case (in my mind the deployer or cmake would take care of that; I'm really just using export-pkg to copy already prepared stuff), good point.
> Then, for the `vendor = True` package, the construction of the package, that is, the `install`, `build` and `export-pkg` commands, still needs to be aligned in the configuration of the dependencies ...
Is `vendor = True` the same as manually setting all the requirements as `visible = False`, or are there other intricacies?
> In my case, I have an app with the static runtime, linked with one static and one header-only lib. That's why I'm so confused about why I would need those for exporting a single .exe. I didn't think about the license.txt use case (in my mind the deployer or cmake would take care of that; I'm really just using export-pkg to copy already prepared stuff), good point.
Yes, there might be cases where they are not needed, but Conan cannot know that; it fully depends on the user's recipe logic. The deployers might not be able to gather all the licenses of transitive dependencies if, for example, there is a "vendoring" package in the middle. Or there can be `shared-library` packages that contain static libraries as implementation details: when installing them, those static binaries will actually be skipped, so Conan knows it only needs to deploy the shared library, which might need a copy of the licenses repackaged to be compliant. Note that the installation and deploy cases use a fully computed dependency graph with built binaries, while `export-pkg` is creating a package that doesn't exist yet; it is a different use case.
> Is `vendor = True` the same as manually setting all the requirements as `visible = False`, or are there other intricacies?
It is very different:

- With `vendor=True` the dependencies are not expanded at all. The recipes might not even exist (no access to the server, not in the cache).
- `visible` only affects the propagation of traits, conflicts, etc., but the dependency is always part of the graph.
- With `vendor=True` the dependencies' versions do not affect the `package_id` of the "vendoring" package at all.
- With `visible=False` the dependencies' versions still affect the user of those dependencies.
- Building a `vendor=True` package as a dependency is blocked by default, and needs an extra conf to enable building it from source.

Thank you very much. So, to sum it up for future reference: if I don't want the consumer to care about my dependencies, I use `vendor = True`. If I don't want Conan to care about dependencies when I'm exporting, I need to use a separate conanfile without requirements for packaging.
What is your question?
Hi, I'm now trying to package an application that consists of a single exe file. On the server it is built by cmake/msvc and then packaged and uploaded. Since it uses a static CRT because of a pdfium library, the build settings supplied by cmake differ from the ones in the default profile used when packaging. But since it is a standalone application and those settings are deleted in `package_id()`, it should not matter, or at least I thought so.

The app builds fine with this, using cmake-conan:
This is what conan-cmake outputs as a profile. The conan-cmake is slightly changed (should not matter, builds fine):
Then I try:
Surprisingly, it is building a dependency graph even though the recipe states it is an application. It is built with one static and one header-only library. However, it fails to find the static one, since the default profile differs(?). But I would expect that it wouldn't matter, since the differences are in parts deleted in `package_id()`. I only need the compiler and build_type for the build step.

Ok, then I tried to add `--skip-binaries`, which then exploded on the error in boost. The recipe is the same as in conancenter (downloaded from git today) but I re-exported it under a different user/channel for further usage.

My wish for the resulting package would be a single exe file with only os and arch settings in the conaninfo, since it is then packaged as part of a different application that builds with different settings (more like the default profile). It was working in conan 1, though the requirements were set as private there. Why doesn't this work out of the box? I would like to have `settings = "os", "arch"`, but then `conan install .` wouldn't work for the dependencies. Should I always set `visible=False` in the `requires()` for applications? What is the `package_type` for, then? Why doesn't it simply skip those dependencies?

The recipe for pdfium states `package_type = "static-library"`. Conan recipe for the app: conanfile.py. Conan version: 2.4.1.
Sorry for the long read.