Closed guban closed 2 weeks ago
Hi @guban
Thanks for your question.
> download those exact binary packages listed in `pkglist.json` into the local cache on the consumer machine, and copy their binaries into a custom working directory?
It is possible to do a `conan download -l pkglist.json` to download a specific package list from the server to the current machine. But the issue is that `conan download` is not intended for deployment.
Deployments need a dependency graph. The recommended approach for what you want is:

```
conan install --lockfile=out.lock -pr:a=profileA --requires=my/1.0@companyA
```

Then, with `conan install` you can use deployers:

```
conan install --lockfile=out.lock -pr:a=profileA --requires=my/1.0@companyA --deployer=full_deploy
```

This is because installation is needed for deployment, and also to call the generators that produce the files necessary to consume the installed packages. The deployers are described in this blog post: https://blog.conan.io/2023/05/23/Conan-agnostic-deploy-dependencies.html
It is true that `conan download` cannot implement deployers in its current form, because deployers need the dependency graph to operate. If for some reason `conan install` is not enough, `conan cache path pkg/version:package_id` gives the cache path of a package, so it would be relatively easy to iterate the package list after a `conan download` and do a copy. A Conan custom command could be a practical way to automate this task; I think it should be relatively straightforward to implement. Have you checked the custom commands? https://docs.conan.io/2/reference/extensions/custom_commands.html
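A minimal sketch of that "download then copy" idea, kept as a plain script rather than a full custom command. The nesting of `pkglist.json` (origin → reference → revisions → packages) is an assumption based on Conan 2 output and should be checked against a real file; `package_references` and `copy_binaries` are hypothetical helper names.

```python
# Hedged sketch: flatten a pkglist.json, resolve each package's cache folder
# with `conan cache path`, and copy the binaries into a working directory.
import json
import shutil
import subprocess
from pathlib import Path


def package_references(pkglist):
    """Flatten a pkglist dict into 'name/version#rrev:package_id' strings.

    Assumed schema: {origin: {ref: {"revisions": {rrev: {"packages": {pkg_id: ...}}}}}}
    """
    refs = []
    for origin in pkglist.values():                  # e.g. "Local Cache"
        for ref, data in origin.items():             # e.g. "zlib/1.2.13"
            for rrev, rdata in data.get("revisions", {}).items():
                for pkg_id in rdata.get("packages", {}):
                    refs.append("{}#{}:{}".format(ref, rrev, pkg_id))
    return refs


def copy_binaries(pkglist_path, out_dir):
    """Copy the cache package folder of every listed binary into out_dir."""
    pkglist = json.loads(Path(pkglist_path).read_text())
    for ref in package_references(pkglist):
        # `conan cache path <ref>` prints the package folder in the local cache
        folder = subprocess.check_output(
            ["conan", "cache", "path", ref], text=True).strip()
        target = Path(out_dir) / ref.replace("/", "_").replace(":", "_")
        shutil.copytree(folder, target, dirs_exist_ok=True)
```

The same logic could live inside a `@conan_command` custom command, which would give direct access to the cache API instead of shelling out.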
Please let me know if this helps
Hi @memsharded,
It looks like `conan install` with a lockfile and a deployer covers my scenario. I don't really need a package list, `conan cache path`, or a custom command (though it's nice to know these are available, just in case).
I have a further question about the lockfile generated on the build machine. I'd like to have this lockfile provided to consumers via Conan. Consumers should be able to retrieve this lockfile from the Conan server when they know only the recipe and profile (`my/1.0@companyA` and `profileA`).
I'm considering storing lockfiles in the recipe's metadata as files `profileA.lock`, `profileB.lock`, and so on. Is this usage of metadata within the limits of its intended use? Is there a better alternative?
Thanks!
> Consumers should be able to retrieve this lockfile from Conan server when they know only the recipe and profile (`my/1.0@companyA` and `profileA`). I'm considering to store lockfiles in the recipe's metadata as files `profileA.lock`, `profileB.lock` and so on. Is this usage of metadata within the limits of its intended use?
This is a very good question. Conan at the moment is not opinionated on that.
The most common storage of lockfiles for "final consumer" applications would be the Git repo, as source. In that way, with `git clone` + `conan install` all is good, and the changes and updates to the lockfile can be tracked in source control.
For intermediate lockfiles in a graph, it is a more complex situation. See https://github.com/conan-io/conan/pull/10261, where we have a proposal in which lockfiles might be automatically stored and used in recipe metadata.
So your intuition about the metadata is good; it might be a good place to store lockfiles. At least for now, I am not sure whether it would be easy for users if Conan doesn't automate it, because it requires an extra `conan download` command; we still need to think a bit more about the UX.
Note that lockfiles in Conan 2 can store multiple configurations, as long as those configurations are intended to be consistent across each other. So it is probably not necessary to have a `profileA.lock` and a `profileB.lock`, but just a single lockfile, and that will also simplify the process.
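To illustrate why a single lockfile can hold several configurations: a Conan 2 lockfile is a small JSON document whose `requires`, `build_requires` and `python_requires` lists simply accumulate locked references. The sketch below merges per-profile lockfiles into one, roughly what `conan lock merge` does; `merge_lockfiles` is a hypothetical helper, the field names are taken from Conan 2 lockfiles but should be verified against your own files, and the exact ordering Conan applies may differ.

```python
# Hedged sketch: merge several lockfile dicts into one multi-configuration
# lockfile by de-duplicating and combining their reference lists.
import json


def merge_lockfiles(*lockfiles):
    merged = {"version": "0.5", "requires": [],
              "build_requires": [], "python_requires": []}
    for section in ("requires", "build_requires", "python_requires"):
        seen = []
        for lf in lockfiles:
            for ref in lf.get(section, []):
                if ref not in seen:
                    seen.append(ref)
        # Conan keeps these lists ordered (newer revisions first); plain
        # reverse sorting is only an approximation of that ordering.
        merged[section] = sorted(seen, reverse=True)
    return merged
```

With this, a `profileA.lock` and a `profileB.lock` produced on different build machines collapse into one lockfile that locks both configurations.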
Even more, it is possible to use lockfiles from a downstream consumer in the graph dependencies. Let's say that you have a consumer project "app" in a Git repo, and that repo contains the lockfile for the "app". That lockfile can be used to build and lock any other package in the dependency graph of "app".
I reviewed #10261 and the documentation on lockfiles to better understand the matter.
My use of lockfiles is quite limited. I don't need to use them as input when building packages. I don't need to track how a lockfile evolves in time, and I don't need it to be version-controlled. What I need (at least for now) is only to guarantee that when a consumer runs

```
conan install --requires=my/1.0@companyA -pr:a=profileA -l profileA.lock -d my_deployer
```

then all dependencies are resolved to those binary packages (up to package id) that were built on the build machine. My specific need is to give the consumer instructions on how to download all needed pre-built packages without building anything locally. For my needs, it suffices to generate lockfiles from scratch during the build, and then make the generated lockfile available to the consumer.
After the lockfile (`profileA.lock`) and the Conan profile file (`profileA`) are present in the recipe metadata, the consumer can deploy pre-built packages as follows:

```bat
:: Consumer must provide the recipe and the *name* of a Conan profile file.
@set recipe=my/1.0@companyA
@set profile=profileA

:: Download recipe metadata from the Conan remote.
conan download %recipe% -r origin -m="*"

:: Get the recipe metadata folder in the local Conan cache.
conan cache path %recipe% --folder=metadata > m_dir
@set /p m_dir=<m_dir

:: Get all the binaries.
conan install --requires=%recipe% -pr:a=%m_dir%\%profile% -l %m_dir%\%profile%.lock -d full_deploy
```
I think this particular behavior is too specific to justify wrapping it into a standard Conan command. Wrapping it in custom scripts will do.
When all lockfiles are combined into a single one, how can updates to that file be synchronized? In principle, several build machines may build different profiles in parallel. In my scenario, I avoid the need for such synchronization because each Conan profile is built by exactly one build machine. To keep it simple, I store `profileA` and `profileA.lock` separately from `profileB` and `profileB.lock`.
What if the information stored in `profileA.lock` were stored in `profileA` instead? Then such a "locked profile" would be the only file needed to consume the pre-built binaries: the consumer could call `conan install -pr:a=profileA` without the `--requires=my/1.0@companyA -l profileA.lock` options. On the build machine, the `conan install` command could transform the initial, "unlocked" profile into a "locked" profile with an additional `[requires]` section that locks all dependencies to those used during the build.
Anyhow, I've got a working solution to my problem. Marking the question as resolved. Thank you very much for your help!
> When all lockfiles are combined into a single one, how can updates to that file be synchronized?
I think this PR that we are preparing for the docs can be interesting: https://github.com/conan-io/docs/pull/3799
It is a full new CI tutorial that describes some good practices, flows and tools to do CI efficiently, and how to build full products when packages change. I'd suggest having a look; even though it is still in source form, it might be useful.
> What if the information stored in `profileA.lock` were stored in `profileA` instead?
We had something more or less similar in Conan 1.X, the other way round: it was the lockfile that contained a copy of the profiles. But this proved not to be very practical, as there is also variable information that depends on the specific machine, and having to manage multiple lockfiles was really disliked by many users. Having one single lockfile for all configurations seems to have much better acceptance.
What is your question?
Hello,
Could you please help me to figure out the second part of my scenario:

1. Build a package (`my/1.0@companyA`) and its dependencies, and upload all needed binary packages (including dependencies) to a Conan server.
2. Download binaries of that package, as well as its dependencies, first into the local cache and then deploy them into a custom working directory. That working directory is intended for wrapping those binaries in an installer MSI package.

I implemented part (1) as follows: we build the package in the local cache, create a package list, and upload all packages in that list to the server.

The question is: how to do part (2)? That is, how can I download those exact binary packages listed in `pkglist.json` into the local cache on the consumer machine, and copy their binaries into a custom working directory?

On the consumer machine, I can use any of the files `graph.json`, `pkglist.json`, and maybe even `out.lock`. I studied lockfiles, package lists, and deployers, but could not figure out the proper/recommended way of doing part (2).

Could you please help?
Thank you!