We have created an issue in Pivotal Tracker to manage this:
https://www.pivotaltracker.com/story/show/157682074
The labels on this github issue will be updated when the story is started.
We currently use the pivnet CLI for this. It would be great to have it supported in the resource.
We added this as a story to our internal Tracker.
Unfortunately Concourse doesn't currently let you pass files to a `get` step. Only `put` can read a set of inputs; `get` is only passed an output directory: https://concourse-ci.org/implementing-resources.html. To allow `get` to pull a dynamic value from a file we'll have to wait until the dynamic build plan generation feature lands: https://github.com/concourse/concourse/issues/684. The only other alternative that I know of is doing a weird hack like `put.params.action: get-stemcell`.
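If it helps to picture it, that hack would look roughly like the sketch below. The `action` and `version_file` params are hypothetical and do not exist in the pivnet-resource today; this only shows the "abuse `put` because only `put` receives inputs" shape.

```yaml
# Hypothetical sketch only: the pivnet-resource does not implement an
# "action" or "version_file" param today.
jobs:
- name: fetch-product
  plan:
  - get: config-repo            # git repo holding the desired version
  - put: pivnet-product         # abuse put, since only put receives inputs
    params:
      action: get-stemcell                          # hypothetical "behave like a get" switch
      version_file: config-repo/product-name.yml    # hypothetical file-driven version lookup
```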
Something like the file-downloader-resource appears promising. You feed it a `git`-based `yml` config repo and it pulls the corresponding `pivnet` product based on the content of the `product-tile.yml` file.

It has the potential to solve my use case; the only problem is that it doesn't currently support wildcard `*` patterns for matching product versions, e.g. it can't match `1.12.*`. Something to keep an eye on.
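Purely to illustrate the limitation (I haven't verified that resource's actual schema, so the field names below are guesses), the config it reads might look something like this:

```yaml
# product-tile.yml -- hypothetical shape; the real file-downloader-resource
# schema may differ, this only illustrates the wildcard limitation
product-slug: product-a
product-version: 1.12.4       # an exact version works
# product-version: 1.12.*     # wildcard patterns like this are not matched
```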
Makes sense. I'm concerned the `file-downloader-resource` doesn't actually work in all cases. Specifically, if the files on Pivnet change behind the scenes, even though your git repo has the same config you could get different files downloaded. Or, due to caching, you would get the same files as before even though they have changed behind the scenes. Either is potentially problematic.
As @ljfranklin says, this type of behavior won't be supported by Concourse until the "Dynamic Build Plans" feature lands.

These are feature requests that bleed the versioning of a resource into the configuration of the resource. Versioning is a core concept built into Concourse, so that it can handle things like caching, version locking, and correct triggering. Having a resource read a dynamic configuration and then version off that can break things and cause inconsistent experiences. Also, not every resource can do this.

@aegershman, I'd ask why you want an individual resource to do this, rather than using the primitives of a pipeline (i.e. `get` with a specific `version`)?
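A rough sketch of what I mean, assuming the pivnet-resource's version key is `product_version` (resource and version values below are illustrative):

```yaml
resources:
- name: product-a
  type: pivnet
  source:
    api_token: ((pivnet_api_token))
    product_slug: product-a

jobs:
- name: stage-and-apply-changes
  plan:
  - get: product-a
    # pin this get to the exact version that was promoted; the get step's
    # version: field accepts a specific version map
    version: {product_version: "1.4.7"}
```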
The process that you describe as "writing to file" can be solved by a checked-in and versioned pipeline YAML. When the tile from `sandbox` passes, it could write to a vars file that is consumed by the `dev`/`prod` pipeline.

If you are doing `credhub` integration, you might even be able to avoid the file entirely, if that is preferred.
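A minimal sketch of that approach, with illustrative file and pipeline names, and source fields as I understand the resource's configuration:

```yaml
# dev-vars.yml -- checked into the dev foundation config repo
product_version: '1\.4\.7'
---
# pipeline.yml -- consumes the var; re-applied with something like:
#   fly -t dev set-pipeline -p upgrade-tile -c pipeline.yml -l dev-vars.yml
resources:
- name: product-a
  type: pivnet
  source:
    api_token: ((pivnet_api_token))
    product_slug: product-a
    product_version: ((product_version))
```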
That's exactly what we're doing right now, actually.
- `sandbox` is configured to take the latest patch version available, e.g. `x.y.*`
- when a version passes `sandbox`, it creates a PR with the concrete version to git in the `dev` foundation folder
- automation then picks that concrete version up from `git` and `set`s it in the pipeline for `dev`
The whole purpose is to have everything be as automated as possible, with the only human interaction involved being a merge of an auto-generated PR to master.
But in order for the `version` to be set to whatever is in the contents of a `params.yml` file in git, there needs to be a separate pipeline which listens for changes on the git repo and `set`s the pipeline with the updated value. Having the automation re-`fly` the pipeline requires more moving parts and more credentials (for automation to log in to Concourse and `set` the pipelines).
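Roughly, that extra automation pipeline looks like this (the image, repo URL, and names are placeholders), which is where the additional Concourse credentials come in:

```yaml
# Sketch of the extra "re-fly" pipeline; image, URL, and names are placeholders.
resources:
- name: dev-config
  type: git
  source:
    uri: git@github.com:example-org/dev-foundation-config.git
    branch: master

jobs:
- name: set-dev-pipeline
  plan:
  - get: dev-config
    trigger: true
  - task: fly-set-pipeline
    config:
      platform: linux
      image_resource:
        type: docker-image
        source: {repository: some-image-with-fly}
      inputs:
      - name: dev-config
      params:
        CONCOURSE_USERNAME: ((concourse_username))   # the extra credentials mentioned above
        CONCOURSE_PASSWORD: ((concourse_password))
      run:
        path: sh
        args:
        - -ec
        - |
          fly -t dev login -c https://concourse.example.com \
            -u "$CONCOURSE_USERNAME" -p "$CONCOURSE_PASSWORD"
          fly -t dev set-pipeline -n -p upgrade-tile \
            -c dev-config/pipeline.yml -l dev-config/params.yml
```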
Ultimately this might be undue complexity for the pivnet-resource; but just sharing my use case 👍
**tl;dr**

What I want to do is have the `product_version` configured in a `product-name.yml` file, which can be passed in to the `pivnet` resource on `get` steps as a `param`. This way the `product_version` can be dynamically pulled from a config repository rather than requiring a `fly` to be run on the pipeline.
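In pipeline terms, the ask is roughly the shape below. The `product_version_file` param is hypothetical, and as discussed above, `get` steps can't read files from other resources today:

```yaml
jobs:
- name: stage-and-apply-changes
  plan:
  - get: config-repo      # contains product-name.yml
  - get: product-a
    params:
      # hypothetical param: read the version pattern out of a file provided
      # by another resource -- not possible with today's get steps
      product_version_file: config-repo/product-name.yml
```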
**problem**

To download a specific version of a product, you have to provide a `product_version` regex pattern. This param is configured at "instantiation time", e.g. when you `fly`-up the pipeline.
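Today that looks something like the following (source fields as I understand the resource's configuration), and changing the pattern means editing and re-flying the pipeline:

```yaml
resources:
- name: product-a
  type: pivnet
  source:
    api_token: ((pivnet_api_token))
    product_slug: product-a
    # fixed at "instantiation time"; changing it means another fly set-pipeline
    product_version: '1\.4\..*'
```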
**use case**

I am working on a project to do tile upgrades across my foundations. I have 3 foundations: `sandbox`, `dev`, and `prod`. I want my tile versions to be promoted through each foundation--`dev` accepts the version that passed `sandbox`, `prod` accepts what passed `dev`.

Now, you could have a pipeline which ties all my foundations together and uses `passed: [sandbox]` blocks to ensure the proper version gets passed (sketched below). But you would have to put all my foundations in a single Concourse team & a single `pipeline.yml`. It's messy. Plus, what if I have a tile in sandbox that I don't want in dev and prod? Or what if I want to experiment with `product-A-1.5.x` in sandbox, but `dev`/`prod` are using `product-A-1.4.x` && I don't want to promote the `sandbox` version to dev? What if I want the option to continue promoting `product-A-1.4.x`?
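The single-pipeline alternative I'm referring to would look roughly like this, with illustrative job names:

```yaml
jobs:
- name: upgrade-tile-sandbox
  plan:
  - get: product-a
    trigger: true
  # ... tasks to stage and apply to sandbox ...
- name: upgrade-tile-dev
  plan:
  - get: product-a
    passed: [upgrade-tile-sandbox]   # only versions that made it through sandbox
  # ... tasks to stage and apply to dev ...
- name: upgrade-tile-prod
  plan:
  - get: product-a
    passed: [upgrade-tile-dev]
  # ... tasks to stage and apply to prod ...
```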
Okay, so here's my vision--
I want my `upgrade-tile` pipeline in `sandbox` to take the latest version of `product-A-1.4.x`, then when it successfully applies, I want it to write the version that successfully passed to a file, then use that file to create a GitHub pull request to the `dev` foundation config repo. A platform operator could then merge in those changes; then when `stage-and-apply-changes` runs in `dev`, it will use the `product_version` that was specified in the `dev` config repo.

Thank you for your time and consideration