Closed — BarrinMW closed this issue 3 years ago
Hi Daniel,
There is just 1 pipeline definition per repository. A common scenario is that each application has a separate repository using the same template for the pipeline configuration. Some users have 1 repository for all Control-M workflows.
This code snippet might help (I'll publish the full example later this week):
for f in $path/*.json
do
ctm deploy "$f"
done
It basically loops over a folder ($path) and executes the deploy command for each file in that directory ($f).
This could be extended to loop over subfolders as well.
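One way to extend the loop to subfolders is to replace the single-level glob with `find`. A minimal sketch, assuming the same `$path` root and `ctm deploy` call as above; the `echo` stands in for the real `ctm deploy "$f"` so the script runs without a Control-M environment, and the demo folder names are hypothetical:

```shell
# Recursively deploy every .json definition under a root folder.
deploy_all() {
    find "$1" -name '*.json' | sort | while IFS= read -r f
    do
        echo "deploying $f"      # in a real pipeline: ctm deploy "$f"
    done
}

# demo: build a small nested tree and walk it
tmp=$(mktemp -d)
mkdir -p "$tmp/workflows/finance"
echo '{}' > "$tmp/workflows/jobs.json"
echo '{}' > "$tmp/workflows/finance/payroll.json"
deploy_all "$tmp/workflows"
rm -rf "$tmp"
```

Unlike the glob, `find` picks up definitions at any depth, so new folders need no pipeline changes.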
Let me know if this helps.
Regards, Tijs
Thanks, that makes sense. I have implemented something similar, which is working now:
for f in $path/*.json
do
echo "building the checked in files by validating against the build environment"
curl -k --silent --fail --show-error --no-include -H "Authorization: Bearer $token" -X POST -F "definitionsFile=@$f" -F "deployDescriptorFile=@deployDescriptors/$BUILD_DESCRIPTOR" "$ENDPOINT/build"
done
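One note on that loop: `--fail` makes curl exit non-zero on an HTTP error, but the loop keeps going to the next file, so a broken definition won't fail the CI job. A hedged sketch of a fail-fast variant, assuming the same `$token`, `$BUILD_DESCRIPTOR`, and `$ENDPOINT` variables and the `deployDescriptors/` layout from the snippet above:

```shell
# Validate every definition in a folder against the build service,
# aborting on the first file that fails to build.
validate_all() {
    for f in "$1"/*.json
    do
        echo "validating $f against the build environment"
        curl -k --silent --fail --show-error --no-include \
             -H "Authorization: Bearer $token" -X POST \
             -F "definitionsFile=@$f" \
             -F "deployDescriptorFile=@deployDescriptors/$BUILD_DESCRIPTOR" \
             "$ENDPOINT/build" || return 1
    done
}
```

Calling `validate_all "$path" || exit 1` in the pipeline step then surfaces the first bad definition instead of silently looping past it.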
Another option is:
I'm setting up something similar to your Jenkins CI/CD integration (automation-api-community-solutions/2-cicd-tooling-integration/integrating-Control-M-into-DevOps-lifecycle/), but I'm struggling to understand how this approach will work for a project with multiple folders of jobs. Would it be normal to create a new Jenkins pipeline for each folder? There could be hundreds of folders in each project. Or would the script need to be adapted to concatenate all the jobs into one master file just for the build validation?