indigo-dc / jenkins-pipeline-library

Jenkins pipeline library with common functionalities for CI/CD environments, mainly targeted for the implementation of the SQA baseline requirements from https://indigo-dc.github.io/sqa-baseline/
Apache License 2.0

Multiple pipelines in single service based repository #141

Closed: BorjaEst closed this issue 1 week ago

BorjaEst commented 3 years ago

Short description

I wonder if it is possible to configure multiple pipelines for a single repository.

Example use case

I have an application based on multiple container services. For example, a typical web application:

~/Project_web$ tree
.
├── backend # Python based
│       ├── tests
│       ├── Dockerfile
│       ├── tox.ini
│       └── ...
├── frontend # Js (i.e. React) based
│       ├── tests
│       ├── Dockerfile
│       ├── <other automation file>
│       └── ...
├── more_tests # General Integration + Functional tests
├── <other automation file>
└── docker-compose.yaml
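
For reference, the `docker-compose.yaml` in this layout could look roughly like the following. This is only a simplified sketch; service names and ports are illustrative, not the real project files.

```yaml
# Simplified, illustrative sketch of the docker-compose.yaml above;
# service names and ports are only examples.
version: "3"
services:
  backend:              # Python service, built from ./backend/Dockerfile
    build: ./backend
    ports:
      - "5000:5000"     # hypothetical port
  frontend:             # JS/React service, built from ./frontend/Dockerfile
    build: ./frontend
    ports:
      - "3000:3000"     # hypothetical port
    depends_on:
      - backend
```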

In the example above, note the following possible requirements:

- Each subfolder service (backend, frontend) has its own tests and automation files, so it should have its own independent pipeline and results.
- The tests at the root folder (general integration + functional) need all the services up and running.
- The whole application is deployed together, here via docker-compose.yaml (but this could be another framework).

I am not sure what would be the best/recommended way to implement the pipelines.

samuelbernardolip commented 3 years ago

@BorjaEst Does the use case need to trigger different pipelines, or is the requirement to run Docker containers in parallel based on the same image? The current version of JePL with the docker-compose composer allows you to define the container name in config.yml, based on what was set in the docker-compose.yml deployment. This is mandatory so that the required tests are applied to the expected environment, which is deployed as a previous stage before the criteria evaluation starts. If there are dependencies between the criterion results, the same container can be used.

There is also an edge use case when using the same container with multiple criterion tests: if a container snapshot needs to be created for each criterion test, this is not supported in the current release, since the images are only pushed to the Docker registry at the end. JePL only supports CI mode for now, and the containers are removed at the end, so no state is kept for the next run. The composers' CD mode will keep the deployment, which can bypass the previous requirement.
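
To make the relation between the two files concrete, a minimal sketch could look as follows. This is illustrative only: the service, repo and criterion names are made up, and the exact config.yml keys may differ between JePL versions.

```yaml
# .sqa/docker-compose.yml (deployment): defines the service/container names
services:
  backend:
    build: ./backend

# .sqa/config.yml (criteria): shown in the same block only for brevity,
# it is a separate file that reuses the container name defined above.
# Surrounding keys are approximate.
sqa_criteria:
  qc_style:
    repos:
      my-app:
        container: backend        # must match the docker-compose service name
        commands:
          - tox -e stylecheck
```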

This use case is also important for checking the usage of Kubernetes (the next composer JePL will support) in the mentioned CI and CD modes.

BorjaEst commented 3 years ago

Code integration should run on different pipelines, so the tests of each subfolder service (backend, frontend, etc.) should have their own independent CI pipeline and results. For example, a "failure" in the backend should not interrupt the tests for the other services.

However, the common tests at the root folder would need the services up and running in order to be performed. Also, I do not think it makes sense to run them if any of the tests of the underlying services failed. I probably should have expressed this better and not used "more_tests" but rather something like "deployment_checks/tests", so it is clearer that those tests ensure the deployment is going to be OK. In the end, this is more about "service" integration rather than "code" integration.
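
As a rough illustration of what I mean, the deployment checks could be wired as one more compose service that only starts after the real services. This is only a sketch with hypothetical names and commands, not a proposal for how JePL should handle it.

```yaml
# Sketch only: run the root-level deployment checks as an extra compose
# service that starts after the real services (names are hypothetical).
version: "3"
services:
  backend:
    build: ./backend
  frontend:
    build: ./frontend
    depends_on:
      - backend
  deployment-checks:
    build: ./deployment_checks   # image containing the integration/functional tests
    depends_on:                  # note: waits for container start, not for readiness
      - backend
      - frontend
    command: ["pytest", "tests"] # or any other language-agnostic test runner
```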

Regarding "docker-compose.yaml". I put that file as an example for our web application however, now that you mention, it should not be considered into the piplene. Your approach should be compatible with other frameworks as "kubernetes" or "mesos". However your question is really fair, who/what is responsible of bringing up the services?

Considering now that the root tests are part of CD, I do not have much experience on what the best approach to define them would be. From a first look, I see there are popular frameworks, like robotframework, which are language agnostic and whose setup might be in charge of bringing the services up. However, it would be nice to also have feedback from the community on this point.

For now I need more time to reach that part of our project in order to provide you with better feedback on the root tests.