andreas-31 / cicd-automated-testing

CI/CD pipeline for automated testing in Azure Pipelines.

Ensuring Quality Releases

CI/CD pipeline in Azure DevOps for automated building, deployment, testing, monitoring, and logging.

Build Status

Project Overview

Azure DevOps CI/CD Pipeline drives various services and tools for automated building, deployment, testing, monitoring, and logging.

Azure DevOps

The CI/CD pipeline described below is defined in the YAML file azure-piplines.yaml contained in this GitHub repository. The declarations in the YAML file are evaluated by Azure Pipelines, which is part of Azure DevOps. When the pipeline runs, the system runs one or more jobs on Azure Pipelines agents. An agent is computing infrastructure with installed agent software that runs one job at a time. All pipeline steps except one are executed on Microsoft-hosted agents; the functional UI tests run on a self-hosted agent, an Ubuntu 18.04 virtual machine that is created by Terraform and was added to an Azure Pipelines environment called "TEST". In summary, Azure Pipelines was configured with Microsoft-hosted agents, a self-hosted agent, and the "TEST" environment.
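
A deployment job that targets the self-hosted VM registered in the "TEST" environment can be declared roughly as follows. This is a sketch with assumed stage and job names, not an excerpt from the repository's azure-piplines.yaml:

```yaml
# Sketch: a deployment job bound to the "TEST" environment. Azure Pipelines
# then runs the deploy steps on the VM resource registered in that environment.
- stage: Deployment
  jobs:
  - deployment: FunctionalUITests     # assumed job name
    environment:
      name: TEST
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "Runs on the self-hosted Ubuntu 18.04 agent"
```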

Description of the CI/CD Pipeline

Structure of the CI/CD Pipeline

Azure Pipelines showing the two stages of the CI/CD pipeline: Build and Deployment.
Azure Pipelines showing jobs for each stage.

Build Stage

The Build job in the Build stage performs several steps:

Terraform has been configured to store state remotely in Azure storage account by following this tutorial: Store Terraform state in Azure Storage. The parameters for configuring the remote Terraform backend are defined as variables in azure-piplines.yaml (resource_group_name, storage_account_name, container_name, key) or queried at pipeline runtime with Azure CLI (access_key for Azure storage account).
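
The corresponding backend block in the Terraform configuration might look roughly like this. It is a sketch; the placeholder values below would in practice be supplied at `terraform init` time via `-backend-config` arguments built from the pipeline variables:

```hcl
# Sketch of the azurerm remote backend block. All values are placeholders;
# the pipeline injects the real ones from its variables at init time.
terraform {
  backend "azurerm" {
    resource_group_name  = "tstate-rg"          # placeholder
    storage_account_name = "tstatestorageacct"  # placeholder
    container_name       = "tstate"             # placeholder
    key                  = "terraform.tfstate"  # placeholder
  }
}
```

The storage account access key is typically passed separately, for example through the ARM_ACCESS_KEY environment variable, so it never appears in the configuration files.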

Azure Storage Explorer showing Terraform state files (.tfstate) stored in the Azure storage account.
All Azure resources that are created by Terraform are contained in the Azure resource group cicd-automated-testing-rg.

Azure resources created by Terraform.

Build Fake REST API Artifact (ZIP File)

An ASP.NET application that implements a Fake REST API is stored in the subdirectory automatedtesting/jmeter/fakerestapi within this GitHub repository. That subdirectory is zipped, and the ZIP file is stored as an artifact in Azure DevOps. The application package is later deployed to Azure App Service in the Deployment stage.
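
The archiving step can be pictured with a small Python sketch. The pipeline itself presumably uses a built-in archive task; `zip_directory` is an illustrative helper, not code from the repository:

```python
import os
import zipfile

def zip_directory(src_dir: str, zip_path: str) -> None:
    """Zip the contents of src_dir into zip_path, preserving relative paths,
    so the archive can be deployed as an App Service package."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                full = os.path.join(root, name)
                # Store paths relative to src_dir, mirroring how the
                # fakerestapi subdirectory is packaged.
                zf.write(full, arcname=os.path.relpath(full, src_dir))
```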

API Integration Tests with Newman (Postman)

Newman is installed via npm (Node.js package manager). The API Regression Test Suite and the API Data Validation Test Suite are located in the subdirectory automatedtesting/postman and are run with Newman. The results of both test suites are published into Azure DevOps Test Hub where the results are summarized and visualized.

The API Regression Test Suite checks all API endpoints for successful response status codes and messages:

| Regression Testcase | API Endpoint |
| --- | --- |
| R1 Get All Employees | http://dummy.restapiexample.com/api/v1/employees |
| R2 Get Single Employee | http://dummy.restapiexample.com/api/v1/employee/{{id}} |
| R3 Create Employee | http://dummy.restapiexample.com/api/v1/create |
| R4 Update Employee | http://dummy.restapiexample.com/api/v1/update/{{id}} |
| R5 Delete Employee | http://dummy.restapiexample.com/api/v1/delete/{{id}} |

The API Regression Test Suite is run in the pipeline with this command:

    newman run "automatedtesting/postman/API Regression Test Suite.postman_collection.json" \
      --env-var newSalary="456" \
      --environment automatedtesting/postman/DummyRestApiEnvironment.postman_environment.json \
      --reporters cli,junit \
      --reporter-junit-export newmanResults/junitReport-regressionTests.xml
The API Data Validation Test Suite first creates employee data and then validates that the employee data has been correctly provided by the web application:

| Data Validation Testcase | API Endpoint |
| --- | --- |
| V1 Create Employee Data | http://dummy.restapiexample.com/api/v1/create |
| V2 Validate Employee Data | http://dummy.restapiexample.com/api/v1/employee/{{newId}} |

The API Data Validation Test Suite is run in the pipeline with this command:

    newman run "automatedtesting/postman/API Data Validation Test Suite.postman_collection.json" \
      --environment automatedtesting/postman/DummyRestApiEnvironment.postman_environment.json \
      --iteration-data automatedtesting/postman/Dummy-REST-API-Data.csv \
      --reporters cli,junit \
      --reporter-junit-export newmanResults/junitReport-dataValidationTests.xml
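
The logic behind testcase V2 can be sketched in Python: compare the payload sent to the create endpoint with the record later fetched for {{newId}}. The field names below are assumptions based on dummy.restapiexample.com's response format, not taken from the Postman collection:

```python
# Assumed field mapping between the create payload and the stored record,
# based on dummy.restapiexample.com's typical request/response shapes.
FIELD_MAP = {"name": "employee_name", "salary": "employee_salary", "age": "employee_age"}

def validate_employee(payload: dict, record: dict) -> list:
    """Return the list of payload fields that do not match the fetched record;
    an empty list means the data validation passed. Values are compared as
    strings because the API mixes numeric and string representations."""
    return [src for src, dst in FIELD_MAP.items()
            if str(record.get(dst)) != str(payload[src])]
```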

The Publish Test Results task is used to publish the results of the regression and data validation test runs in "JUnit" format to Azure DevOps Test Plans.

Azure DevOps Test Plans: test run summary is shown for Newman Regression Test Suite.
Azure DevOps Test Plans: test run summary is shown for Newman Data Validation Test Suite.

Deployment Stage

The following jobs are run in the Deployment stage:

Deploy and Run Selenium on Ubuntu VM

Prepare Environment and Run Selenium

Functional UI tests are defined and run in Python scripts using the Selenium WebDriver Python module and ChromeDriver to control the Chromium web browser in headless (non-GUI) mode.

The test website saucedemo.com is used for running functional UI tests such as logging into the webshop, adding items to the shopping cart, and then removing them again. After adding or removing an item, the script verifies that the item is actually in, or no longer in, the shopping cart.
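
A minimal sketch of such a test in Python, assuming the selenium package and chromedriver are installed; the element IDs and demo credentials are those published on saucedemo.com's login page, and `check_cart` is an illustrative helper, not code from the repository:

```python
def make_headless_driver():
    """Chromium driver in headless (non-GUI) mode, as used on the agent VM."""
    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options
    opts = Options()
    opts.add_argument("--headless")
    return webdriver.Chrome(options=opts)

def login(driver, user="standard_user", password="secret_sauce"):
    """Log into saucedemo.com with the demo credentials shown on its login page."""
    from selenium.webdriver.common.by import By
    driver.get("https://www.saucedemo.com/")
    driver.find_element(By.ID, "user-name").send_keys(user)
    driver.find_element(By.ID, "password").send_keys(password)
    driver.find_element(By.ID, "login-button").click()

def check_cart(cart_item_names, item, expect_present):
    """Pure check applied after adding/removing: is the item (not) in the cart?"""
    return (item in cart_item_names) == expect_present
```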

The Python scripts report their status by writing log messages to the CLI as well as to a Selenium logfile in CSV format: see the exemplary Selenium logfile in CSV format. The following data fields are logged in the CSV file:

Azure Log Analytics: Ingest Selenium Logfile

The Selenium CSV logfile is ingested into an Azure Log Analytics workspace by the Python script ingest_logs_into_azure_monitor.py. It reads the CSV file line by line and sends the log messages in POST requests to Azure Log Analytics. This scripted approach to ingesting logs was chosen because the Log Analytics agent for Linux, although installed as documented, did not report any data at all, neither standard (e.g. syslog) nor custom logs. After ingestion, the logs can be searched and analyzed with Kusto queries. The page for entering Kusto queries can be opened by clicking the "View Logs" link on the "Overview" page of the Log Analytics workspace.
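
The POST-based ingestion uses the Azure Monitor HTTP Data Collector API, which authenticates each request with an HMAC-SHA256 signature over the request metadata. A condensed Python sketch follows; the function names are illustrative, and the real implementation lives in ingest_logs_into_azure_monitor.py:

```python
import base64
import hashlib
import hmac

def build_signature(workspace_id, shared_key, date_rfc1123, content_length):
    """Authorization header value for the HTTP Data Collector API:
    HMAC-SHA256 over the canonical request string, keyed with the
    base64-decoded workspace shared key."""
    string_to_sign = (f"POST\n{content_length}\napplication/json\n"
                      f"x-ms-date:{date_rfc1123}\n/api/logs")
    digest = hmac.new(base64.b64decode(shared_key),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

def post_logs(workspace_id, shared_key, log_type, json_body: str):
    """POST one batch of JSON records; returns the HTTP status (200 on success)."""
    import urllib.request
    from email.utils import formatdate
    date = formatdate(usegmt=True)  # RFC 1123 date required by the API
    uri = (f"https://{workspace_id}.ods.opinsights.azure.com"
           "/api/logs?api-version=2016-04-01")
    req = urllib.request.Request(uri, data=json_body.encode("utf-8"), method="POST")
    req.add_header("Content-Type", "application/json")
    req.add_header("Log-Type", log_type)  # records land in table <log_type>_CL
    req.add_header("x-ms-date", date)
    req.add_header("Authorization",
                   build_signature(workspace_id, shared_key, date, len(json_body)))
    return urllib.request.urlopen(req).status
```

Records ingested this way appear in a custom table named after the Log-Type header with a `_CL` suffix, which is the table the Kusto queries then target.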

Kusto query for displaying all Selenium logs related to removed shopping cart items sorted by time generated.

Run Load Tests With JMeter and Publish HTML Test Reports

In this step, the OpenJDK Java Runtime Environment is installed and the JMeter tool is downloaded to the Azure Pipelines agent machine. Then the Endurance Test Plan and the Stress Test Plan are run.

Endurance Test Plan

The Endurance Test Plan is designed to send 6 HTTP requests every 10 seconds for 1 minute by using these settings:
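
For reference, JMeter's Constant Throughput Timer expects its target as samples per minute, so 6 requests every 10 seconds corresponds to a target of 36. A one-line conversion helper (illustrative, not part of the test plan):

```python
def constant_throughput_per_minute(requests_per_burst: int, burst_interval_s: float) -> float:
    """Convert 'N requests every M seconds' into the samples-per-minute
    target that JMeter's Constant Throughput Timer expects."""
    return requests_per_burst * (60.0 / burst_interval_s)
```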

JMeter Endurance Test Plan: Constant Throughput Timer settings.
JMeter Endurance Test Plan: Thread Groups settings.
JMeter Endurance Test Plan: parameters and JSON body values for GET, POST, and PUT requests are read from CSV files.

Stress Test Plan

The Stress Test Plan is designed to send many HTTP requests in a short amount of time by using these settings:

JMeter Stress Test Plan: "GET All" queries executed by 30 users per thread group (amounting to 90 users in total) over 30 seconds.

Azure Pipelines: Publishing of JMeter Test Reports

Azure Pipelines: list of pipeline artifacts showing zipped JMeter HTML reports for endurance and stress tests.
JMeter Endurance Test Report (HTML format).
JMeter Stress Test Report (HTML format).

Azure Monitor: Send Alarms by Email

Azure Monitor alert rules have been configured for the App Service "cicd-app-AppService" (the Fake REST API web app):

Azure Monitor: alert rules configured for App Service "cicd-app-AppService".
Azure Monitor: graph showing that alert "AppServiceCPU_TimeAlert" fired because maximum CPU time exceeded 30 seconds.
Azure Monitor: graph showing that alert "AppServiceHandleCountAlert" fired because maximum handle count exceeded 50.
Azure Monitor: email notification for alert "AppServiceCPU_TimeAlert", Page 1.
Azure Monitor: email notification for alert "AppServiceCPU_TimeAlert", Page 2.
Azure Monitor: email notification for alert "AppServiceHandleCountAlert", Page 1.
Azure Monitor: email notification for alert "AppServiceHandleCountAlert", Page 2.