NASA-AMMOS / slim

Software Lifecycle Improvement & Modernization
https://nasa-ammos.github.io/slim/
Apache License 2.0

[New Best Practice Guide]: Continuous Testing Guide and Checklist #110

Closed: riverma closed this issue 3 months ago

riverma commented 1 year ago

Checked for duplicates

Yes - I've already checked

Describe the needs

We have existing tickets and material related to testing. It would be great to converge these (and add missing capabilities) such that projects can consult a single authoritative guide to understand the architecture and checklist items they should implement for doing continuous testing "right".

Imagine having a recommendation / guide including the following:

See some of our existing tickets:

Existing material:

riverma commented 1 year ago

Folks like @GodwinShen @mike-gangl @pramirez-nasa are interested in this for the Unity Project.

yunks128 commented 1 year ago

I looked into the testing frameworks page (https://nasa-ammos.github.io/slim/docs/guides/software-lifecycle/continuous-testing/testing-frameworks), and there are several additional testing frameworks and tools that might be helpful. We might also organize the pros and cons of each framework, to help users evaluate these tools in the context of their application and development stack and determine which are most suitable for their testing needs.

For Mocking and Stubbing:

For Code Analysis and Static Code Checks:

For Security Testing:

For Performance Testing:

For Mobile Testing:

For Test Automation Frameworks:

For Cross-Browser Testing:

For Continuous Integration and Deployment:

For Containerization and Orchestration:

For AI/ML Testing:

riverma commented 1 year ago

Very nice @yunks128 - excellent ideas on suggested tools. Would you consider improving our list of recommended CT tools by making a PR for this in https://github.com/NASA-AMMOS/slim/blob/main/docs/guides/software-lifecycle/continuous-testing/testing-frameworks.md ?

yunks128 commented 1 year ago

Key phases that are important within a continuous testing pipeline:

  1. Test Planning and Design:

    • Define testing objectives and scope.
    • Create test plans, test cases, and test data.
    • Design test environments and configurations.
  2. Test Automation:

    • Develop automated test scripts for various testing levels (unit, integration, regression, etc.).
    • Implement and maintain test automation frameworks.
  3. Continuous Integration (CI):

    • Trigger automated tests on code commits to the version control system.
    • Ensure the code is automatically built and integrated into a test environment.
  4. Unit Testing:

    • Test individual components or modules in isolation.
    • Verify that each unit of code functions as expected.
  5. Integration Testing:

    • Validate interactions between integrated components.
    • Ensure that different parts of the application work together seamlessly.
  6. Regression Testing:

    • Run automated tests to identify regressions or unintended side effects.
    • Ensure that new code changes do not break existing functionality.
  7. Performance Testing:

    • Assess the software's responsiveness, scalability, and resource usage.
    • Detect and address performance bottlenecks.
  8. Security Testing:

    • Identify vulnerabilities and security risks in the software.
    • Conduct security scans, penetration testing, and code reviews.
  9. User Interface (UI) Testing:

    • Evaluate the user interface for usability and consistency.
    • Validate that the application's UI meets design and functionality requirements.
  10. Exploratory Testing:

    • Perform ad-hoc testing to discover defects and issues not covered by automated tests.
    • Mimic user behavior to uncover usability and functional issues.
  11. Continuous Deployment (CD):

    • Automatically deploy tested code to production or staging environments.
    • Ensure that deployment is reliable and rollback mechanisms are in place.
  12. Monitoring and Feedback:

    • Continuously monitor the production environment for performance and security issues.
    • Gather feedback from users and stakeholders to inform future testing efforts.
  13. Reporting and Analysis:

    • Generate test reports and dashboards to communicate test results.
    • Analyze test data to identify trends and areas for improvement.
  14. Test Environment Management:

    • Provision and manage test environments to mirror production.
    • Ensure that test environments are consistent and up-to-date.
  15. Maintenance and Test Data Management:

    • Maintain and update test automation scripts as the application evolves.
    • Manage and refresh test data to ensure relevance and accuracy.
  16. Continuous Improvement:

    • Review and refine testing processes based on lessons learned.
    • Identify opportunities for test optimization and efficiency.
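
To make the unit-testing phase (4) concrete, here is a minimal sketch of pytest-style test functions. The function under test (`circular_orbit_velocity`) is hypothetical and only for illustration:

```python
import math

# Hypothetical unit under test: circular orbital speed helper.
def circular_orbit_velocity(mu: float, radius_m: float) -> float:
    """Return circular orbital speed in m/s for gravitational parameter mu."""
    if radius_m <= 0:
        raise ValueError("radius must be positive")
    return math.sqrt(mu / radius_m)

# Phase 4: verify each unit of code functions as expected, in isolation.
# Functions named test_* are auto-discovered by pytest.
def test_low_earth_orbit_speed_is_roughly_7_7_km_s():
    v = circular_orbit_velocity(3.986e14, 6.771e6)  # Earth's mu, ~400 km altitude
    assert abs(v - 7672.6) < 10.0

def test_nonpositive_radius_is_rejected():
    try:
        circular_orbit_velocity(3.986e14, 0.0)
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for nonpositive radius")

if __name__ == "__main__":
    test_low_earth_orbit_speed_is_roughly_7_7_km_s()
    test_nonpositive_radius_is_rejected()
    print("all tests passed")
```

In a continuous testing pipeline, the same tests would run automatically on every commit (phase 3) and again during regression runs (phase 6).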

Are there any missing phases throughout the development lifecycle? Are there other considerations for high-quality software with reduced risks and faster release cycles?

jpl-jengelke commented 1 year ago

> Key phases that are important within a continuous testing pipeline: ... Are there any missing phases throughout the development lifecycle? Are there other considerations for high-quality software with reduced risks and faster release cycles?

I would move (13) after (15), and also move items (2), (3), and (11) [in that order] after (16). The reason for this change is that CI/CD does not work until some kind of testing is established, and reporting really comes after tests. The greatest challenge I have noted in projects isn't CI/CD but the dearth of tests in all areas: CI/CD doesn't work without any tests.

Note that continuous testing exists without CI/CD. (Example: Build systems like Maven or Make that run tests during development builds.) CI/CD is a mechanism to test combined code bases that promotes visibility to external observers. I'm also considering the extent to which CI/CD should be included here so as not to conflate Continuous Testing and Continuous Integration.
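
For example, in Maven the Surefire plugin binds unit tests to the default build lifecycle, so tests run on every local `mvn package` with no CI/CD server involved. The fragment below is illustrative; the version number is a placeholder:

```xml
<!-- Illustrative pom.xml fragment: Surefire runs unit tests automatically
     during every build, so continuous testing happens in ordinary
     development builds, independent of any CI/CD server. -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>3.1.2</version>
    </plugin>
  </plugins>
</build>
```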

yunks128 commented 1 year ago

Continuous testing key phases in 5 steps (updated: removed CI/CD and merged the remaining 14 steps)

  1. Planning and Design: Define testing objectives, create test plans, design environments, and develop test cases.

  2. Test Execution: Execute automated tests for unit, integration, regression, performance, security, and UI.

  3. Feedback and Monitoring: Continuously monitor production, gather user feedback, and address issues.

  4. Environment and Data Management: Manage test environments and data to maintain consistency and relevance.

  5. Improvement and Reporting: Review and refine processes, analyze results, and communicate findings to optimize testing efforts.

yunks128 commented 1 year ago

A list of tools and frameworks commonly used for each of the 5 steps in continuous testing:

1. Planning and Design:

2. Test Execution:

3. Feedback and Monitoring:

4. Environment and Data Management:

5. Improvement and Reporting:

yunks128 commented 1 year ago

Continuous testing steps are consolidated into three steps:

1. Planning and Design:

2. Test Implementation:

3. Improvement and Reporting:

riverma commented 1 year ago

@yunks128 - to add a bit more from our discussion about a Test Plan Template to offer to projects, one that helps them plan the above workflow with specific action steps they can communicate to their teams: I've put together a draft template and an example for the Unity U-CS project we can use to iterate further. See below. Thoughts? Are we missing material in the template that is important for drafting a common set of guidelines for continuous testing? CC @NASA-AMMOS/slim-tsc @NASA-AMMOS/slim-community

Template

### [INSERT PROJECT NAME HERE] Continuous Testing Plan

#### Introduction:
This document aims to provide a general approach to Continuous Testing for [INSERT PROJECT NAME HERE]. It encompasses planning, test phases, tool recommendations, and test specifications.

---

#### **1. Project Overview**
- **Project Name:** [INSERT PROJECT NAME HERE]
- **Project Description:** [INSERT SHORT PROJECT DESCRIPTION HERE]
- **Testing Lead:** [INSERT PROJECT LEAD NAME HERE]

#### **2. Test Requirements**
- **Objective:** Why are we testing? [INSERT OBJECTIVE HERE]
  <!-- Example: "Ensure that the codebase remains free of critical vulnerabilities and maintains high performance." -->

- **Test Artifacts:** What are we testing? [INSERT ARTIFACTS HERE]
  <!-- Example: "Web APIs, Front-end components, Database layer." -->

#### **3. Testing Workflow Architecture**

```mermaid
graph TD;
    A[Code]
    B
    C

    A --> B;
    B --> C;

    subgraph B[Local Testing Venue]
        D[Unit Testing]
        E[Security Testing]
        F[...]
    end

    subgraph C[Integration Testing Venue]
        G[Regression Testing]
        H[System Testing]
        I[Performance Testing]
        J[Security Testing]
        K[Requirements Verification & Validation]
        L[Deployment Testing]
        M[...]
    end
```

#### **4. Test Specifications**

Before customizing the table entries below, evaluate your project's specific needs and requirements. Add, remove, or modify rows in the table to best represent the phases, tests, tools, and people associated with your testing approach.

| Phase | High-level Tests | Recommended Tool | Description | Starter Kit | Key People |
|---|---|---|---|---|---|
| Security Testing | Vulnerability Scan | GitHub Dependabot | Monitors dependencies for known vulnerabilities. | GitHub Docs | [INSERT NAME HERE] |
| | Secure Code Review | SonarQube | Continuous inspection of code quality. | SonarQube Docs | [INSERT NAME HERE] |
| Unit Testing | Function Validity | JUnit | Verifies individual units of Java software. | JUnit 5 Guide | [INSERT NAME HERE] |
| | Null Checks | xUnit | Unit testing tool for .NET. | xUnit.net Docs | [INSERT NAME HERE] |
| Regression Testing | Feature Consistency | Cucumber | Supports behavior-driven development (BDD). | Cucumber Starter | [INSERT NAME HERE] |
| | User Flow Validation | Selenium | Ensures that new code changes do not adversely affect existing functionalities. | Selenium HQ | [INSERT NAME HERE] |
| Integration Testing | API Contract Validation | REST Assured | Java DSL for simplifying testing of REST-based services. | REST Assured Guide | [INSERT NAME HERE] |
| | Data Flow Checks | Postman | Validates the interfaces and interactions between different software modules. | Postman Learning Center | [INSERT NAME HERE] |
| Performance Testing | Load Testing | JMeter | Measures system performance under various conditions. | JMeter User Manual | [INSERT NAME HERE] |
| Requirements Verification & Validation | Requirement Traceability | TestRail | Ensures that the system meets the defined requirements. | TestRail Docs | [INSERT NAME HERE] |
| Deployment Testing | Cloud Deployment | Terraform | Infrastructure as code for cloud provisioning. | Terraform Get Started | [INSERT NAME HERE] |
| | Chaos Testing | Chaos Monkey | Simulates random failures to test system resilience. | Chaos Monkey Wiki | [INSERT NAME HERE] |

#### **5. Implementation Checklist**
- [ ] Continuous Testing Plan (this document) defined, including all relevant parts
- [ ] Security Testing tools, people, and tests implemented
- [ ] Unit Testing tools, people, and tests implemented
- [ ] ...
- [ ] All planned test cases defined in the test plan are implemented.
- [ ] Defects are reported and tracked.
- [ ] Test summary report is generated and shared with stakeholders.

## Example of using the template for U-CS

---
### Unity Common Services (U-CS) Continuous Testing Plan

#### Introduction:
This document aims to provide a general approach to Continuous Testing for Unity Common Services (U-CS) within the Unity project. It encompasses planning, test phases, tool recommendations, and test specifications.

---

#### **1. Project Overview**
- **Project Name:** Unity Common Services (U-CS)
- **Project Description:** U-CS is a service area within the Unity project, providing functionalities including Building and Publishing Artifacts, Deployment, Continuous Integration, Management & Monitoring, Cloud Costs and Budgeting, Security, Authentication, and Authorization. Unity is a next-generation, service-based science data system developed at NASA's Jet Propulsion Laboratory.
- **Testing Lead:** Galen Hollins

#### **2. Test Requirements**
- **Objective:** Ensure the robustness, security, and efficiency of U-CS services, while maintaining a seamless integration with the Unity project's core functionalities.
  <!-- Example: "Ensure that the codebase remains free of critical vulnerabilities and maintains high performance." -->

- **Test Artifacts:** U-CS services including Deployment, Continuous Integration, Management & Monitoring, and Security.
  <!-- Example: "Web APIs, Front-end components, Database layer." -->

#### **3. Testing Workflow Architecture**

```mermaid
graph TD;
    A[Code]
    B
    C

    A --> B;
    B --> C;

    subgraph B[Local Testing Venue]
        D[Unit Testing]
        E[Security Testing]
        F[...]
    end

    subgraph C[Integration Testing Venue]
        G[Regression Testing]
        H[System Testing]
        I[Performance Testing]
        J[Security Testing]
        K[Requirements Verification & Validation]
        L[Deployment Testing]
        M[...]
    end
```

#### **4. Test Specifications**

Before customizing the table entries below, evaluate your project's specific needs and requirements. Add, remove, or modify rows in the table to best represent the phases, tests, tools, and people associated with your testing approach.

| Phase | High-level Tests | Recommended Tool | Description | Starter Kit | Key People |
|---|---|---|---|---|---|
| Security Testing | Vulnerability Scan | GitHub Dependabot | Monitors dependencies for known vulnerabilities. | GitHub Docs | [INSERT NAME HERE] |
| | Secure Code Review | SonarQube | Continuous inspection of code quality. | SonarQube Docs | [INSERT NAME HERE] |
| Unit Testing | Function Validity | JUnit | Verifies individual units of Java software. | JUnit 5 Guide | [INSERT NAME HERE] |
| | Null Checks | xUnit | Unit testing tool for .NET. | xUnit.net Docs | [INSERT NAME HERE] |
| Regression Testing | Feature Consistency | Cucumber | Supports behavior-driven development (BDD). | Cucumber Starter | [INSERT NAME HERE] |
| | User Flow Validation | Selenium | Ensures that new code changes do not adversely affect existing functionalities. | Selenium HQ | [INSERT NAME HERE] |
| Integration Testing | API Contract Validation | REST Assured | Java DSL for simplifying testing of REST-based services. | REST Assured Guide | [INSERT NAME HERE] |
| | Data Flow Checks | Postman | Validates the interfaces and interactions between different software modules. | Postman Learning Center | [INSERT NAME HERE] |
| Performance Testing | Load Testing | JMeter | Measures system performance under various conditions. | JMeter User Manual | [INSERT NAME HERE] |
| Requirements Verification & Validation | Requirement Traceability | TestRail | Ensures that the system meets the defined requirements. | TestRail Docs | [INSERT NAME HERE] |
| Deployment Testing | Cloud Deployment | Terraform | Infrastructure as code for cloud provisioning. | Terraform Get Started | [INSERT NAME HERE] |
| | Chaos Testing | Chaos Monkey | Simulates random failures to test system resilience. | Chaos Monkey Wiki | [INSERT NAME HERE] |

#### **5. Implementation Checklist**

tariqksoliman commented 1 year ago

As an alternative to Selenium, and while I have not used it myself (yet), I know of other teams who have used and praised https://github.com/microsoft/playwright (primarily JS, but there's support for testing in Java and Python too).

yunks128 commented 1 year ago

@riverma Thanks for the testing plan template! It looks great. I would suggest adding the following:

  1. Acceptance Criteria
    • All planned test cases are executed.
    • Defects are reported and tracked.
    • Test summary report is generated and shared with stakeholders.

Scotchester commented 1 year ago

> Unit Testing:
>
> • JUnit, TestNG (Java), xUnit (Python), NUnit (C#): For unit testing.

Sorry for the drive-by comment when I have been unable to stay involved in SLIM over the past year, but I don't believe that xUnit works with Python. Perhaps pytest would be a good recommendation? Here's a snippet from the 2022 JetBrains Python Developers Survey:

[Image: graph of preferred unit-testing framework survey data, with pytest at 51%, unittest at 24%, and a smattering of other options all at 10% or less]

riverma commented 1 year ago

Great recommendations @Scotchester and @tariqksoliman! I agree @yunks128 - we'll need a second pass at some tool recommendations based on:

It'd be great to augment your developer poll and conversation materials with the latest on the above before sharing.

ddalton-swe commented 1 year ago

Pre-commit hooks serve as a great supplement to basic scanning and testing. They are also fairly project-agnostic, and SLIM could provide a template of basic pre-commit hooks for projects.

riverma commented 1 year ago

> pre-commit hooks serve as a great supplement to basic scanning and testing. They are also fairly project-agnostic and SLIM could provide a template for basic pre-commit hooks for projects

@ddalton212 - great idea. We're thinking (depending on the test) that the idea of a "testing venue" can convey the need to run some tests on the developer's machine. We've mentioned that in the architecture template above. Sometimes we may want to do both local and server-side checks: for example, if a developer uses GitHub.com's Codespaces to make a commit instead of a local terminal, a pre-commit hook installed only locally might miss it.

Do you have favorite pre-commit hooks you recommend?

jpl-jengelke commented 1 year ago

It's difficult to provide a comprehensive list of software recommendations that is consistently updated that we're also expert in using. I found that the OWASP DevSecOps Guideline provides a fairly consistent and technology independent list of software products applicable across various areas of testing concerns.

For instance, their list of static code analysis tools are presented here. (This is only one list of several testing area tools in the guide.)

Perhaps we should consider listing our preferred tools and then defer to something like OWASP for the more comprehensive list.

ddalton-swe commented 1 year ago

> pre-commit hooks serve as a great supplement to basic scanning and testing. They are also fairly project-agnostic and SLIM could provide a template for basic pre-commit hooks for projects
>
> @ddalton212 - great idea. We're thinking (depending on the test) that the idea of "testing venue" can convey the need to do some tests on the developer's machine. We've mentioned that in the architecture template above. Sometimes we may want to do both local and server-side - for example if a developer uses GitHub.com's codespaces to make a commit instead of a local terminal a pre-commit hook installed only locally might miss it.
>
> Do you have favorite pre-commit hooks you recommend?

There are a few (https://pre-commit.com/hooks.html). I have implemented the Terraform hooks for some projects. I also enjoy using something like Ruff for Python linting. It can be a helpful bottom layer that prevents broken commits from entering the codebase. Of course, hooks aren't perfect and can be skipped or cheated if a developer is motivated enough; in that case it may be more helpful to have a job that runs the same linting in Jenkins or GitHub Actions.
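
For reference, a minimal `.pre-commit-config.yaml` along these lines might look like the sketch below. The `rev` values are illustrative placeholders and should be pinned to current releases:

```yaml
# Illustrative .pre-commit-config.yaml; pin `rev` values to current releases.
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-merge-conflict
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.1.9
    hooks:
      - id: ruff  # Python linting via Ruff, as mentioned above
```

The same checks can then be re-run server-side (e.g. `pre-commit run --all-files` in a Jenkins or GitHub Actions job), which catches commits made without local hooks installed.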

ingyhere commented 8 months ago

Added the trade study requirement pulled from the scope of #25, since this issue is a more logical place for it than a ticket for a specific tool (SCRUB) demonstration.

riverma commented 3 months ago

Resolved in https://github.com/NASA-AMMOS/slim/releases/tag/v1.5.0