Closed PaulMRamirez closed 3 months ago
@ewferg is also interested in this issue, which came up as part of the AMMOS Developer Summit as noted by @riverma. This could potentially qualify as a best practice guide as well; however, first understanding what metrics we could collect, and how, seems to be the right first step.
CC @ddalton212 - I wonder if your experience with Apache DevLake would be relevant here. Do you think DevLake could help with the above metrics?
@riverma I do think that we could use Apache DevLake to track some of these, especially if we can wrap these metrics into the main DORA metrics.
There are many tools for tracking these metrics. We may want to do some sort of trade study on the best tool and then try to test that with some open source project.
@riverma In addition to DevOps metrics, code quality can be quantified by SonarQube, including complexity, duplications, maintainability, reliability, security, and size. These metrics may be relevant to cost estimation and planning/scheduling of capabilities that @PaulMRamirez mentioned.
SonarQube provides a lot of code detail, and it's great. I don't believe it provides all the types of detail used in DORA metrics, like collating time to deploy, test metrics, etc. But starting with the SonarQube metrics may simply be good enough ...
I would agree that SonarQube is a great start for metrics collection. It can also be integrated with GitHub Actions. DORA would then serve as a general overview of software delivery performance.
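As a hedged sketch of what pulling SonarQube measures might look like, the snippet below queries SonarQube's `api/measures/component` web API endpoint using only the standard library. The host name, project key, and metric selection are placeholders for illustration, not values taken from this thread, and the authentication style (user token as basic-auth username) may vary by SonarQube version.

```python
import base64
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

# Placeholder values -- substitute your own server, project key, and metrics.
SONAR_HOST = "https://sonarqube.example.com"
PROJECT_KEY = "my-project"
METRIC_KEYS = "complexity,duplicated_lines_density,reliability_rating,security_rating,ncloc"

def build_measures_url(host: str, project: str, metrics: str) -> str:
    """Build the URL for SonarQube's component-measures endpoint."""
    query = urlencode({"component": project, "metricKeys": metrics})
    return f"{host}/api/measures/component?{query}"

def parse_measures(payload: dict) -> dict:
    """Flatten the API response into a {metric: value} map."""
    return {m["metric"]: m["value"] for m in payload["component"]["measures"]}

def fetch_measures(host: str, project: str, metrics: str, token: str) -> dict:
    """Fetch and flatten measures. SonarQube accepts a user token as the
    basic-auth username with an empty password."""
    auth = base64.b64encode(f"{token}:".encode()).decode()
    req = Request(build_measures_url(host, project, metrics),
                  headers={"Authorization": f"Basic {auth}"})
    with urlopen(req) as resp:
        return parse_measures(json.load(resp))
```

A CI job (e.g., a GitHub Actions step) could call `fetch_measures` after each analysis run and archive the results alongside the DORA process metrics.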
@PaulMRamirez I'll be mapping the DORA metrics to the metrics listed above:
Metrics to drive cost estimation and planning/scheduling of capabilities - All 4 metrics
How such metrics can be collected and made useful to developers and managers of said products - Metrics Usage
Is there a benefit to the open source community in having developers label issues by complexity (high, medium, low) as it relates to cost metrics - These can feed into Change Lead Time
Metrics that track product usage and community engagement - This could be tracked within DevLake if needed
Metrics that give users a sense of product characteristics (performance in different environments, code size, etc.) - This would be another use case for Software Delivery Performance Metrics
Here is a preview of the DORA metrics guide.
Measuring the performance of software delivery is crucial for development teams and organizations. Accurate and reliable metrics provide insights into how efficiently and effectively software is developed, tested, and deployed. One of the most widely respected methodologies for assessing software delivery performance is the DORA (DevOps Research and Assessment) metrics.
This README provides an overview of what DORA metrics are and why they are considered the best way to measure software delivery performance.
DORA metrics, introduced by the team behind the State of DevOps Report, are a set of key performance indicators (KPIs) used to evaluate the efficiency and effectiveness of software development and delivery processes. These metrics were developed through extensive research and are widely recognized as industry standards for evaluating DevOps and software delivery practices.
The core DORA metrics include:
Lead Time for Changes: This metric measures the time it takes to go from code commit to the deployment of that code to production. It helps assess how quickly new features and fixes are delivered to users.
Deployment Frequency: This metric indicates how often code changes are deployed to production. Frequent deployments can lead to more responsive and agile software development practices.
Change Failure Rate: The change failure rate assesses the percentage of changes that result in service disruptions or defects after deployment. Lower failure rates are indicative of more reliable software delivery processes.
Mean Time to Recovery (MTTR): MTTR measures how quickly an organization can recover from incidents or outages. Faster recovery times reduce downtime and minimize the impact of failures.
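To make these four definitions concrete, here is a minimal sketch of how they could be computed from a list of deployment records. The data and record layout are invented for illustration and are not tied to any particular tool mentioned in this thread.

```python
from datetime import datetime, timedelta

# Hypothetical records: (commit_time, deploy_time, caused_failure, recovery_minutes)
deployments = [
    (datetime(2023, 10, 2, 9, 0),  datetime(2023, 10, 2, 15, 0),  False, 0),
    (datetime(2023, 10, 3, 10, 0), datetime(2023, 10, 4, 11, 0),  True,  90),
    (datetime(2023, 10, 5, 8, 0),  datetime(2023, 10, 5, 12, 0),  False, 0),
    (datetime(2023, 10, 6, 9, 30), datetime(2023, 10, 6, 16, 30), True,  30),
]

# Lead Time for Changes: mean commit-to-production time.
lead_times = [deploy - commit for commit, deploy, _, _ in deployments]
mean_lead_time = sum(lead_times, timedelta()) / len(lead_times)  # 10.5 hours here

# Deployment Frequency: deployments per day over the observed window.
window_days = (deployments[-1][1] - deployments[0][1]).days or 1
deploy_frequency = len(deployments) / window_days

# Change Failure Rate: share of deployments that caused a disruption.
failures = [d for d in deployments if d[2]]
change_failure_rate = len(failures) / len(deployments)  # 0.5 here

# MTTR: mean recovery time across failed deployments.
mttr_minutes = sum(d[3] for d in failures) / len(failures)  # 60.0 here
```

Real trackers like DevLake derive the same quantities from webhook and pipeline data rather than hand-entered records, but the arithmetic is essentially this.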
DORA metrics have gained widespread recognition as the best way to measure software delivery performance for several compelling reasons:
Empirical Validation: DORA metrics have been rigorously researched and empirically validated. The State of DevOps Report, based on a large dataset of real-world organizations, demonstrates a strong correlation between high-performing organizations and their adherence to these metrics.
Focus on Key Outcomes: DORA metrics emphasize outcomes that matter most to both development teams and business stakeholders. They address lead time, deployment frequency, and failure rates, which are essential for achieving agile, reliable, and responsive software delivery.
Alignment with DevOps Principles: DORA metrics align with the core principles of DevOps, promoting collaboration, automation, and a continuous improvement mindset. By using these metrics, teams can assess their progress in adopting DevOps practices.
Practical and Actionable: These metrics are not just theoretical benchmarks but provide actionable insights. Teams can use them to identify areas for improvement and track the impact of changes in their delivery processes.
Benchmarking and Peer Comparison: DORA metrics enable organizations to benchmark themselves against industry standards and compare their performance with peer organizations. This helps set realistic goals and identify areas for improvement.
Management and Leadership Buy-In: DORA metrics are designed to communicate software delivery performance to management and leadership in a clear and understandable way. This fosters support for improvements and investments in the development process.
To measure software delivery performance using DORA metrics, follow these steps:
Analyze and Interpret: Calculate the values of these metrics and interpret the results. Identify areas where improvements are needed.
Set Goals: Set specific, measurable goals for each metric. Define targets that are relevant to your organization's context and objectives.
Implement Changes: Make changes to your development and delivery processes to improve your performance on these metrics. This may involve adopting DevOps practices, automating tasks, and improving collaboration.
Regularly Monitor and Adjust: Continuously monitor and track your performance against the chosen DORA metrics. Adjust your strategies and practices as needed to meet your goals.
DORA metrics are widely recognized as the best way to measure software delivery performance. They provide empirical, actionable, and meaningful insights into the efficiency and effectiveness of software development and delivery processes. By focusing on lead time, deployment frequency, change failure rate, and MTTR, organizations can align their practices with DevOps principles, drive continuous improvement, and deliver high-quality software to their users.
For further guidance and detailed benchmarking data, refer to the State of DevOps Report and other resources provided by the DORA team.
@ddalton-jpl The DORA Metrics guide looks comprehensive and informative! One suggestion would be to accompany the tool list with a brief description of their capabilities and unique features to help the readers understand their suitability. (That might be added after a trade study.)
@ddalton-jpl - great work here! Good to know there's strong industry backing for this approach.
I think the value added for this ticket could be to (1) finalize a common set of metrics that intersects your experience with DORA and the metrics mentioned in this ticket / expressed among the SLIM community, and (2) offer a starter kit that implements those metrics and makes it easy for projects to get started with a common configuration.
What do you think? If you like that approach - can we draft a list for (1) and offer thoughts for (2)?
@riverma would it be valuable for me to run a trade study before some of the other steps? I think I would need to narrow it down to 3-4 and provide a table with the pros and cons of each before settling on one for a starter kit
@ddalton-jpl sounds great - a trade-study would be helpful for folks to understand the best tool of choice for the metrics we need. You may want to map the metrics that were mentioned by @PaulMRamirez in this ticket as criteria within the trade study (how well / easily they can be captured).
BTW - do check out https://chaoss.community for options as well to include in your study. They look more at the health of OSS communities but they have many recommendations for metric collection.
@riverma here is the trade study. I tried to be generic with the criteria while also mentioning what @PaulMRamirez wanted.
This README outlines a trade study conducted to compare various free and open-source tools for tracking DevOps Research and Assessment (DORA) metrics.
The primary objective of this trade study is to assess and compare different free and open-source DORA metrics tracking tools.
This trade study will focus on comparing a selection of popular free and open-source DORA metrics tracking tools.
Selection of Open-Source Tools: A list of relevant free and open-source DORA metrics tracking tools was compiled based on popularity, community recommendations, and industry recognition.
Evaluation Criteria: A set of criteria was established to assess the open-source tools. These criteria will be detailed in the "Evaluation Criteria" section.
Assessment: Each open-source tool was tested and evaluated against the defined criteria. This involved installing and configuring the tools, exploring their functionality, and assessing their performance.
Documentation and Review: A comprehensive review of the open-source tools was documented, including their strengths, weaknesses, and notable features.
Analysis: The findings and insights from the assessment were analyzed to make informed recommendations.
The following free and open-source DORA metrics tracking tools were evaluated in this study:
Tool Name | Description |
---|---|
Apache DevLake | Apache DevLake is an open-source dev data platform that ingests, analyzes, and visualizes the fragmented data from DevOps tools to extract insights for engineering excellence, developer experience, and community growth. |
ThoughtWorks Metrik | For development teams who want to measure their software delivery and operational (SDO) performance, this is a tool that helps them collect data from CD pipelines and visualize the key metrics in a friendly format. |
FourKeys | A platform for monitoring the four key metrics of software delivery. |
For the evaluation of DORA metrics tracking tools, we will consider the following criteria:
Ease of Installation and Configuration: How long does it take to set up the tool? 1(longest) - 5(fastest)
Integration with Other Tools: How well does the open-source tool integrate with GitHub, Jenkins, JIRA, etc.? 1 (poor integration) - 5 (excellent integration)
Community and Support: What is the level of community and vendor support available for the open-source tool? 1(little support) - 5(fully supported)
Customization and Extensibility: To what extent can the open-source tool be customized and extended to fit specific requirements, like tracking metrics for product usage and community engagement? 1 (no customization) - 5 (fully customizable)
Tool Name | Ease of Installation and Configuration | Integration with Other Tools | Community and Support | Customization and Extensibility | Total Score |
---|---|---|---|---|---|
Apache DevLake | 5 | 5 | 5 | 5 | 20 |
ThoughtWorks Metrik | 3 | 4 | 2 | 3 | 12 |
FourKeys | 2 | 3 | 3 | 2 | 10 |
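The total scores and ranking in the table above follow from an equal-weight sum of the four criteria, which can be reproduced with a few lines (per-criterion scores copied straight from the table):

```python
# Per-criterion scores from the evaluation table:
# [installation, integration, community/support, customization]
scores = {
    "Apache DevLake": [5, 5, 5, 5],
    "ThoughtWorks Metrik": [3, 4, 2, 3],
    "FourKeys": [2, 3, 3, 2],
}

# Total each tool's score and rank from best to worst.
totals = {tool: sum(vals) for tool, vals in scores.items()}
ranking = sorted(totals, key=totals.get, reverse=True)
```

Equal weighting is an assumption of this study; a team that cares more about, say, integration could apply per-criterion weights, though with gaps this large the ranking would likely not change.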
In the analysis section, we delve into the key findings and observations from the evaluation of the DORA metrics tracking tools.
Ease of Installation and Configuration: Apache DevLake scored the highest in this category, indicating that it is relatively straightforward to set up and configure.
Integration with Other Tools: It excels in integrating with other tools, including GitHub, Jenkins, and JIRA, which demonstrates its ability to seamlessly fit into existing DevOps ecosystems.
Community and Support: Apache DevLake boasts strong community and vendor support.
Customization and Extensibility: Apache DevLake provides extensive customization options, allowing metrics tracking to be tailored to specific needs.
Ease of Installation and Configuration: While not the fastest to set up, ThoughtWorks Metrik offers a reasonable installation and configuration process.
Integration with Other Tools: It offers decent integration capabilities with other tools, which can be enhanced to improve its compatibility with a wider range of DevOps tools.
Community and Support: ThoughtWorks Metrik's community and vendor support are relatively modest, and this may pose challenges for teams seeking robust support options.
Customization and Extensibility: This tool provides some room for customization but is not as flexible as Apache DevLake in this regard.
Ease of Installation and Configuration: FourKeys falls behind in terms of installation and configuration, which might require additional time and effort.
Integration with Other Tools: It offers moderate integration with other tools, and there is room for improvement to enhance its compatibility further.
Community and Support: FourKeys has a decent level of community and vendor support but may not be as well-supported as Apache DevLake.
Customization and Extensibility: It provides a moderate level of customization and extensibility, making it a suitable choice for teams with relatively standard requirements.
Apache DevLake stands out as a strong candidate, particularly for easy installation, robust integration, strong support, and high customization potential. ThoughtWorks Metrik offers a reasonable alternative with a balanced set of features, though it may require additional effort for integration and has more limited support. FourKeys, while a viable option, may need improvement in its installation, configuration, and integration aspects.
Based on the evaluation results, we offer the following recommendations:
Teams seeking a comprehensive DORA metrics tracking tool with strong integration capabilities, support, and customization potential should consider Apache DevLake as their top choice.
ThoughtWorks Metrik can be a suitable option for teams with moderate requirements and a willingness to invest additional effort in integration and customization.
FourKeys may be a suitable choice for teams with more basic requirements and a preference for moderate installation and configuration complexity.
Reading through this, it appears that we would want to think through the appropriate tooling in the areas of product metrics, process metrics, project metrics, and people metrics, as described for example at https://radixweb.com/blog/software-engineering-metrics. I'm leaning towards ease of integration and ease of understanding and analysis of the metrics collected.
@ddalton-jpl It appears that DORA answers the process metrics area. @yunks128 SonarQube would fit into product metrics.
The other area of interest is community metrics, which @riverma referred to at https://chaoss.community/.
It would be nice to write these up at a high level, along with what tools are available in these general areas. In addition, we should come up with characteristics to report on for each tool identified.
@PaulMRamirez thanks for the feedback. As a starting point, would a focus on DORA metrics be beneficial? This approach doesn't capture all of the metrics, but it does provide us with a starting point that we can then expand upon. If we can limit the scope to those metrics, I could take a deeper look at implementation and develop a starter kit.
@ddalton-jpl From my reading, the DORA metrics seem focused on getting something into a production setting. I don't think we have enough CI/CD pipelines that make it all the way into production to track those metrics, and such pipelines can additionally span organizations at JPL.
For my current tasks the product based metrics and community metrics would be the first ones I'd be interested in at the start. This is not to say I would not be interested in DORA I just could not put it to good use at this time.
@PaulMRamirez Great point! I have updated the metrics guide to include some tools for measuring these. I have also found that Apache DevLake can track metrics related to community and product.
The performance of software delivery is a crucial aspect of development for teams and organizations. Accurate metrics play a pivotal role in providing insights into the efficiency and effectiveness of software development, testing, and deployment processes. This document offers an overview of various software metrics and their importance in assessing software delivery performance.
Measuring software delivery performance is essential for understanding and improving the software development process. Metrics offer a basis for evaluating the quality and efficiency of development, allowing teams to make data-driven decisions. Effective software delivery metrics can:
DORA (DevOps Research and Assessment) metrics are a set of key performance indicators (KPIs) used to evaluate the efficiency and effectiveness of software development and delivery processes. These metrics are particularly crucial for understanding the process aspects of software delivery. DORA metrics include:
Lead Time for Changes: Measures the time it takes to go from code commit to deployment in production. This metric helps assess the speed of feature delivery.
Deployment Frequency: Indicates how often code changes are deployed to production, promoting agility.
Change Failure Rate: Assesses the percentage of changes leading to service disruptions or defects after deployment, which highlights reliability.
Mean Time to Recovery (MTTR): Measures the time taken to recover from incidents or outages, reducing downtime.
When considering software metrics, it's important to categorize them into different areas, depending on the aspect of software development they address. These metrics can be broadly categorized into four key areas:
Product metrics focus on assessing the quality, performance, and user satisfaction related to the software product itself. These metrics are vital for understanding the result of the development process.
Process metrics, particularly DORA metrics, assess the efficiency and effectiveness of the software development process. They provide insights into how well teams execute their tasks, adhere to best practices, and ensure a streamlined delivery process.
Project metrics concentrate on the management and progress of individual software projects. They help track project timelines, budgets, and resource allocation to ensure projects stay on track.
People metrics evaluate the performance and satisfaction of individuals involved in the software development process. These metrics can include team productivity, job satisfaction, and skill development, offering insights into the human aspect of software development.
To effectively measure software delivery performance, organizations often rely on a combination of tools that cover a range of software metrics categories. These tools can help collect and analyze data to improve software development processes.
To make the most of software delivery metrics, consider the following steps:
Analyze and Interpret: Calculate and interpret the values of relevant metrics. Identify areas that require improvement based on the collected data.
Set Goals: Establish specific, measurable goals for each metric. Define targets that align with your organization's objectives and context.
Implement Changes: Make adjustments to your software development and delivery processes to enhance performance on these metrics. This may involve process optimization, automation, and improved collaboration.
Regularly Monitor and Adjust: Continuously track your performance against the selected metrics. Make changes to strategies and practices as needed to achieve your established goals.
Effective software delivery metrics, including DORA metrics, play a pivotal role in enhancing the software development process. By focusing on product, process, project, and people metrics, organizations can make data-driven improvements, set and achieve realistic goals, and ensure the delivery of high-quality software products.
@PaulMRamirez If you're okay with DevLake, I can make a starter kit for that and you can try it out.
As @ddalton-jpl has noted, DevLake does provide product and community metrics. For example, you could easily generate a timeline of product (portfolio) releases over time, or a burndown chart of issues resolved for the next target release.
There are a whole host of community metrics you can aggregate via DevLake, the examples they provide are:
Maybe more specificity from your end, @PaulMRamirez, on exactly which metrics you are interested in and why, would help.
I also wanted to drop in software instrumentation metrics which can be provided via Apache Flagon. If this is of interest then we/I could build a Flagon integration for DevLake.
Interesting @lewismc - thank you for your advice and suggestions! If I understand Flagon right - it seems like an alternative for Google Analytics - mostly for websites? Or does it have the capability of somehow interacting with other types of user interfaces as well? For example: desktop clients, command-line tools, etc.
Hi @riverma Good questions. With Flagon, as long as your software can embed JavaScript code, then you can instrument it… The following resources demonstrate some example usage:
- installation: import as library (NodeJS) or literally embed script into HTML or templating framework...
- webpack example
In terms of comparison to Google Analytics, it is similar. A unique feature of Flagon is that you can instrument user activity across a portfolio of applications, whereas Google Analytics only exposes user events for one domain.
Excellent - thank you for the clarification @lewismc!
@ddalton-swe @riverma I had a chance to go through your metrics guide on the repository that I'm testing (unity-sps), and it works well! It's easy to follow and straightforward. Thanks for your great work!
A couple of comments:
`install_devlake.sh` proceeds to the next steps even when a previous step fails. For example, Step 2's download failed below, but the script continued anyway. It may be confusing to users when they see `=== Installation Completed ===` even though the installation actually failed. (Having the script exit on the first error, e.g. with `set -e` or explicit exit-status checks, would avoid this.)
```
=== Apache DevLake Installation ===
Step 1: Prerequisites
Make sure you have Docker v19.03.10+ and docker-compose v2.2.3+ installed.
If you have Docker Desktop installed, docker-compose is already included.
Step 2. Downloading docker-compose.yml and env.example
/bin/bash: line 27: wget: command not found
/bin/bash: line 28: wget: command not found
chmod: docker-compose.yml: No such file or directory
chmod: env.example: No such file or directory
Step 3. Renaming env.example to .env...
mv: rename env.example to .env: No such file or directory
Step 4. Generating encryption key...
Step 5: Launching DevLake with Docker Compose
no configuration file provided: not found
no configuration file provided: not found
Step 6: Collect data and view dashboards
Visit http://localhost:4000 in your browser to configure DevLake and collect data.
=== Installation Completed ===
```
Need to fix the link below:
We value your feedback and welcome contributions to improve this guide. Please see our [contribution guidelines](https://link-to-contribution-guidelines/).
@yunks128 Thanks for catching that! https://github.com/ddalton-swe/slim/tree/issue-117. @riverma should I make another PR for this fix?
Thanks @ddalton-swe! A quick PR would be great - thanks.
Checked for duplicates
Yes - I've already checked
Category
None
Describe the need
We have a need for a software metrics approach for use toward things such as tracking adoption and costing of features (e.g., effort required). The desire is for these to be developer-centric metrics, useful for project planning purposes. Some examples/questions driving this request:
The goal would be to understand the tooling and processes available, along with the common approaches.