Are You Using or About to Use DORA metrics? Read this First.
Bryan Finster (Guest)
August 15, 2022
Since Accelerate was published in 2018, “DORA metrics” have become increasingly popular for measuring IT performance. More vendors are creating dashboards or integrating these metrics into their existing pipeline tooling. However, the context is getting lost in the race to deliver these features to customers.
First, what are the DORA metrics? They are the four software delivery measures from the State of DevOps research: deployment frequency, lead time for changes, change failure rate, and time to restore service.
In 2021 I wrote a paper for IT Revolution that goes into detail on how these metrics can be misused and abused. Today, let's cover some high-level tips to consider before attempting to use them.
1. Don’t Use Them
More specifically, don’t naively use them without understanding what they represent and what they do not. Having good DORA metrics does not mean you are a high-performing organization. Delivering very stable, very small batches of useless crap doesn’t make you high performing. However, delivering large batches and/or having an unstable system will definitely hurt your business performance. Do not use them to track positive performance. The correct way to use them is as indicators of things that could be improved, so you can investigate the “what” and the “how”.
2. Understand the Definitions
I’ve reviewed many vendors’ implementations of DORA metrics and most of them use incorrect definitions.
Most tools define “Change Fail %” as the percentage of changes that cause an outage or otherwise need to be backed out. Nope. Read “Accelerate”.
“…result in degraded service or subsequently require remediation (e.g., lead to service impairment or outage, require a hotfix, a rollback, a fix-forward, or a patch).”
So: a change that results in a defect. Any defect.
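To make the broader definition concrete, here is a minimal sketch. The Change record and its remediation flag are hypothetical and only for illustration; they are not how any particular vendor or tool computes the metric.

```python
from dataclasses import dataclass

@dataclass
class Change:
    id: str
    # True if this change led to ANY remediation: a hotfix, rollback,
    # fix-forward, patch, or a defect later found in production.
    required_remediation: bool

def change_fail_rate(changes: list[Change]) -> float:
    """Percentage of changes that degraded service or required remediation,
    per the Accelerate definition -- not just outages or rollbacks."""
    if not changes:
        return 0.0
    failed = sum(1 for c in changes if c.required_remediation)
    return 100.0 * failed / len(changes)

# Example: 2 of 5 changes needed some remediation -> 40% change fail rate.
changes = [
    Change("c1", False),
    Change("c2", True),   # required a hotfix
    Change("c3", False),
    Change("c4", True),   # shipped a defect found in production
    Change("c5", False),
]
print(f"Change fail %: {change_fail_rate(changes):.1f}")
```

Counting only outages or rollbacks would report 0% here if neither remediation caused downtime, which is exactly the undercounting the Accelerate definition avoids.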
Another metric that is almost always measured incorrectly is “lead time”. Tools typically measure it from when the code is checked in until it is delivered, but that’s only the automated portion. In a follow-up response to a critical book review, Jez and Nicole state,
“But again, going back to first principles, going from starting to write code to checking in, and from releasing to getting feedback from production, should be fast and low variability processes and therefore belong in the delivery domain.”
Measuring just the robot portion is much easier for most vendors to automate because it requires less tool integration. However, it tells you almost nothing about where improvement opportunities exist. Most of the issues are upstream of there. Measure the entire development flow.
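To illustrate the difference, here is a small hypothetical example (the timestamps and function names are made up for illustration) comparing the full lead time with the pipeline-only portion most tools report:

```python
from datetime import datetime

def lead_time_hours(work_started: datetime, deployed_to_prod: datetime) -> float:
    """Lead time for the whole flow: from starting the work to running in production."""
    return (deployed_to_prod - work_started).total_seconds() / 3600

def pipeline_time_hours(checked_in: datetime, deployed_to_prod: datetime) -> float:
    """Only the automated portion (check-in to deploy) -- what most tools measure."""
    return (deployed_to_prod - checked_in).total_seconds() / 3600

work_started = datetime(2022, 8, 1, 9, 0)   # developer starts the change
checked_in   = datetime(2022, 8, 8, 14, 0)  # code reaches the trunk
deployed     = datetime(2022, 8, 8, 16, 0)  # running in production

print(f"Full lead time: {lead_time_hours(work_started, deployed):.1f} h")
print(f"Pipeline only:  {pipeline_time_hours(checked_in, deployed):.1f} h")
# The pipeline portion looks great (2 h) while the real lead time is over a week.
# The improvement opportunity is upstream, invisible to the pipeline-only measure.
```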
There are more incorrect definitions that tools use. Read “Accelerate”, understand the intent, and don’t blindly trust the implementation of a tool.
3. Use All or None
“This quarter we’ll focus on improving delivery frequency. Next quarter we’ll focus on the next metric.”
Rapid delivery without a disciplined quality process is just dangerous. Speed isn’t the goal. Increased quality feedback is the goal. We need signals for quality and batch size.
4. They are Lagging Indicators
While they can be leading indicators for IT performance, they are lagging indicators for engineering excellence and good product management. Measuring how frequently working code is integrated into the trunk and the wait times for handing off work will help identify things that will improve the DORA outcomes.
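As a rough, hypothetical sketch of that kind of upstream signal (not a formula from the DORA research), trunk integration frequency can be computed directly from merge dates:

```python
from datetime import date

def integrations_per_day(trunk_merge_dates: list[date], working_days: int) -> float:
    """Average number of times working code lands on the trunk per working day.
    A value well below one per developer per day suggests large batches."""
    return len(trunk_merge_dates) / working_days if working_days else 0.0

# Example: four merges to trunk across a five-day week.
merges = [date(2022, 8, 1), date(2022, 8, 1), date(2022, 8, 3), date(2022, 8, 5)]
print(f"Integrations/day: {integrations_per_day(merges, working_days=5):.1f}")
```

Watching this number trend upward, alongside shrinking handoff wait times, is what eventually shows up later as better DORA outcomes.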
5. How to Improve?
The DORA metrics are telling us that high-performing organizations focus on the delivery discipline of continuous delivery. Focus on “why can’t we deliver working software daily?” and fix those things.
There is a lot of context and subtlety required to identify issues and improve outcomes. Simply publishing DORA metrics is usually a destructive act. Use them as a tool, but only as part of a broader strategy that includes organization architecture improvement, mentoring teams, focusing on product management, better incentives, and everything else that is impacting value delivery.
(This post was originally published on August 13 by Bryan Finster on his blog under the title 5-Minute DevOps: DORA Metrics Tips.)