
From insight to impact: Key takeaways from our DORA webinar with Nathen Harvey


Cortex | September 18, 2025


For most engineering leaders, getting a DORA dashboard up and running feels like a huge win. You can finally track performance, compare it to industry benchmarks, and report on your progress. But then a nagging question settles in: how do you actually make the numbers go up?

That frustration points to a common gap between the dashboard and the daily engineering practices that drive those outcomes. As part of a wide-ranging conversation on how to bridge this gap, our co-founder and CTO, Ganesh Datta, was joined by Nathen Harvey, who leads the DORA program at Google Cloud. During their 45-minute chat, they explored how to use DORA metrics to diagnose bottlenecks in daily engineering practices and drive meaningful improvements.

Missed the live event? Watch the recording and read on for a recap of some of the key questions they discussed.

Why DORA metrics should be indicators, not targets

While it’s tempting to set a target for Deployment Frequency or MTTR, both Ganesh and Nathen were quick to point out that doing so is a mistake. Ganesh compared it to the classic trap of chasing test coverage: if you make 90% coverage a goal, you'll get a lot of tests, but you won't necessarily get a more reliable system. The same is true for DORA, where treating the metrics as targets simply invites gamification.

Nathen added that this problem is often made worse by how organizations structure ownership. He continued, "A decade ago, we might have turned to developers and said, 'Those throughput metrics, they're yours,' and then turned to our operators and said, 'Those stability metrics, you're responsible for those.' When you do that, you're setting up this fight against one another."

"If you treat a metric as a goal, people will naturally gamify it. Chasing metrics means that you lose sight of the outcomes, which leads to bad incentives. Instead, they should be used to diagnose system bottlenecks."

– Ganesh Datta, Co-founder & CTO, Cortex

The real value of DORA is as a diagnostic tool. A dip in a metric tells you that you have a problem, but it doesn't tell you why. The key is to use the metrics to start a conversation, not end it.

To improve your outcomes, focus on your capabilities

So how do you actually improve your DORA outcomes? Ganesh and Nathen argued that it comes down to improving the underlying capabilities that predict success—which means focusing on the leading indicators you can directly control.

"You don't improve deployment frequency by going and shouting at the team, 'You have to deploy more frequently!' As a team, we don't get better at deployment frequency by thinking really hard about deployment frequency. Instead, we need to think about the capabilities that are actually driving those particular metrics."

– Nathen Harvey, Head of DORA, Google Cloud

To illustrate this further, Ganesh and Nathen used code reviews as a prime example. Slow code reviews are a common bottleneck that directly impacts lead time, and they argued that any potential fix should be treated as a simple experiment.

  • Form a hypothesis: Start with something simple like, "We believe that by doing pair code reviews, we will improve our MTTR because more people will understand the code."

  • Run an experiment: Implement the new practice for a set period.

  • Measure the impact: Use your DORA dashboard and other metrics to see if the change had the desired effect (a minimal sketch of this kind of comparison follows below).
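
To make the "measure the impact" step concrete, here is a minimal, hypothetical Python sketch of how a team might compare two DORA outcomes (deployment frequency and lead time for changes) before and after an experiment. The data shape, dates, and field names are assumptions for illustration only; in practice these numbers would come from your CI/CD system or DORA dashboard.

```python
from datetime import datetime
from statistics import median

# Hypothetical deployment records (commit_time, deploy_time); in practice
# these would come from your CI/CD system or a DORA dashboard export.
deployments = [
    (datetime(2025, 8, 1, 9, 0), datetime(2025, 8, 2, 15, 0)),
    (datetime(2025, 8, 5, 10, 0), datetime(2025, 8, 7, 11, 0)),
    (datetime(2025, 9, 1, 9, 0), datetime(2025, 9, 1, 16, 0)),
    (datetime(2025, 9, 2, 13, 0), datetime(2025, 9, 3, 9, 0)),
    (datetime(2025, 9, 4, 8, 0), datetime(2025, 9, 4, 14, 0)),
]

def dora_snapshot(records, window_start, window_end):
    """Deployment frequency (per week) and median lead time (hours) for a window."""
    in_window = [(c, d) for c, d in records if window_start <= d < window_end]
    weeks = (window_end - window_start).days / 7
    frequency = len(in_window) / weeks if weeks else 0.0
    lead_times = [(d - c).total_seconds() / 3600 for c, d in in_window]
    return frequency, (median(lead_times) if lead_times else float("nan"))

# Compare the month before the experiment (e.g. pair code reviews) to the month after.
before = dora_snapshot(deployments, datetime(2025, 8, 1), datetime(2025, 9, 1))
after = dora_snapshot(deployments, datetime(2025, 9, 1), datetime(2025, 10, 1))
print(f"before: {before[0]:.1f} deploys/week, median lead time {before[1]:.1f}h")
print(f"after:  {after[0]:.1f} deploys/week, median lead time {after[1]:.1f}h")
```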

Ganesh explained that what looks like a simple bottleneck is often a deeper issue. Perhaps senior engineers are the only ones doing reviews, creating a knowledge silo. Maybe the team lacks confidence in the test suite, forcing them to do manual QA on every PR. Or in some cases, it could be that unclear product specs leave developers to debate requirements in the review process.

This approach gives teams ownership over their improvement process and turns engineering metrics into a tool for learning, not just measurement.

How AI is impacting software delivery performance

It wouldn’t be a webinar in 2025 without a discussion on AI. When the conversation turned to AI, Ganesh and Nathen agreed that while AI is changing software development, the DORA metrics remain more relevant than ever as a way to measure its impact.

"What we're seeing in 2025 is that throughput is actually improving as you use more AI. Stability still has some challenges; it's still not improving, in fact, it gets a little bit worse with AI."

– Nathen Harvey, Head of DORA, Google Cloud

Ganesh added that the conversation around AI will quickly move beyond ROI. The real challenge for engineering leaders will be managing the double-edged sword of AI-generated code. As he put it, "Are your teams using [AI] to write more tests, improving your change failure rate? Or do you have worse MTTR because your teams understand the code less?" The four key DORA metrics are essential for measuring the true impact—both positive and negative—of AI on your system.
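
As a rough illustration of how the four metrics can surface AI's trade-offs, here is a small, hypothetical Python sketch that compares change failure rate and MTTR across two periods, for example before and after rolling out AI coding assistants. The incident and deployment counts are made-up placeholders; real figures would come from your deployment and incident management tooling.

```python
from dataclasses import dataclass

@dataclass
class PeriodStats:
    """Hypothetical per-period delivery data; real values come from your tooling."""
    deployments: int            # total production deployments in the period
    failed_changes: int         # deployments that caused a degradation or incident
    total_restore_hours: float  # summed time-to-restore across those incidents

    @property
    def change_failure_rate(self) -> float:
        return self.failed_changes / self.deployments if self.deployments else 0.0

    @property
    def mttr_hours(self) -> float:
        return self.total_restore_hours / self.failed_changes if self.failed_changes else 0.0

# Made-up numbers purely for illustration.
before_ai = PeriodStats(deployments=120, failed_changes=12, total_restore_hours=30.0)
after_ai = PeriodStats(deployments=180, failed_changes=27, total_restore_hours=94.5)

for label, period in [("before AI", before_ai), ("after AI", after_ai)]:
    print(f"{label}: CFR {period.change_failure_rate:.0%}, MTTR {period.mttr_hours:.1f}h")
# Throughput is up (more deployments), but CFR and MTTR worsened -- exactly the
# trade-off the DORA metrics are meant to expose.
```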

Putting it all into practice with an IDP

Ultimately, you need a tool to connect the insights from your DORA dashboard to the actions that drive real improvement. An Internal Developer Portal (IDP) like Cortex is designed to bridge the gap between insight and action.

As Ganesh shared, customers using Cortex to connect insights to action have seen dramatic results, including a 50-60% reduction in both MTTR and the number of incidents, by driving improvements through Scorecards.

"The dashboard tells you you have a problem. Scorecards tell you why you have that problem. And Initiatives give you a way to go and fix it."

– Ganesh Datta, Co-founder & CTO, Cortex
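
To make that idea tangible, here is a purely illustrative Python sketch (not Cortex's actual Scorecard syntax) of a scorecard as a set of pass/fail checks over leading indicators. The service name, thresholds, and fields are all hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Service:
    # Hypothetical leading-indicator data for a single service.
    name: str
    median_review_hours: float   # how long PRs wait for review
    test_coverage: float         # 0.0 - 1.0
    has_rollback_runbook: bool

# Each rule is a named check; a scorecard is just a list of them.
Rule = tuple[str, Callable[[Service], bool]]

dora_readiness_rules: list[Rule] = [
    ("PRs reviewed within 24h", lambda s: s.median_review_hours <= 24),
    ("Test coverage at least 60%", lambda s: s.test_coverage >= 0.60),
    ("Rollback runbook documented", lambda s: s.has_rollback_runbook),
]

def score(service: Service, rules: list[Rule]) -> None:
    """Print how many rules a service passes and which ones are failing."""
    failed = [name for name, check in rules if not check(service)]
    print(f"{service.name}: {len(rules) - len(failed)}/{len(rules)} rules passing")
    for name in failed:
        print(f"  failing: {name}")

# Hypothetical example service.
score(Service("payments-api", median_review_hours=36, test_coverage=0.72,
              has_rollback_runbook=False), dora_readiness_rules)
```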

To help you get started, we're excited to announce two new resources: a Cortex Academy course on Operationalizing DORA Metrics and a DORA Operational Readiness Scorecard template that measures both leading and lagging indicators. If you don’t yet have Cortex, book a demo!

Want to watch the full webinar? Check out the recording below.
