DORA Metrics Revealed in LinearB 2023 Engineering Benchmarks Report

LinearB unveiled its 2023 Engineering Benchmarks Report, offering valuable insights into the key performance metrics that distinguish elite engineering organizations from their peers. The report builds on industry-standard research from the DevOps Research and Assessment (DORA) team, to which LinearB is now a principal contributor, and introduces new benchmarks designed to give teams a clearer picture of their current performance and help them build a strategy for the improvements needed to advance business goals.

DORA metrics — deployment frequency, lead time for changes, mean time to recovery and change failure rate — improve visibility into software developers’ daily operations. The DevOps landscape has evolved drastically since DORA’s inception in 2014, though, and these metrics alone no longer sufficiently measure the behaviors that drive value for engineering teams. DORA metrics function as lagging indicators, reflecting past or present performance without effectively measuring business impact. As a result, dev teams may excel in these metrics yet still fail to drive value toward overall business goals.
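To make the four DORA metrics concrete, the sketch below computes each one from a handful of deployment records. The record layout and sample data are purely illustrative — they are not LinearB's or DORA's actual schema — but the formulas match the standard definitions of the four metrics:

```python
from datetime import datetime, timedelta

# Hypothetical deployment records (illustrative only):
# (commit_time, deploy_time, caused_failure, restore_time)
deployments = [
    (datetime(2023, 5, 1, 9), datetime(2023, 5, 1, 15), False, None),
    (datetime(2023, 5, 2, 10), datetime(2023, 5, 2, 12), True,
     datetime(2023, 5, 2, 13)),
    (datetime(2023, 5, 3, 8), datetime(2023, 5, 3, 11), False, None),
    (datetime(2023, 5, 4, 9), datetime(2023, 5, 4, 10), False, None),
]
days_observed = 4

# Deployment frequency: deploys per day over the observation window.
deployment_frequency = len(deployments) / days_observed

# Lead time for changes: average commit-to-deploy duration.
lead_times = [deploy - commit for commit, deploy, _, _ in deployments]
lead_time_for_changes = sum(lead_times, timedelta()) / len(lead_times)

# Change failure rate: share of deploys that caused a failure in production.
failures = [(d, r) for _, d, failed, r in deployments if failed]
change_failure_rate = len(failures) / len(deployments)

# Mean time to recovery: average deploy-to-restore duration for failures.
recovery_times = [restore - deploy for deploy, restore in failures]
mean_time_to_recovery = sum(recovery_times, timedelta()) / len(recovery_times)

print(deployment_frequency)   # 1.0 deploy per day
print(lead_time_for_changes)  # 3:00:00
print(change_failure_rate)    # 0.25
print(mean_time_to_recovery)  # 1:00:00
```

Note how all four are backward-looking aggregates over what already shipped — which is exactly why the report characterizes them as lagging indicators.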

Building on the research conducted by the DORA team, LinearB’s Engineering Benchmarks Report — created from a study of 1,971 dev teams and more than 4.5 million branches — introduces new metrics that overcome the limitations of traditional benchmarks and form a framework for growth, value and efficiency. These new metrics address concerns such as lagging indicators, the lack of business impact measurement and the need for actionable goals.

By incorporating these benchmarks into their workflows, engineering teams can better position their performance against the industry, identify unique opportunities for performance improvement and map a strategy to achieve elite performance.

“Our Engineering Benchmarks Report goes beyond the industry-standard DORA metrics to provide a more comprehensive view of engineering team performance,” said Ori Keren, co-founder and CEO of LinearB. “By understanding these metrics and leveraging the right tools, teams can attain elite performance in engineering efficiency, quality and job satisfaction.”

LinearB’s research has identified 10 new metrics grouped into three categories: delivery lifecycle, developer workflow and business alignment. These metrics, which include coding time, deployment frequency, code changes in pull requests, rework rate, planning accuracy, and capacity accuracy, offer a more comprehensive and actionable view of engineering team performance. According to the Engineering Benchmarks Report, elite engineering teams:

  • Complete coding tasks in less than 30 minutes
  • Deploy on a daily basis
  • Average fewer than 105 code changes in their pull requests (PRs)
  • Maintain a rework rate below 2%
  • Achieve at least 80% planning accuracy
  • Maintain a capacity accuracy of over 85%
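The thresholds above amount to a simple per-metric comparison. The sketch below encodes them and checks a hypothetical team's numbers against each benchmark; the metric names, units and helper function are assumptions for illustration, not LinearB's actual scoring model:

```python
# Elite thresholds from the report; units are assumed (coding time in
# minutes, rates and accuracies as percentages).
ELITE_THRESHOLDS = {
    "coding_time_minutes": ("<", 30),
    "deploys_per_day": (">=", 1),
    "pr_code_changes": ("<", 105),
    "rework_rate_pct": ("<", 2),
    "planning_accuracy_pct": (">=", 80),
    "capacity_accuracy_pct": (">", 85),
}

def is_elite(metrics: dict) -> dict:
    """Return, per metric, whether the team meets the elite benchmark."""
    ops = {
        "<": lambda a, b: a < b,
        ">": lambda a, b: a > b,
        ">=": lambda a, b: a >= b,
    }
    return {
        name: ops[op](metrics[name], bound)
        for name, (op, bound) in ELITE_THRESHOLDS.items()
        if name in metrics
    }

# A hypothetical team: strong everywhere except oversized pull requests.
team = {
    "coding_time_minutes": 25,
    "deploys_per_day": 2,
    "pr_code_changes": 140,
    "rework_rate_pct": 1.5,
    "planning_accuracy_pct": 82,
    "capacity_accuracy_pct": 90,
}

print(is_elite(team))  # only pr_code_changes falls short (140 >= 105)
```

A breakdown like this points directly at the one behavior to improve — here, splitting large PRs — rather than a single pass/fail verdict.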


The Engineering Benchmarks Report is now available for download, equipping engineering leaders with valuable insights to guide their teams toward elite performance. To access the report and learn more about LinearB’s software delivery management solutions, visit the LinearB website.


About Author

Taylor Graham, marketing grad with an inner nature to be a perpetual researcher, currently all things IT. Personally and professionally, Taylor is one to know with her tenacity and encouraging spirit. When not working, you can find her spending time with friends and family.