
Metrics Community Benchmarks

The LinearB Engineering Metrics Benchmarks were created from a study of 3,694,690 pull requests from 2,022 dev organizations, spanning 103,807 active contributors. For the first time since DORA published their research in 2014, engineering teams can benchmark their performance against data-backed industry standards. Continue reading to learn more about our data collection and metric calculations.

Benchmarks are available for metrics calculated using either the average or the 75th percentile (p75).

We define four categories of quality for each metric: Elite, Strong, Fair, and Needs Improvement.

| Metric | Elite | Strong | Fair | Needs Improvement |
| --- | --- | --- | --- | --- |
| Cycle Time (hours) | Average: < 73, 75th %: < 19 | Average: 73-155, 75th %: 19-66 | Average: 155-304, 75th %: 66-218 | Average: 304+, 75th %: 218+ |
| Coding Time (hours) | Average: < 19, 75th %: < 0.5 | Average: 19-44, 75th %: 0.5-2.5 | Average: 44-99, 75th %: 2.5-24 | Average: 99+, 75th %: 24+ |
| Pickup Time (hours) | Average: < 7, 75th %: < 1 | Average: 7-13, 75th %: 1-3 | Average: 13-20, 75th %: 3-14 | Average: 20+, 75th %: 14+ |
| Review Time (hours) | Average: < 5, 75th %: < 0.5 | Average: 5-14, 75th %: 0.5-3 | Average: 14-29, 75th %: 3-18 | Average: 29+, 75th %: 18+ |
| Deploy Time (hours) | Average: < 6, 75th %: < 3 | Average: 6-50, 75th %: 3-69 | Average: 50-137, 75th %: 69-197 | Average: 137+, 75th %: 197+ |
| PR Size (lines of code changed) | Average: < 219, 75th %: < 98 | Average: 219-395, 75th %: 98-148 | Average: 395-793, 75th %: 148-218 | Average: 793+, 75th %: 218+ |
| Rework Rate | Average: < 2% | Average: 2-5% | Average: 5-7% | Average: 7%+ |
| MTTR (hours) | Average: < 7 | Average: 7-9 | Average: 9-10 | Average: 10+ |
| CFR | Average: < 1% | Average: 1-8% | Average: 8-39% | Average: 39%+ |
| Refactor | Average: < 9% | Average: 9-15% | Average: 15-21% | Average: 21%+ |
| Merge Frequency (merges per developer, per week) | > 2 | 2-1.5 | 1.5-1 | < 1 |
| Deploy Frequency (deploys per developer, per week) | > 0.2 | 0.2-0.09 | 0.09-0.03 | < 0.03 |
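
To make the tiers concrete, here is a minimal sketch (illustrative only; the function names, threshold lists, and sample numbers are ours, not part of LinearB) that derives a deploy frequency per developer, per week, and classifies it, along with a cycle time p75 reading, against the tiers in the table above.

```python
# Illustrative only: names and thresholds below are taken from the benchmark
# table above; this is not LinearB code.

# Cycle time p75 tier boundaries (hours): lower is better.
CYCLE_TIME_P75_TIERS = [(19, "Elite"), (66, "Strong"), (218, "Fair")]

# Deploy frequency tier boundaries (deploys per developer, per week): higher is better.
DEPLOY_FREQ_TIERS = [(0.2, "Elite"), (0.09, "Strong"), (0.03, "Fair")]


def classify_lower_is_better(value, tiers):
    """Return the first tier whose upper bound the value stays below."""
    for upper_bound, tier in tiers:
        if value < upper_bound:
            return tier
    return "Needs Improvement"


def classify_higher_is_better(value, tiers):
    """Return the first tier whose lower bound the value exceeds."""
    for lower_bound, tier in tiers:
        if value > lower_bound:
            return tier
    return "Needs Improvement"


# Example: 12 deploys by 10 developers over 4 weeks.
deploy_frequency = 12 / (10 * 4)  # 0.3 deploys per developer, per week

print(classify_lower_is_better(42, CYCLE_TIME_P75_TIERS))               # Strong
print(classify_higher_is_better(deploy_frequency, DEPLOY_FREQ_TIERS))   # Elite
```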

What is the difference between average and 75th percentile?

LinearB recommends measuring your organization's cycle time at the 75th percentile: it gives a balanced representation of your organization's typical cycle time without being skewed by outliers.

Average Calculation:

(Sum of branch times or PR sizes) / (Number of branch times or PR sizes)

75th Percentile Calculation:

(75% are shorter) → 75th Percentile ← (25% are longer)
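
As a worked example of the difference, the sketch below (illustrative only; the data and variable names are made up) computes both statistics for a small set of branch cycle times and shows how a single outlier inflates the average while barely moving the 75th percentile.

```python
import math

# Illustrative cycle times (in hours) for ten merged branches; the 400-hour
# outlier pulls the average up but barely moves the 75th percentile.
cycle_times = [4, 6, 8, 10, 12, 15, 18, 22, 30, 400]

# Average: sum of branch times divided by the number of branch times.
average = sum(cycle_times) / len(cycle_times)

# 75th percentile: the value below which 75% of branch times fall
# (nearest-rank method; other interpolation methods give similar results).
rank = math.ceil(0.75 * len(cycle_times))   # 8th of 10 values
p75 = sorted(cycle_times)[rank - 1]

print(f"average = {average:.1f} h")   # 52.5 h, dominated by the outlier
print(f"p75     = {p75} h")           # 22 h, closer to a typical branch
```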

You can set your LinearB instance to report metrics as the average, the median, or the 75th percentile in your account settings. Read more about this feature here: Changing your metrics from average to median or percentile

Where to see your benchmarks

Benchmark indicators are visible on team dashboards as well as in Metrics reports. The main team dashboard shows your overall cycle time against LinearB's benchmarks. You can also see benchmark performance for the individual portions of cycle time as well as other metrics.

Dashboard View

When enabled, a benchmark icon will appear next to the cycle time metric on a team dashboard.

Metrics Report View

Metrics reports allow you to see your team's performance against benchmarks on a metric-by-metric level. You can even combine multiple teams in a metrics dashboard and see the teams' average performance against LinearB benchmarks. LinearB benchmarks are available for the following metrics:

  • Cycle Time
  • Coding Time
  • Pickup Time
  • Review Time
  • Deploy Time
  • Deploy Frequency
  • Merge Frequency
  • PR Size
  • Rework Rate
  • MTTR
  • CFR
  • Refactor

Enabling and disabling benchmarks

We understand that benchmarks won't apply or be constructive for all teams. To disable the benchmark icons for a specific team, go to Team Settings -> General and switch off Engineering Metrics Benchmarks. Make sure to click Save after disabling benchmarks.
