
PRs Merged Without Review Metric

Measures the average number of pull requests merged without recorded review activity, normalized per day across the selected time range.

Updated by Steven Silverstone
Definition

PRs Merged Without Review measures the number of Pull Requests that were merged without any recorded review activity.

A PR is considered “without review” if no review events (such as approvals or review comments) were recorded before it was merged.

How the Metric Is Calculated

The metric counts the number of merged PRs with no associated review events.

In the dashboard, this count is divided by the number of days in the selected time range and displayed as PRs per day, which allows comparison across time ranges of different lengths.
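As a rough illustration, the calculation above can be sketched as follows. The simplified PR records and their field names (`reviews`) are hypothetical, not the product's actual data model:

```python
from datetime import date

# Hypothetical simplified PR records; "reviews" holds review events
# (approvals, review comments) recorded before merge.
merged_prs = [
    {"id": 101, "reviews": ["approval"]},
    {"id": 102, "reviews": []},   # merged without review
    {"id": 103, "reviews": ["comment"]},
    {"id": 104, "reviews": []},   # merged without review
]

# Number of merged PRs with no associated review events
unreviewed = sum(1 for pr in merged_prs if not pr["reviews"])

# Normalize to PRs per day across the selected time range
range_days = (date(2024, 1, 31) - date(2024, 1, 1)).days + 1  # 31 days
prs_per_day = unreviewed / range_days

print(unreviewed)             # 2
print(round(prs_per_day, 2))  # 0.06
```

The division by the number of days in the range is what turns a raw count into the comparable "PRs per day" figure shown on the card.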

How the Metric Is Displayed in the Dashboard

The metric card displays two types of values:

1. Headline Value (e.g., 0.09 PRs per day)

The large number shown at the top represents the average number of PRs merged without review per day across the selected time range.

This is a daily average — not a total count.

2. Time-Based Values in the Chart

The line chart shows how many PRs were merged without review in each time bucket (for example, per day).

When you click a point in the chart, you see:

  • The exact number of PRs merged without review on that date

This helps identify:

  • Specific days where review processes were bypassed
  • Sudden spikes in risk exposure
  • Patterns during high-pressure delivery periods
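The per-bucket values in the chart can be thought of as grouping unreviewed merges by merge date. A minimal sketch, with illustrative dates and a hypothetical (date, reviewed) event shape:

```python
from collections import Counter
from datetime import date

# Hypothetical merge events: (merge date, had any review before merge?)
merges = [
    (date(2024, 3, 1), True),
    (date(2024, 3, 1), False),
    (date(2024, 3, 2), False),
    (date(2024, 3, 2), False),
    (date(2024, 3, 3), True),
]

# Count PRs merged without review in each daily bucket
by_day = Counter(day for day, reviewed in merges if not reviewed)

print(by_day[date(2024, 3, 1)])  # 1
print(by_day[date(2024, 3, 2)])  # 2  <- a spike worth investigating
print(by_day[date(2024, 3, 3)])  # 0
```

A day whose bucket count jumps well above its neighbors (here, March 2) is exactly the kind of spike the chart is meant to surface.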

Why This Metric Is Useful

PRs merged without review increase production risk.

A lack of review may lead to:

  • Undetected bugs
  • Security vulnerabilities
  • Reduced code quality
  • Knowledge silos

Monitoring this metric helps ensure:

  • Review policies are enforced
  • Risk exposure is minimized
  • Quality standards are maintained

How to Interpret This Metric

This metric should generally remain low.

As general guidance:

  • Fewer than 5% of total PRs should bypass review in a disciplined engineering organization.

However, exceptions may apply in cases such as:

  • Emergency hotfixes
  • Automated system merges
  • Small documentation updates
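One way to apply the 5% guidance while allowing for legitimate exceptions is sketched below. The `exempt` label and the record shape are hypothetical; how exceptions are actually tagged depends on your workflow:

```python
# 23 reviewed PRs plus one sanctioned exception and one genuine bypass.
prs = [{"reviewed": True, "exempt": None} for _ in range(23)]
prs.append({"reviewed": False, "exempt": "hotfix"})  # emergency hotfix
prs.append({"reviewed": False, "exempt": None})      # genuine bypass

# Count bypasses, excluding sanctioned exceptions
bypasses = sum(1 for pr in prs
               if not pr["reviewed"] and pr["exempt"] is None)
bypass_rate = bypasses / len(prs)  # 1 / 25

print(f"{bypass_rate:.0%}")   # 4%
print(bypass_rate < 0.05)     # True: within the 5% guidance
```

Separating sanctioned exceptions from genuine bypasses keeps the guidance meaningful: an emergency hotfix should not look the same as a skipped review on a feature branch.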

This metric should be evaluated alongside:

  • Time to Review
  • Time to Merge
  • Review Depth

Data Sources

Derived from:

  • Pull Request events
  • Review events (approvals, comments)
  • Merge timestamps

Tunable Configurations

Metric behavior may depend on:

  • Exclusion of specific branch types (e.g., hotfixes)
  • Review event definitions
  • Repository filtering rules
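For instance, excluding hotfix branches before the metric is computed might look like the sketch below. The branch-naming convention and prefix list are assumptions; the actual configuration options depend on the product:

```python
# Hypothetical PRs with their target branch names
prs = [
    {"id": 1, "branch": "feature/login"},
    {"id": 2, "branch": "hotfix/crash-on-start"},
    {"id": 3, "branch": "docs/readme-typo"},
]

# Branch prefixes excluded from the metric (a tunable configuration)
EXCLUDED_PREFIXES = ("hotfix/",)

included = [pr for pr in prs
            if not pr["branch"].startswith(EXCLUDED_PREFIXES)]

print([pr["id"] for pr in included])  # [1, 3]
```

Filtering happens before counting, so an excluded hotfix merge never appears as a bypass in the first place.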

Limitations

  • Automated workflows may create false positives.
  • Small teams may legitimately self-review.
  • The metric measures the absence of review activity — not review quality.

Spikes should be investigated in context.

Stakeholder Use Cases

Engineering Managers

  • Monitor policy compliance
  • Identify risk spikes

Team Leads

  • Enforce review standards
  • Detect workflow bypass patterns

Developers

  • Maintain collaborative review culture
