
AI Metrics


Updated by Steven Silverstone

Purpose

AI Metrics measure AI tool adoption, AI-assisted workflow interaction, and AI-attributed impact within the development lifecycle.

These metrics help teams understand how AI tools are being used, how frequently they influence development activity, and how AI-assisted review affects quality and delivery signals.


Metrics Included

The following metrics belong to the AI Metrics family:

  • AI Active Users
  • AI Adoption Rate
  • Total AI Actions
  • AI-Assisted Pull Requests
  • AI-Identified Issues
  • AI-Suggested Code Lines
  • AI Velocity / Workflow Metrics

The definition, attribution logic, normalization method, and display behavior for each metric are documented in the Metrics Glossary.


How These Metrics Work Together

AI Metrics provide visibility across three dimensions:

  • Adoption – How many developers are actively using supported AI tools.
  • Usage Intensity – How frequently AI actions occur within the selected time range.
  • Impact – How AI-assisted work influences review, quality, and delivery metrics.

These metrics should be interpreted alongside Delivery, Throughput, and Quality metrics to evaluate workflow impact.
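As an illustration of the three dimensions, the sketch below derives Adoption, Usage Intensity, and a simple adoption rate from a hypothetical AI action log. The event shape and field names ("user", "day") are assumptions for this example, not the product's actual data model.

```python
# Illustrative sketch only: how the three dimensions might be derived
# from raw AI activity events within a selected time range.
from datetime import date

# Hypothetical AI action log for the selected time range.
ai_events = [
    {"user": "alice", "day": date(2024, 5, 1)},
    {"user": "alice", "day": date(2024, 5, 2)},
    {"user": "bob", "day": date(2024, 5, 1)},
]
all_active_devs = {"alice", "bob", "carol"}  # all devs active in the range

# Adoption: how many developers actively used a supported AI tool.
ai_active_users = {e["user"] for e in ai_events}
adoption_rate = len(ai_active_users) / len(all_active_devs)

# Usage Intensity: how frequently AI actions occurred in the range.
total_ai_actions = len(ai_events)

print(f"AI Active Users: {len(ai_active_users)}")
print(f"AI Adoption Rate: {adoption_rate:.0%}")
print(f"Total AI Actions: {total_ai_actions}")
```

Impact is not computed here because it requires joining AI activity to review, quality, and delivery data, which is product-specific.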


Dashboards That Use AI Metrics

AI metrics appear in:

AI dashboards combine adoption and impact metrics to provide visibility into AI-driven workflow changes.


AI metrics often correlate with:


Configuration Dependencies

AI metrics rely on:

  • Supported AI tool integrations (e.g., Copilot, Cursor, Claude Code)
  • Git provider integration and PR activity ingestion
  • Correct user attribution mapping

If AI integrations are not enabled or syncing, AI metrics may not appear.
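The attribution-mapping dependency can be pictured with a small sketch: AI activity recorded under an identity that has no Git-side mapping cannot be attributed, so that user's metrics go missing. The identity formats and dictionary shape below are assumptions for illustration.

```python
# Hedged sketch: finding AI-tool identities with no Git attribution
# mapping. Real mapping logic is product-specific; this only shows
# why correct user attribution mapping matters.
ai_tool_users = {"alice@example.com", "bob@example.com"}
git_identities = {
    "alice@example.com": "alice",  # mapped: AI activity attributable
    "carol@example.com": "carol",
}

# AI activity from unmapped users cannot be attributed to a developer,
# so it will be absent from per-user AI metrics.
unmapped = sorted(u for u in ai_tool_users if u not in git_identities)
print("Unmapped AI users:", unmapped)
```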


Troubleshooting

If AI metrics appear missing or inconsistent:

  • Confirm AI tool integrations are connected and authorized.
  • Validate developers have recorded AI activity during the selected time range.
  • Ensure Git activity is syncing normally.
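The checklist above can be sketched as a tiny diagnostic function. The flag names and messages are hypothetical; they simply mirror the three checks in order.

```python
# Hypothetical diagnostic mirroring the troubleshooting checklist.
# The boolean flags and their names are assumptions for illustration.
def diagnose(ai_integration_connected: bool,
             ai_events_in_range: int,
             git_sync_ok: bool) -> list[str]:
    issues = []
    if not ai_integration_connected:
        issues.append("AI tool integration is not connected or authorized")
    if ai_events_in_range == 0:
        issues.append("No AI activity recorded in the selected time range")
    if not git_sync_ok:
        issues.append("Git activity is not syncing")
    return issues

# Example: integration connected and Git syncing, but no AI activity.
print(diagnose(True, 0, True))
```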

For detailed troubleshooting guidance, see Troubleshooting.
