
AI Analytics: Adoption, Workflow, and Impact

Use AI Analytics to measure AI adoption, workflow touchpoints, and impact across pull requests, repositories, and teams. This is the supported path after the Copilot/Cursor Metrics dashboards and AI Tools Usage view were deprecated on April 2, 2026.

Updated by Steven Silverstone

AI Analytics (Beta) provides visibility into how AI is used across your pull request workflow and how that usage correlates with engineering activity. These reports help teams understand adoption patterns, workflow touchpoints, and the impact of AI tools across repositories and teams.

The standalone Copilot and Cursor Metrics dashboards, along with the AI Tools Usage view in AI Insights, were deprecated on April 2, 2026. Use AI Analytics for more accurate, filterable, and team-scoped reporting on AI adoption and impact.

Why AI Analytics replaced the legacy dashboards

The retired dashboards relied on third-party API data and org-level metrics that could include contributors outside your LinearB team scope. This could lead to inconsistent reporting, limited filtering, and results that did not accurately reflect team activity.

AI Analytics provides a more accurate, supported approach by focusing on pull request–level activity within your configured teams and repositories. With AI Analytics, you can:

  • Analyze AI activity across pull requests, commits, and review workflows.
  • Filter results by workflow touchpoint, team, repository, and user.
  • Compare AI-assisted work with non-AI work using the same reporting surface.
  • Ensure reporting reflects only contributors within your LinearB team scope.

What changed:

  • If you previously used Copilot or Cursor Metrics dashboards, use AI Analytics for team-scoped and PR-level reporting.
  • If you used the AI Tools Usage view, continue using AI Insights for high-level visibility and AI Analytics for detailed analysis.

Note: The legacy dashboards are no longer supported and will not receive updates.


Accessing AI Analytics
  1. From the left navigation, select Metrics.
  2. Select AI Analytics (Beta).

How AI activity is detected

Coding Agents

AI systems that independently initiate coding work. Identified by pull requests opened by known AI agents, or by branches created using recognized AI-agent naming patterns.
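The branch-pattern heuristic can be sketched as a simple prefix match. The patterns below are illustrative only; the actual set of AI-agent naming patterns recognized by AI Analytics is not published in this article.

```python
import re

# Illustrative branch-name patterns for AI coding agents.
# These prefixes are assumptions, not LinearB's documented pattern list.
AGENT_BRANCH_PATTERNS = [
    re.compile(r"^copilot/"),
    re.compile(r"^cursor/"),
    re.compile(r"^claude/"),
]

def is_agent_branch(branch_name: str) -> bool:
    """Return True if the branch name matches a recognized AI-agent pattern."""
    return any(pattern.match(branch_name) for pattern in AGENT_BRANCH_PATTERNS)
```

A branch such as `copilot/fix-login` would match, while `feature/login` would not.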

Coding Assistance

AI tools that contribute to a developer’s commits. Identified by a known AI tool listed as a co-author in the commit message, or by correlating the AI tool’s reported activity with the user’s Git activity.
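The co-author heuristic checks commit messages for a `Co-authored-by:` trailer naming a known AI tool. A minimal sketch, assuming a hypothetical tool list (the real list used by AI Analytics is not documented here):

```python
import re

# Hypothetical set of AI tool names recognized as commit co-authors.
KNOWN_AI_COAUTHORS = ("github copilot", "cursor", "claude")

def has_ai_coauthor(commit_message: str) -> bool:
    """Return True if a known AI tool appears in a Co-authored-by trailer."""
    for line in commit_message.splitlines():
        # Git trailers take the form: Co-authored-by: Name <email>
        match = re.match(r"^\s*Co-authored-by:\s*(.+?)\s*<", line, re.IGNORECASE)
        if match and any(tool in match.group(1).lower() for tool in KNOWN_AI_COAUTHORS):
            return True
    return False
```

For example, a commit message ending in `Co-authored-by: GitHub Copilot <copilot@github.com>` would be flagged as AI-assisted.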

Code Review

AI-generated review activity. Identified by pull request comments from known AI review tools that match recognized review patterns.

AI Coding

AI Coding is the combined coding-stage view in AI Analytics. It includes both Coding Assistance and Coding Agents.


Report

The Report selector controls which AI Analytics report is displayed.

For example, you can open a report such as Throughput Impact and then adjust the surrounding controls to change the date range, interval, grouping, and filters.

The available reports may evolve over time while AI Analytics remains in Beta.


Group by

Group by controls how results are segmented within the chart.

For example, when grouping by AI Coding, results are split into:

  • With AI — Pull requests that involved AI at the coding stage, including Coding Assistance or Coding Agents.
  • Without AI — Pull requests with no detected AI involvement at the coding stage.
Some grouped views show the top values directly and group the remaining values under Others.

Group by affects how data is displayed; filters affect which pull requests are included.

Filters and logical behavior

Filters determine which pull requests are included in the report.

  • AND logic across filters: A pull request must match all selected filters.
  • OR logic within a filter: A pull request can match any selected option within that filter.
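The AND-across-filters, OR-within-filter behavior can be sketched as a simple selection function. Field names such as `team` and `repository` are illustrative, not LinearB's actual data schema:

```python
def matches(pr: dict, filters: dict) -> bool:
    """A PR is included when, for every active filter (AND),
    its value is one of the selected options within that filter (OR)."""
    return all(pr.get(field) in options for field, options in filters.items())

# Illustrative pull request data (not real LinearB records).
prs = [
    {"team": "Platform", "repository": "api"},
    {"team": "Platform", "repository": "web"},
    {"team": "Mobile", "repository": "api"},
]

# Two filters active: team must be Platform AND repository must be api or web.
selected = {"team": {"Platform"}, "repository": {"api", "web"}}
result = [pr for pr in prs if matches(pr, selected)]
```

Here both Platform pull requests are included (each matches one repository option), while the Mobile pull request is excluded because it fails the Teams filter.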

Filter definitions

AI Coding

Filter by AI or human touchpoints at the coding stage. With AI includes pull requests or branches created by an AI tool, commits with a known AI co-author, or cases where reported AI activity correlates with the user’s Git activity.

AI Coding includes both Coding Assistance and Coding Agents.

Coding Assistance

Filter by coding-assistance activity within the selected pull requests.

Code Review

Filter by AI-generated review activity within the selected pull requests.

Coding Agents

Filter by pull requests or branches initiated by recognized AI agents.

Teams

Select the teams you want to filter by.

Repositories

Select the repositories you want to filter by.

Users

Select the users you want to filter by.

If Cursor, Copilot, or Claude Code user accounts are not merged with LinearB team members, some activity may be missing from team-scoped data.

Interacting with charts

Hover over a point on the chart to see the exact values for that period.

Depending on the selected report, the tooltip and chart labels may show values such as pull requests, averages per interval, and comparisons between AI-assisted and non-AI activity.

Use this interaction to validate trends and compare AI-assisted and non-AI activity over time.


Tip: When validating AI detection, start with a single repository and one workflow stage, then expand filters gradually.
