AI Insights – Panels & Metrics

AI Insights gives you a unified view of how AI is impacting your engineering workflow. It combines AI-detected issues, adoption signals, tool usage, and repository configuration data so you can see where AI is used, how broadly it’s adopted, and what impact it has on code and reviews. Chatbot, CI, and system bot accounts are excluded from all AI Insights metrics.


TL;DR – What you can see in AI Insights

  • Issues Feed – Issues flagged by AI during pull request reviews.
  • AI Adoption – AI involvement across commits, review comments, and PR authors.
  • AI Rule Files – Where AI agents are embedded via repository configuration.
  • AI Tools Usage – Adoption and acceptance metrics for tools like GitHub Copilot and Cursor.
  • AI Iteration Summary – AI-generated summaries of completed iterations in Teams → Iterations.

Prerequisites

  • AI Services enabled under Settings → General → AI Services.
  • Managed Mode selected for LinearB AI & Automations (Settings → Company Settings → AI Tools).
  • Optional, but recommended:
    • GitHub Copilot connected in AI Tools for Copilot metrics.
    • Cursor connected in AI Tools for Cursor metrics.

Issues Feed

The Issues Feed highlights issues flagged by AI during pull request reviews and links directly to the underlying PRs.

  • Each row links to the pull request where the issue was detected.
  • AI-generated comments are attributed separately from human comments.
  • System bots and chat services are excluded, so you see only developer and AI collaboration.

Issues are grouped into categories such as:

  • Bugs
  • Scope
  • Maintainability
  • Performance

Use the Issues Feed to see where AI is catching potential problems and to jump straight from a summary into the exact PR in GitHub for deeper review.


AI Adoption

The AI Adoption panel shows how AI tools participate in three core activities:

  • Commits – split into manual commits vs. AI-assisted commits (where AI is listed as a co-author).
  • Review Comments – broken down into human-written vs. AI-generated comments.
  • PR Authors – counts unique authors opening pull requests with AI involvement (for example, AI agents listed as co-authors).

AI involvement is detected using commit metadata and co-author fields for known AI agents.
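
LinearB's exact detection logic is internal, but the co-author signal itself is easy to picture. Below is a minimal Python sketch, assuming a hypothetical list of known AI agent identities (the trailer patterns shown are illustrative, not LinearB's real list):

    # Illustrative sketch only: spot an AI co-author in a commit message trailer.
    import re

    # Hypothetical identities for known AI agents (not LinearB's actual list).
    KNOWN_AI_AGENT_PATTERNS = [
        r"copilot",
        r"noreply@anthropic\.com",  # e.g. a Claude co-author trailer
        r"cursor",
    ]

    TRAILER_RE = re.compile(r"^co-authored-by:\s*(.+)$", re.IGNORECASE | re.MULTILINE)

    def is_ai_assisted(commit_message: str) -> bool:
        """Return True if any Co-authored-by trailer matches a known AI agent."""
        for trailer in TRAILER_RE.findall(commit_message):
            if any(re.search(p, trailer, re.IGNORECASE) for p in KNOWN_AI_AGENT_PATTERNS):
                return True
        return False

    msg = "Fix pagination bug\n\nCo-authored-by: Claude <noreply@anthropic.com>\n"
    print(is_ai_assisted(msg))  # True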

Filter: Use Show only AI activity at the top of the panel to focus exclusively on AI-assisted actions (commits, comments, and authors with AI involvement).

How to interpret AI Adoption

  • A growing share of AI-assisted commits and comments indicates increasing trust and reliance on AI tools.
  • If only a few PR authors show AI activity, AI may still be in the hands of early adopters.
  • Broad AI involvement across many authors suggests AI is becoming part of everyday development practices.

AI Adoption is not a performance score; it shows usage patterns and workflow evolution, not individual productivity.

Excluded from AI Adoption:

  • Chatbot accounts
  • Automation-only bots
  • Non-human system users

AI Rule Files (repository-level adoption)

The AI Rule Files panel tracks repositories that include AI-specific configuration files. These files show where AI agents are formally embedded into your workflow.

  • Repositories with rule files – repos that contain at least one AI configuration file.
  • Repositories without rule files – repos not yet configured for AI agents.
  • Breakdown by agent – which AI tools are configured and where.
  • Multiple rule types – repos that use more than one AI agent.

This view helps distinguish individual experimentation from formal adoption via repository configuration.
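
The article doesn't enumerate the exact files LinearB scans for, but rule files for popular agents follow well-known naming conventions. Here is a minimal sketch of the idea, assuming a handful of common file names (illustrative only, not LinearB's definitive detection list):

    # Check a local repo checkout for common AI rule files.
    # File names below are popular conventions; the actual list may differ.
    from pathlib import Path

    COMMON_RULE_FILES = {
        ".cursorrules": "Cursor",
        ".github/copilot-instructions.md": "GitHub Copilot",
        "CLAUDE.md": "Claude Code",
        "AGENTS.md": "generic coding agents",
    }

    def detect_rule_files(repo_root: str) -> dict:
        """Map each rule file found in the repo to the agent it configures."""
        root = Path(repo_root)
        return {rel: agent for rel, agent in COMMON_RULE_FILES.items()
                if (root / rel).is_file()}

    print(detect_rule_files("."))  # e.g. {'.cursorrules': 'Cursor'}

A repo that matches more than one entry would land in the "Multiple rule types" bucket above.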

AI Insights monitors a wide range of AI coding tools (agents), such as Copilot, Cursor, and other coding assistants. If your AI tool is not reflected in the agent list or rule files view, contact LinearB to request support.


AI Tools Usage

The AI Tools Usage panel measures how developers use supported AI coding tools (currently GitHub Copilot and Cursor). Chatbot and bot accounts are excluded from these metrics.

Core metrics

  • Active Users – developers actively using the tool (for example, accepting AI suggestions, using AI chat, or triggering AI-powered summaries). Pure sign-in events are excluded.
  • Acceptance Rate – percentage of AI suggestions that are accepted into code (see the worked example after this list).
  • Code Acceptance – trend of accepted AI-generated code over time, grouped by day.
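
As a worked example of how the ratio is derived (numbers are hypothetical, and this assumes the denominator is suggestions shown, the common vendor definition):

    # Hypothetical numbers to illustrate Acceptance Rate.
    suggestions_shown = 1200    # AI suggestions surfaced to developers
    suggestions_accepted = 420  # suggestions accepted into code

    acceptance_rate = suggestions_accepted / suggestions_shown
    print(f"Acceptance Rate: {acceptance_rate:.0%}")  # Acceptance Rate: 35%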

GitHub Copilot specifics

For Copilot, LinearB uses GitHub’s official Copilot Usage API:

  • Active Users – users with Copilot activity on a given day (suggestions, chat, summaries).
  • Engaged Users – users who actively interact with Copilot features (for example, accepting suggestions).
  • Acceptance Rate – percentage of Copilot suggestions accepted.
  • Lines Written – trend of accepted Copilot-generated lines over time.
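
If you want to sanity-check these numbers against the source, you can query GitHub's API directly. A minimal sketch follows; the endpoint and field names match GitHub's documented Copilot usage API at the time of writing, so verify them against the current GitHub docs (the org slug and token are placeholders):

    # Pull raw per-day Copilot usage for an organization.
    import os
    import requests

    ORG = "your-org"  # placeholder organization slug
    resp = requests.get(
        f"https://api.github.com/orgs/{ORG}/copilot/usage",
        headers={
            "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
            "Accept": "application/vnd.github+json",
        },
        timeout=30,
    )
    resp.raise_for_status()

    for day in resp.json():
        shown = day.get("total_suggestions_count", 0)
        accepted = day.get("total_acceptances_count", 0)
        rate = accepted / shown if shown else 0.0
        print(day.get("day"), f"active={day.get('total_active_users', 0)}",
              f"acceptance={rate:.0%}")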

Cursor specifics

  • Active Users – developers using Cursor features daily.
  • Code Acceptance – accepted AI-generated code lines, grouped by day.

Adoption funnel

The panel supports an adoption funnel perspective:

  1. Are developers trying the tool? (Active Users)
  2. Are suggestions being trusted? (Acceptance Rate)
  3. Is it driving meaningful code volume? (Lines Written / Code Acceptance)

For example, high Active Users with low Acceptance Rate may indicate experimentation without trust yet. If both Active Users and Acceptance Rate trend upward, the tool is likely delivering real value.
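
One way to make that reading concrete is a simple classification over the first two funnel signals. The thresholds below are invented for illustration, not LinearB guidance; tune them to your organization:

    # Classify adoption from Active Users share and Acceptance Rate.
    def funnel_stage(active_users: int, team_size: int, acceptance_rate: float) -> str:
        trying = active_users / team_size if team_size else 0.0
        if trying < 0.25:
            return "early adopters only - consider enablement"
        if acceptance_rate < 0.20:
            return "broad experimentation, low trust - review suggestion quality"
        return "healthy adoption - suggestions are trusted and widely used"

    print(funnel_stage(active_users=34, team_size=40, acceptance_rate=0.31))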


Next steps & support

  • To configure AI automations and tools, see your organization’s AI Tooling guide.
  • If something looks off in AI metrics, capture an example (screenshot or PR link) and share it with your LinearB team or LinearB Support.
