AI Insights – Overview

Understand how AI tools affect your engineering performance. AI Insights connects AI usage, acceptance, and trust signals to delivery, quality, and DORA metrics so you can measure real impact, not just adoption.

Updated by Steven Silverstone

The AI Insights page gives you a unified, real-time view into how AI is impacting your engineering workflow. It combines AI-detected issues, adoption signals, tool usage, and repository configuration data to help you understand where AI is being used, how deeply it’s embedded, and what impact it’s having.

[Screenshot: AI Insights overview dashboard in LinearB]

AI Insights reflects human + AI collaboration. Chatbots, CI bots, and automated system accounts are excluded from all metrics.

What AI Insights helps you answer
  • Where is AI actively contributing to code, reviews, and pull requests?
  • Which issues are AI agents flagging during code review?
  • Which AI tools are developers actually using — and trusting?
  • Where are AI agents formally embedded through repository configuration?
  • Is AI adoption broadening across teams or limited to early adopters?

What’s included in AI Insights

AI Insights is composed of several panels, each covering a different dimension of AI usage:

  • Issues Feed – Issues flagged by AI during pull request reviews
  • AI Adoption – AI involvement across commits, review comments, and PR authors
  • AI Rule Files – Repository-level configuration showing where AI agents are embedded
  • AI Tools Usage – Adoption and acceptance metrics for supported AI tools

Issues Feed

The Issues Feed highlights issues flagged by AI during pull request reviews, with direct links back to the relevant PRs for context.

  • Each row links directly to the pull request where the issue was detected
  • AI-generated comments are attributed separately from human comments
  • System bots and chat services are excluded

Issues are categorized as:

  • 🐞 Bugs
  • 🎯 Scope
  • 🧹 Maintainability
  • 🚀 Performance

Click any PR name to jump directly into GitHub and review the flagged issues in context.

AI Adoption

The AI Adoption panel shows how AI contributes across three core activities:

  • Commits – Manual vs. AI-assisted commits, measured by detecting AI agents listed as co-authors
  • Review Comments – Human-written vs. AI-generated review comments
  • PR Authors – Unique authors opening PRs with AI involvement

Use Show only AI activity to filter the view and focus exclusively on AI-assisted actions.

Commits or PRs listing both a human and an AI tool as co-authors are counted as AI-assisted work.
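
For illustration, a commit counts as AI-assisted when its message carries a standard git Co-authored-by trailer naming an AI agent. Agent names and email addresses vary by tool; the example below is illustrative:

    Fix pagination bug in reports API

    Co-authored-by: Claude <noreply@anthropic.com>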

AI Rule Files (Repository-level adoption)

The AI Rule Files panel tracks repositories that include AI-specific configuration files. These files indicate where AI agents are formally embedded into workflows.

  • Repositories with rule files – Repos containing at least one AI configuration file
  • Repositories without rule files – Repos not yet configured for AI agents
  • Breakdown by agent – Which AI tools are configured and where
  • Multiple rule types – Repos using more than one AI agent

This view helps distinguish individual usage from organizational adoption.
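
Rule files are agent-specific configuration files committed to the repository itself. The exact files detected may vary, but common examples include:

  • .cursorrules or .cursor/rules/ – Cursor
  • CLAUDE.md – Claude Code
  • .github/copilot-instructions.md – GitHub Copilot
  • AGENTS.md – a cross-agent convention adopted by several tools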


Agents monitored by LinearB

AI Insights identifies activity from a wide range of AI coding tools, referred to as agents. Agents actively contribute to code or reviews and are treated differently from automation bots.

Examples of monitored agents include:

aider, Amazon Q CLI, augmentcode, bito, claude, cursor, copilot, devin-ai, gemini, gitlab duo, graphite, replit, sourcegraph, tabnine, windsurf, and many others.

This list is continuously updated. If your AI tool is missing, contact LinearB to request support.

AI Tools Usage

The AI Tools Usage panel measures how developers use supported AI tools. LinearB currently supports GitHub Copilot and Cursor.

  • Active Users – Developers actively using AI features
  • Acceptance Rate – Percentage of AI suggestions accepted into code
  • Code Acceptance – Trend of accepted AI-generated code over time

These metrics help you understand not just adoption, but trust and value.
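
As a simple illustration of the math (the figures below are hypothetical):

    Acceptance Rate = suggestions accepted ÷ suggestions shown × 100
    Example: 320 accepted of 800 shown → 320 ÷ 800 × 100 = 40%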


How to interpret AI Insights
  • High usage + low acceptance may indicate experimentation without trust
  • Rising acceptance over time suggests AI is delivering real value
  • Rule files spreading across repos signal formal organizational adoption

AI Insights is designed to show patterns and trends, not to evaluate individual developers.

Why AI Insights matters

AI Insights turns raw activity into visibility.

  • For leaders: clear signals of ROI and adoption maturity
  • For managers: insight into quality, speed, and review effectiveness
  • For developers: transparency into how AI influences daily work

Most importantly, it helps answer the questions leadership always asks:

  • Are we shipping better code?
  • Are we moving faster?
  • Is AI actually helping?

Glossary
  • AI-assisted work – Commits or PRs involving both a human and an AI tool
  • Agent – An AI coding assistant contributing to code or reviews
  • Bot – Automated system account (excluded)
  • Acceptance Rate – Percentage of AI suggestions accepted
  • Active User – Developer actively using AI features

For setup guidance and configuration, or for any additional help, contact support@linearb.io or visit the LinearB Help Center.
