AI Tools Usage
Analyze how developers use AI coding tools like GitHub Copilot and Cursor. This view surfaces active users, acceptance rates, and code contribution trends so you can evaluate trust, engagement, and real productivity impact across teams.
The AI Tools Usage panel measures how developers use supported AI coding tools across your organization. LinearB currently supports GitHub Copilot, Cursor, and Claude.
Overview
- Track AI tool adoption and usage across developers.
- Measure trust and value using acceptance and trend metrics.
- Use drilldowns (where available) to move from percentages to underlying counts.
What the panel measures
For each supported tool, the panel provides the following metrics:
- Active users – Number of developers actively using the tool. Measured by counting developers who accept AI code suggestions, use AI chat, or trigger AI-powered PR summaries. Authentication-only events are excluded.
- Acceptance rate – Percentage of AI suggestions accepted into code. Measured by comparing suggestions offered vs. suggestions accepted.
- Code acceptance – A trend chart showing accepted AI-generated code over time. Measured by grouping acceptances by day.
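The two computed metrics above can be sketched in a few lines. LinearB's internal event schema is not public, so the event shape here is a hypothetical illustration; only the formulas (accepted vs. offered, acceptances grouped by day) come from the definitions above.

```python
from collections import defaultdict
from datetime import date

# Hypothetical suggestion events as (day, accepted) pairs.
# The real event schema is internal to LinearB; this is illustrative only.
events = [
    (date(2024, 6, 3), True),
    (date(2024, 6, 3), False),
    (date(2024, 6, 4), True),
    (date(2024, 6, 4), True),
    (date(2024, 6, 4), False),
]

# Acceptance rate: suggestions accepted divided by suggestions offered.
accepted = sum(1 for _, ok in events if ok)
acceptance_rate = accepted / len(events)  # 3 of 5 → 0.6

# Code acceptance trend: acceptances grouped by day.
daily_accepted = defaultdict(int)
for day, ok in events:
    if ok:
        daily_accepted[day] += 1
```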
GitHub Copilot
LinearB tracks GitHub Copilot using GitHub’s official Usage API.
Organization-level Copilot metrics may appear in AI Insights even if no users are shown as connected in the Users page. User-level visibility depends on Copilot usage metrics being enabled in GitHub and successful GitHub-to-LinearB identity mapping.
- Active Users – Users with any Copilot activity in a given day (for example, receiving a suggestion, accepting a suggestion, or prompting chat). Authentication-only events are excluded.
- Engaged Users – Users who actively interacted with Copilot features (for example, accepting suggestions, prompting chat, or triggering a PR Summary). Authentication-only events are excluded.
- Acceptance Rate – Percentage of Copilot suggestions accepted into code.
- Code Acceptance – Trend of accepted Copilot-generated code over time, grouped by day.
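GitHub's Copilot metrics API (`GET /orgs/{org}/copilot/metrics`) returns one object per day; the distinction between active and engaged users maps onto fields in that response. The payload below is an abbreviated, made-up sample shaped like that response, trimmed to the fields discussed above, not real data.

```python
import json

# Abbreviated sample shaped like a GitHub Copilot metrics API response.
# Field names follow the public API; the values are illustrative only.
payload = json.loads("""
[
  {"date": "2024-06-03", "total_active_users": 42, "total_engaged_users": 30},
  {"date": "2024-06-04", "total_active_users": 45, "total_engaged_users": 33}
]
""")

for day in payload:
    # Active users had any Copilot activity that day; engaged users
    # interacted with a feature (accepted a suggestion, prompted chat, etc.).
    print(day["date"], day["total_active_users"], day["total_engaged_users"])
```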
Cursor
- Active Users – Developers with Cursor activity in a given day.
- Code Acceptance – Number of AI-generated code lines accepted, grouped by day.
Claude
LinearB tracks Claude usage when connected via the supported Anthropic API integration.
- Active Users – Developers with recorded Claude coding activity during the selected period. Authentication-only events are excluded.
- Code Acceptance – Accepted Claude-generated code over time, grouped by day.
Claude accessed via AWS Bedrock is not currently supported.
Deeper visibility into coding behavior
The AI Tools Usage panel provides drilldowns so you can move from high-level percentages to underlying counts and trends.
- Active Users (%) – Percentage of developers actively using the selected AI tool. Click to see the number of users not using the tool.
- Acceptance Rate (%) – How often AI suggestions are accepted. Click to see the breakdown of accepted vs. rejected suggestions.
- Lines Written – Trend of accepted AI-generated lines over time. Hover or click to see the number of lines written by date.
Example: If Active Users are high but Acceptance Rate is low, developers may be experimenting without trusting suggestions yet. If both trend upward, the tool is delivering measurable value.
Adoption funnel
The panel also supports an adoption funnel view:
- Are developers trying the tool?
- Are its suggestions being trusted?
- Is it driving meaningful code volume?
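The three funnel questions map onto the panel's metrics. A minimal sketch, assuming hypothetical per-tool rollup values; the thresholds are illustrative and are not LinearB defaults.

```python
def adoption_funnel(active_users_pct, acceptance_rate, accepted_lines_delta):
    """Map panel metrics to the three funnel questions.

    Thresholds here are illustrative placeholders, not LinearB defaults.
    """
    return {
        # Are developers trying the tool?
        "trying": active_users_pct >= 0.5,
        # Are its suggestions being trusted?
        "trusting": acceptance_rate >= 0.3,
        # Is it driving meaningful code volume?
        "delivering": accepted_lines_delta > 0,
    }

stages = adoption_funnel(0.7, 0.45, 120)
```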
If a developer uses an AI tool as a co-author, their work is included as AI-assisted. Chatbot activity is not included.