Understanding the Differences Between PR Size and Code Changes in LinearB

Understand the key differences between PR Size and Code Changes in LinearB to accurately measure development effort, track modifications, and optimize your engineering workflows.

WorkerB Inline Approval

Users of WorkerB can now receive previews of PRs with fewer than 5 lines of code changes directly in Slack. Approving a Pull Request: when a small PR is assigned to you, an alert…

LinearB Metrics Glossary

A complete guide to how LinearB defines and calculates core engineering metrics. Use this glossary to align your teams on what each metric means, how it is calculated, and why it matters. Delivery Me…

Release Notes - November 2025

Improved accuracy for AI Code Reviews. Upgrade to Claude Sonnet 4.5: the latest Claude AI model, Sonnet 4.5, now powers the LinearB AI Code Review. With this upgrade, our internal testbench score impr…

Configuring GitHub Copilot

Track GitHub Copilot adoption, usage, and acceptance rates across your teams. Connect Copilot to LinearB to understand how AI-generated code influences delivery and quality.

New Code Metric

New Code measures the percentage of added lines relative to total code changes in merged pull requests across the selected time range.
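The calculation described above can be sketched in a few lines. This is a minimal illustration, assuming "total code changes" means added plus deleted lines across merged PRs; the field names are hypothetical and not LinearB's API.

```python
# Hypothetical sketch of the New Code calculation, assuming "total code
# changes" = added + deleted lines in merged PRs over the selected range.
# The "added"/"deleted" field names are illustrative, not LinearB's API.

def new_code_percentage(merged_prs):
    """Percentage of added lines relative to total line changes."""
    added = sum(pr["added"] for pr in merged_prs)
    total = added + sum(pr["deleted"] for pr in merged_prs)
    return 100.0 * added / total if total else 0.0

prs = [{"added": 120, "deleted": 30}, {"added": 80, "deleted": 20}]
print(round(new_code_percentage(prs), 1))  # 80.0
```

For example, two merged PRs with 200 added and 50 deleted lines in total yield a New Code value of 80%.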

Configuring Cursor

Measure Cursor AI usage across repositories and developers. This integration surfaces request activity, code suggestions, and acceptance metrics to help you assess AI effectiveness.

AI Tools Usage

Analyze how developers use AI coding tools like GitHub Copilot and Cursor. This view surfaces active users, acceptance rates, and code contribution trends so you can evaluate trust, engagement, and real productivity impact across teams.

AI Iteration Summary for Teams

The AI Team Iteration Summary in Iterations provides a concise, AI-generated overview of your team’s performance across a completed iteration. It highlights key accomplishments, delivery insights, an…

Understanding the Iterations View

Get full visibility into your team’s progress and delivery with Iterations. Track real-time work, analyze past iterations, and connect Git activity to project issues for accurate planning, retrospectives, and continuous improvement.

Release Notes - July 2025

Smarter reviews, new AI commands, and stronger security for gitStream.

Feature - AI Insights

AI Insights – Panels & Metrics. AI Insights gives you a unified view of how AI is impacting your engineering workflow. It combines AI-detected issues, adoption signals, tool usage, and repository con…

Understanding Code Changes

General. Code Changes is a metric that measures changes made to the code base over time. It includes either new code or refactored code and is counted in a nu…

Configuring Claude Code

Track Claude Code adoption and usage across your teams. Connect Claude to understand how AI-attributed code activity contributes to delivery insights.

AI Metrics Explained

The AI Metrics Glossary explains the metrics shown in AI Analytics. Use this page to understand what each metric measures, what it indicates, and how to interpret changes over time. Metrics may vary…