Get Started - LinearB Essentials Setup Guide
This guide walks you through configuring LinearB Essentials. The Essentials plan focuses on real-time Git visibility, AI tooling insights, and automation using gitStream — with optional API-based DORA configuration.
Milestone 1 — Git Configuration
1. Connect & Configure Git
Why this matters
Git is the foundation for all Essentials features — including AI activity tracking, PR analysis, and automations.
Connection note (Essentials):
Your primary Git provider is connected during initial login.
This step is about reviewing and refining your Git configuration so your data is clean and accurate.
Review Steps
- Go to Settings → Integrations → Git.
- Confirm the correct Git provider is shown (e.g., GitHub).
- Review the repository list and ensure all repos you want LinearB to track are selected.
- If you use on-prem or self-hosted Git, confirm any required webhooks are installed.
Recommended Configurations
1. Set Branch Exclusions
Why
Excluding test and other non-production branches that shouldn’t count toward metrics improves data accuracy.
Defaults
^develop$
^development$
^master$
^staging$
^integration$
^release$
^production$
^main$
Custom Examples
test/*
demo/*
sandbox/*
Where to configure
Set the organization-wide branch exclusion configuration
- In LinearB, go to Company Settings → Advanced.
- Scroll to the Exclude Branches section and tell LinearB which branches to ignore by adding one regular expression per line that describes the branch patterns to exclude.
Override branch exclusion configuration per repo
Individual repositories can use a different branch exclusion configuration than the organization default.
- Go to Settings → Git.
- Select the gear icon to the right of the repo you'd like to configure differently.
- In the Exclude Branches section, tell LinearB which branches to ignore by adding one regular expression per line that describes the branch patterns to exclude.
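Before saving, it can help to sanity-check your patterns locally against real branch names. The sketch below is illustrative only (it is not part of LinearB) and uses Python’s re module; the same quick check works for the file-exclusion patterns in the next step. LinearB’s exact matching semantics may differ slightly, so treat this purely as a pre-check. Note that these fields take regular expressions, so the glob-style custom examples above would be written as patterns like test/.* in regex form.

```python
import re

# A few of the default patterns plus one illustrative custom pattern.
exclusion_patterns = [
    r"^develop$",
    r"^main$",
    r"^release$",
    r"test/.*",   # illustrative regex form of a "test/*"-style exclusion
]

branches = ["main", "feature/login", "test/load-suite", "release", "releases/v2"]

for branch in branches:
    # Anchored patterns (^...$) only exclude exact names, e.g. "release" but not "releases/v2".
    excluded = any(re.search(pattern, branch) for pattern in exclusion_patterns)
    print(f"{branch:20} -> {'excluded' if excluded else 'tracked'}")
```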
2. Set File Exclusions
Why
Prevents non-code files from affecting PR size, new code %, and overall accuracy.
Defaults
.*package-lock\.json$
.*dist/.*\.js
.*public/assets/.*\.js
.*\.(ini|csv|xlsx|txt|doc|rtf|ipynb|resx)$
Where to configure
Set the organization-wide file exclusion configuration
- In LinearB, go to Company Settings → Advanced.
- In the Exclude File Extensions section, tell LinearB which file extensions to ignore by adding one regular expression per line that describes the file extension patterns to exclude.
Override file exclusion configuration per repo
Individual repositories can use a different file extension configuration than the organization default.
- Go to Settings → Git.
- Select the gear icon to the right of the repo you'd like to configure differently.
- In the Exclude File Extensions section, tell LinearB which file extensions to ignore by adding one regular expression per line that describes the file extension patterns to exclude.
3. Configure Draft Pull Requests
Why
Draft Pull Requests ensure that early or incomplete work does not distort Cycle Time,
Pickup Time, or Review Time. LinearB automatically excludes draft PRs from key metrics
until they are ready for review, giving you a cleaner and more accurate view of delivery performance.
How LinearB identifies draft PRs
LinearB can treat a pull request as draft when it matches either (or both) of the following:
- Has one of your configured labels for draft work.
- Has a title that matches a regular expression pattern you define.
Where to configure Draft Pull Requests
Set the organization-wide draft pull request configuration
- In LinearB, go to Company Settings → Advanced.
- Scroll to the Draft Pull Requests section.
Override draft pull request configuration per repo
Individual teams can use a different draft PR configuration than the organization default.
- Go to Settings → Git.
- Select the gear icon to the right of the repo you'd like to configure differently.
- In the Draft Pull Requests section, adjust the labels and title pattern for that repo.
Steps — Configure labels and title pattern
- Add the label(s) you use to identify draft work.
- Specify a regular expression that matches the titles of your draft work. For example: wip, do_not_merge, work in progress, work_in_progress.
- Click Save to apply your changes.
Note: Changes to draft PR settings are applied from the time of the saved configuration, moving forward.
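If you rely on the title pattern, a single case-insensitive expression can cover the example keywords above. The pattern below is only an illustration (it is not a LinearB default), and the snippet simply shows how such a pattern behaves against a few sample titles.

```python
import re

# Illustrative pattern only: matches titles containing wip, do_not_merge,
# work in progress, or work_in_progress (case-insensitive, word-bounded).
draft_title_pattern = re.compile(
    r"\b(wip|do[_ ]?not[_ ]?merge|work[_ ]?in[_ ]?progress)\b", re.IGNORECASE
)

titles = [
    "WIP: add billing page",
    "[do_not_merge] spike on caching",
    "Work in progress: refactor auth",
    "Add billing page",
]

for title in titles:
    is_draft = bool(draft_title_pattern.search(title))
    print(f"{title!r:40} -> {'draft' if is_draft else 'ready'}")
```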
Best practices
- Standardize on one or two labels (for example, wip and do_not_merge) across all teams.
- Use labels for clarity, and the title pattern as a safety net for ad-hoc “WIP” titles.
- Review your existing PR conventions before changing these settings, so historical behavior remains intuitive for developers.
4. Bot exclusions
Why: Prevents CI/CD bots or automation accounts from appearing as “active contributors” or inflating commit counts.
How:
- Contact support and provide the email addresses of the bot accounts you want excluded.
- Coming Soon: Bot exclusion will soon be configurable in-product.
5. Global metrics calculation
Why
Determines how PR Size, Cycle Time, Pickup Time, Review Time, and Deploy Time aggregate.
Options & Examples
- Average – Sum ÷ count.
  Example: 4 PRs took 2, 4, 6, and 8 hours → (2 + 4 + 6 + 8) ÷ 4 = 5 hours average.
- Median – Middle value.
  Example: PR merge times are 2, 4, 6, 8, 10 → Median = 6 hours.
- 75th Percentile – 75% of PRs are faster than or equal to this.
  Example: Most PRs take 2–6 hours but a few take 10+ → 75th percentile ≈ 6 hours (captures the “typical upper range”).
- 90th Percentile – 90% of PRs are faster than or equal to this.
  Example: Most PRs take 2–8 hours, but a few outliers take 20+ → 90th percentile ≈ 8 hours (shows performance excluding rare outliers).
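The difference between these methods is easy to see with a few lines of Python. This sketch uses the standard library’s statistics module on a made-up set of cycle times; the percentile interpolation LinearB uses internally may differ slightly from the method chosen here.

```python
from statistics import mean, median, quantiles

# Example cycle times in hours, including a couple of slow outliers.
cycle_times = [2, 3, 4, 4, 5, 6, 6, 8, 20, 30]

# quantiles(n=100) returns the 1st..99th percentile cut points.
percentiles = quantiles(cycle_times, n=100, method="inclusive")

print(f"Average:         {mean(cycle_times):.1f} h")   # pulled up by the outliers
print(f"Median:          {median(cycle_times):.1f} h") # ignores the outliers
print(f"75th percentile: {percentiles[74]:.1f} h")
print(f"90th percentile: {percentiles[89]:.1f} h")
```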
How to choose the right calculation method
Different engineering teams prefer different aggregation strategies depending on their size, workflow variability, and how sensitive they are to outliers. Below is guidance to help you choose the method that best reflects your team’s “true” performance.
- Average (Mean)
  Best when your team’s PR sizes and cycle times are relatively consistent.
  Why choose it: You want a simple, familiar metric and your data does not contain extreme outliers.
- Median
  A stable representation of your “typical” PR because it ignores outliers on both ends.
  Why choose it: Your workflow produces occasional very large or unusual PRs, and you don’t want those to skew the overall metric.
- 75th Percentile (Industry Standard)
  Captures the performance of most PRs while still reflecting friction in the system.
  Why choose it: You want a balanced view that smooths outliers but still highlights when work takes longer than expected. Most DORA-focused teams prefer this.
- 90th Percentile
  Highlights the slowest part of your process without being dominated by true one-off outliers.
  Why choose it: You want more sensitivity to delays—great for teams optimizing flow efficiency or monitoring tail behavior.
Where to configure
Configure at Company Settings → General → Metrics Calculation.
Outcome: Repos are correctly selected, noisy branches/files/bots are excluded, and your metrics are aggregated in a stable, consistent way.
Milestone 2 — User Management
1. User Access & SSO
Why
Proper user and permission configuration ensures the right team members can see dashboards, AI insights, and manage automations.
User Roles
When creating or modifying user permissions, ensure all active users have the correct access. See the permissions matrix below for reference.
User Permissions Matrix
| Access | Admin | Editor | Viewer | Basic |
|---|---|---|---|---|
| Company Settings | View/Edit | No Access | No Access | No Access |
| Connect Git/PM/Slack/MS Teams | View/Edit | No Access | No Access | No Access |
| User Settings | View/Edit/Invite | No Access | No Access | No Access |
| Teams Settings & Team Contributors | View/Write | View/Write* | No Access | No Access |
| Dashboards | View/Write | View/Write* | View* | No Access |
| Projects Tab | View | View* | View* | No Access |
| Teams Tab | View/Write | View/Write* | View/Write* | No Access |
| Surveys | View/Create/Delete | Response | Response | Response |
| Metrics Tab / Reports | Public & Private Reports View/Create/Delete | Private Reports View/Create/Delete | View | No Access |
| Project Delivery Trackers | View/Create/Edit/Delete | View/Create/Edit/Delete | View | No Access |
| Resource Allocation | View/Edit | View | No Access | No Access |
| Investment Strategy Report | View/Edit | View | View | No Access |
* Editors cannot create new teams; they can only edit existing teams.
* Access for assigned teams only.
Adding New Users
- Go to Settings → Users & Teams.
- Select Users from the toggle at the top.
- Click + Add User.
- Add the user’s name, email, and role.
- Assign team membership (if applicable).
- Save & Close.
Managing Existing Users
- Edit a user’s profile via the ⋮ menu → Edit Profile.
- Add or remove team membership.
- Update permission level.
Merging Duplicate Users
If users appear multiple times due to multiple email identities:
- Open Users view.
- Select the primary account from the list.
- Click the ⋮ → Merge Account.
- Select the duplicate user.
Milestone 3 — AI Tools & Automations
1. Enable LinearB AI Services & AI Iteration Summary
Why
Essentials includes AI-powered features that help leaders understand team performance and improve delivery.
To unlock these capabilities, both AI Services (org-wide) and AI Iteration Summary (team-level) must be enabled.
Step 1 — Enable LinearB AI Services (Org-Level)
Who can do this: Company Admin
This toggle activates AI capabilities across your LinearB organization, enabling:
- gitStream AI actions (AI code review, PR descriptions, summaries)
- AI Iteration Summary (automated sprint summaries delivered to Slack or MS Teams)
- AI-enhanced insights in dashboards and reports
Steps
- Go to Company Settings → General.
- Scroll to the AI Services section.
- Toggle Enable LinearB AI Services to ON.
Effects of disabling or not enabling AI Services
If you turn off AI Services, the following behavior applies across LinearB:
- AI Automations (AI Review, AI Description) stop running.
- AI Insights continues to display data, but LinearB-generated AI metrics (AI Review, AI Description, Iteration Summary) no longer populate.
- GitHub Copilot and Cursor metrics dashboards continue reporting activity, since they use external APIs rather than LinearB AI Services.
- Dependabot automations will not run.
- Other dashboards (DORA, Delivery, Quality, Throughput) continue to function, but without AI-driven enhancements (e.g., AI Iteration Summary).
Step 2 — Enable AI Iteration Summary Notifications (Team-Level)
Who can do this: Company Admin or Team Admin
AI Iteration Summary automatically generates insights for each team’s iteration, including progress trends, risk indicators, PR activity, and blockers. Summaries are delivered to Slack or MS Teams automatically at the end of each iteration cycle.
Steps
- Open Settings in the LinearB app.
- Select the team from the Teams list.
- Go to the Insights tab.
- Toggle AI Iteration Summary Notification to ON.
Requirements
- Slack or MS Teams must be connected (see Communication Apps section).
- The team must have iteration dates defined (Scrum teams).
Example AI Iteration Summary
Team Iteration Summary – Team Autobots
12 Completed | 4 In Progress | 3 Carryover
1 blocker flagged this iteration
Velocity down 8% vs. previous sprint
[View Full Report]
2. AI Tools Configuration
Why
Essentials provides visibility into AI-assisted development, so you can measure usage, adoption, and the downstream impact of AI-generated code.
Supported AI tools
- GitHub Copilot
- Cursor
Where to configure
Go to Company Settings → AI Tools to enter API tokens for Copilot and/or Cursor. Once configured, LinearB automatically detects AI contributions and attributes them to developers.
Ensure the required scopes are checked:
read:org,
read:user,
manage_billing:copilot
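As a quick way to confirm a Copilot token carries the scopes listed above before pasting it into LinearB, you can call GitHub’s Copilot billing endpoint directly. This is a local sanity check against GitHub’s REST API, not a LinearB API; YOUR_ORG and the token environment variable are placeholders, and you should consult GitHub’s documentation if the endpoint or required scopes have changed.

```python
import os
import requests

ORG = "YOUR_ORG"                    # placeholder: your GitHub organization slug
TOKEN = os.environ["GITHUB_TOKEN"]  # token with read:org, read:user, manage_billing:copilot

# GitHub REST endpoint for organization Copilot billing/seat settings.
resp = requests.get(
    f"https://api.github.com/orgs/{ORG}/copilot/billing",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/vnd.github+json",
    },
    timeout=30,
)

if resp.status_code == 200:
    print("Token can read Copilot billing data.")
else:
    print("Check token scopes:", resp.status_code, resp.text)
```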
Outcome: AI activity is tracked per developer, included in PR analysis, and available to gitStream rules.
3. gitStream: Managed Mode vs Self-Managed Mode
Why
gitStream powers automation in Essentials, including PR labeling, AI review, automated approvals, and low-noise workflows.
Modes
Managed Mode (Default)
This mode requires no YAML configuration. Automations are defined directly inside LinearB. Managed Mode runs on the LinearB runner and applies to all selected repositories.
Where to enable:
Go to Company Settings → AI Tools → Managed Automations.
Available automations include:
- AI Review
- AI Description
- Label Agents as Co-authors
- Estimated Time to Review
- Dependabot Minor Bump Approvals
- Dependabot Patch Bump Approvals
Outcome: One-click automation with no setup required.
Self-Managed Mode
Ideal for teams who want full control over automation logic using gitStream YAML files. Automations run directly on your Git provider and follow explicit/implicit triggers.
How it works
- Create a .cm directory in your repository.
- Add YAML automation files (e.g., auto-review.yml, pr-policy.yml).
- Configure triggers and conditions.
- Commit and push.
Full setup guides:
gitStream Hub (Installation Guides)
Outcome: Deep, customizable automations fully controlled in your Git repos.
Milestone 4 — DORA Metrics (Optional)
1. Configure DORA Metrics
Why
Essentials supports DORA metrics when you provide deployment and/or incident data.
MTTR and CFR require incident data via API, but deployment detection itself can be configured in multiple ways.
DORA metrics available in Essentials:
- Deployment Frequency
- Lead Time for Changes (Cycle Time)
- Change Failure Rate (CFR) — requires incidents via API
- Mean Time to Recovery (MTTR) — requires incidents via API
4.1 Release Detection (Required for Deployment Metrics)
LinearB supports four methods to detect releases. Choose the one that matches your deployment workflow.
Configure Release Detection (4 supported methods)
Why
Releases define Deployment Frequency, Lead Time, and accurate cycle-time closure.
LinearB release detection triggers:
Method A — Release by Tag (default)
- Most popular option if you tag your releases.
- Triggered when a Git push occurs to a tagged commit.
- Every commit appearing in the tagged branch is considered released.
- Supports regular expressions (regex) to filter tag names by prefix/suffix.
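If you filter tags by prefix or suffix, an anchored expression keeps pre-release or nightly tags out of your deployment counts. The pattern below is only an example of what such a filter might look like; adjust it to your own tagging convention.

```python
import re

# Example filter: semantic-version tags like v1.4.2, excluding pre-releases.
release_tag_pattern = re.compile(r"^v\d+\.\d+\.\d+$")

tags = ["v1.4.2", "v2.0.0", "v2.1.0-rc1", "nightly-2024-05-01"]
for tag in tags:
    matched = bool(release_tag_pattern.match(tag))
    print(f"{tag:22} -> {'release' if matched else 'ignored'}")
```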
Method B — Release by PR to a dedicated branch
Note: This method causes Deploy Time to register as 0 (deploy time from PR approval to deploy is effectively skipped).
- LinearB listens for PRs merged into dedicated branch(es) to define a release.
- Any PR merged to a release branch (for example, main) is treated as a deploy.
- Set which branch(es) LinearB should listen to using a regex expression.
- Default expression: ^(main|master|production|prod)$
Method C — Release by direct merge to a branch
- Used when your org merges directly into a dedicated release branch without using PRs.
- LinearB detects the merged commit and marks included branches as deployed at time of merge.
- Set which branch(es) to listen to by regex.
- Default expression: ^(main|master|production|prod)$
Method D — Release by Deployment API
- Report deployments and release events from your CI/CD system.
- Best for teams relying on CI/CD for release management and seeking deployment metric accuracy.
- Supports multi-stage release detection with LinearB’s custom stages.
- Deployments are matched to merged branches using Git ancestry (i.e., is this branch’s commit an ancestor of the deployed commit?).
Note: Please follow this link to learn more about the Deployment API.
Where to configure release detection
Set the organization-wide release detection method
- In LinearB, go to Company Settings → Advanced.
- In the Release Detection section, select your desired release detection method.
- Click Save to apply your changes.
Override release detection per team
Individual teams can use a different release detection method than the organization default. For example, the org may use tag-based releases, while a specific team uses the Deployment API.
- In LinearB, select the team that should use a different method from the team selector (top right).
- Go to Settings → Team Settings.
- On the Team Settings page, open the Advanced tab.
- In the Release Detection section, select the desired release detection method for this team.
- Click Save to apply your changes.
Outcome: Deployment Frequency and Lead Time metrics begin populating once releases are detected.
4.2 Incident Tracking (Required for CFR & MTTR)
Important: Essentials supports incident tracking through API only.
To unlock:
- Change Failure Rate (CFR)
- Mean Time to Recovery (MTTR)
Where to configure:
Go to Company Settings → Incident Detection and select API Integration.
To set up the Incident API:
- Generate a LinearB API token for your workspace under Company Settings → API Tokens.
- In Company Settings → Incident Detection, confirm API Integration is selected and saved.
- Configure your incident tool or pipeline (PagerDuty, Opsgenie, custom system, etc.) to send incident events to the LinearB Incident API endpoint using your API key.
- Send:
- a create call when the incident is opened, and
- update calls as work starts and finishes (including started_at and ended_at timestamps).
You must send incident lifecycle data via the Incident API, including at minimum:
- provider_id (ID from your incident system)
- title
- http_url (link back to the incident)
- issued_at, and later started_at / ended_at (ISO 8601 timestamps)
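Putting the required fields together, a minimal create-then-update flow might look like the sketch below. The field names come from the list above; the endpoint URL, auth header, and HTTP method for updates are placeholders and assumptions, so confirm them against the Incident API documentation referenced in the note below.

```python
import os
from datetime import datetime, timezone

import requests

# Placeholder endpoint and auth header: confirm both against the Incident API docs.
INCIDENTS_URL = "https://public-api.linearb.io/api/v1/incidents"
HEADERS = {"x-api-key": os.environ["LINEARB_API_KEY"]}

opened_at = datetime.now(timezone.utc).isoformat()

# 1) Create the incident when it is opened.
create_payload = {
    "provider_id": "INC-1042",  # ID from your incident system
    "title": "Checkout errors after deploy",
    "http_url": "https://your-incident-tool.example.com/incidents/INC-1042",
    "issued_at": opened_at,
}
requests.post(INCIDENTS_URL, json=create_payload, headers=HEADERS, timeout=30).raise_for_status()

# 2) Update it as work starts and finishes.
#    The HTTP method used for updates (PATCH here) is an assumption; check the docs.
update_payload = {
    "provider_id": "INC-1042",
    "started_at": opened_at,
    "ended_at": datetime.now(timezone.utc).isoformat(),
}
requests.patch(INCIDENTS_URL, json=update_payload, headers=HEADERS, timeout=30).raise_for_status()
```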
For CFR accuracy, LinearB automatically maps incidents to deployments using commit SHA ancestry and your chosen release detection method (API, tags, branches, or GitHub deployments).
Note: Please follow this link to learn more about the Incident API.
Outcome: CFR and MTTR populate once incidents and releases are both provided.
4.3 How LinearB Maps PRs → Deployments → Incidents
- LinearB evaluates the commit SHA or ref_name of each deployment.
- For each branch in the merged state, LinearB checks ancestry using Git’s is_ancestor logic.
- If the PR’s tip commit is included in the deployed ref, the PR is marked as deployed.
- If an incident is linked to a deployment, CFR increments and MTTR is calculated.
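The ancestry check described above is the same idea as Git’s own merge-base test. The sketch below is a conceptual illustration using the git CLI from Python, not LinearB’s implementation: it asks whether a PR’s tip commit is contained in a deployed ref. The SHAs and ref names are hypothetical.

```python
import subprocess

def is_deployed(pr_tip_sha: str, deployed_ref: str, repo_path: str = ".") -> bool:
    """Return True if pr_tip_sha is an ancestor of (i.e., contained in) deployed_ref."""
    # `git merge-base --is-ancestor A B` exits with code 0 when A is an ancestor of B.
    result = subprocess.run(
        ["git", "-C", repo_path, "merge-base", "--is-ancestor", pr_tip_sha, deployed_ref],
        capture_output=True,
    )
    return result.returncode == 0

# Example: was the PR's last commit included in the v1.4.2 deploy?
print(is_deployed("abc1234", "v1.4.2"))
```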
Outcome: Accurate DORA metrics with zero manual mapping.
Verify Essentials Setup
Checklist
You should now see the following working as expected:
- Git integration syncing
- Correct branch and file exclusions applied
- User access configured
- SSO working (if enabled)
- gitStream installed and active
- AI usage data appearing for Copilot/Cursor
- DORA data showing in Metrics → Metrics Dashboard
Your Essentials environment is now fully configured.
Next steps:
Get Started - LinearB Enterprise Setup Guide
LinearB Metrics Glossary