API - Measurements v2
Use the Measurements V2 API to export metrics that LinearB has already computed from your Git data. You can use these reports to build custom dashboards, power external analytics, or feed data into your own BI tools.
TL;DR
- Endpoint (JSON report): POST https://public-api.linearb.io/api/v2/measurements
- Endpoint (export to file): POST https://public-api.linearb.io/api/v2/measurements/export
- Reports support grouping and filtering by organization, team, contributor, repository, and services.
- Only Git-based metrics are supported; project management (PM) metrics like Velocity and Investment Profile are not included.
- If the filters return no data, the API responds with 204 No Content and no report is generated.
Overview
Measurements V2 lets you retrieve Git-derived metrics that LinearB has already processed, so you can:
- Create custom reports and dashboards.
- Export metrics to CSV or JSON for use in external tools.
- Slice metrics by organization, team, contributor, repository, and labels.
Limitations:
- Measurements V2 is currently limited to Git metrics.
- PM metrics such as Velocity, Investment Profile, and Time Distribution are not supported.
The API works in two modes:
- Integration (JSON): Generate a report and receive the data directly in the JSON response.
- Export (file): Generate a report and export it as a JSON or CSV file to S3. The API returns a URL for download.
Important: If your filters result in no data, LinearB returns 204 No Content and does not generate a report.
Before You Begin
Authentication
All API requests require an API token.
- Go to Company Settings → API Tokens.
- Create (or reuse) an API token.
- Include it in every request header:
Authorization: Bearer <your_api_token>
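The header setup above can be sketched with Python's standard library. The helper below only builds an authenticated request (it does not send it); the path and "your_api_token" are placeholders:

```python
import json
import urllib.request

API_BASE = "https://public-api.linearb.io"

def build_request(path, token, body):
    """Build an authenticated POST request for the LinearB public API.

    The request is only constructed here; pass it to urllib.request.urlopen
    to actually send it.
    """
    return urllib.request.Request(
        url=API_BASE + path,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",  # token from Company Settings -> API Tokens
            "Content-Type": "application/json",
        },
        method="POST",
    )

# "your_api_token" is a placeholder, not a real credential.
req = build_request("/api/v2/measurements", "your_api_token", {"requested_metrics": []})
```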
Supported data
Measurements V2 provides access to Git-based delivery and quality metrics, such as cycle time, time-to-production, PR volume, code changes, releases, and select incident-related values.
See the Supported metrics section below for exact metric names and aggregation options.
1. Create measurements report (JSON)
Use this endpoint when you want to retrieve measurements directly in the response as JSON. This is typically used for integrations and programmatic analysis.
HTTP request
POST https://public-api.linearb.io/api/v2/measurements
Body parameters
Required
- requested_metrics (array of objects) – the list of metrics to include in the report.
  - Each object includes at least name, and optionally an aggregation (agg).
  - Valid metric names are listed in Supported metrics.
Optional
- group_by (string) – field to group results by.
  - Valid values: "organization", "team", "contributor", "repository", "label".
  - Label grouping is limited to a maximum of 10 repositories and 3 PR labels.
- order_by (string) – field to sort by.
- contributor_ids (array of integers) – filter by specific contributors.
- repository_ids (array of integers) – filter by specific repositories.
- team_ids (array of integers) – filter by specific teams.
- service_ids (array of integers) – filter by specific services.
- labels (array of strings) – filter or group by PR labels.
  - Restricted to 3 labels.
  - If group_by = "label", results are grouped by the specified labels. Otherwise, labels act as filters.
- time_ranges (array of objects) – one or more time windows for the report.
  - Each object must include after and before as date strings (e.g. "2022-05-27").
  - Use multiple ranges to produce a series of periods in one call.
- roll_up (string) – how to break down data over time.
  - Examples: "2d" (two days), "1w" (one week), "1m" (one month), or "custom".
  - "custom" uses your time_ranges as the intervals.
- limit (integer) – maximum number of records to return.
  - When requesting multiple contributors, teams, or repositories, limit should be at least the number of IDs; otherwise, results may be partial.
- offset (integer) – offset for pagination.
- return_no_data (boolean) – if true, returns nullable values instead of dropping entries with no data.
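The parameter rules above can be applied when assembling a request body programmatically. The sketch below is illustrative, not an official client: the helper name is this example's own, and the limit-raising behavior simply encodes the documented rule that limit should be at least the number of requested IDs.

```python
def build_measurements_body(requested_metrics, time_ranges,
                            group_by="organization", roll_up="custom", **filters):
    """Assemble a Measurements V2 request body (illustrative helper).

    Any *_ids filter passed as a keyword is copied into the body, and `limit`
    is raised to the ID count so results are not silently truncated.
    """
    body = {
        "group_by": group_by,
        "roll_up": roll_up,
        "requested_metrics": requested_metrics,
        "time_ranges": time_ranges,
    }
    body.update(filters)
    # Per the docs: limit should be at least the number of IDs requested.
    id_count = max(
        (len(v) for k, v in filters.items() if k.endswith("_ids")),
        default=0,
    )
    if id_count and body.get("limit", 0) < id_count:
        body["limit"] = id_count
    return body

body = build_measurements_body(
    requested_metrics=[{"name": "branch.time_to_prod", "agg": "p50"}],
    time_ranges=[{"after": "2022-05-27", "before": "2022-06-29"}],
    group_by="team",
    team_ids=[5273, 58],
)
```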
Supported metrics
The following metrics can be requested in requested_metrics. For some metrics, you may specify an
aggregation (agg) such as "p75", "p50", or "avg".
| Name | Aggregation | Description | Units |
|---|---|---|---|
| branch.computed.cycle_time | p75, p50, avg | Full cycle time (coding + pickup + review + time to production). | minutes |
| branch.time_to_pr | p75, p50, avg | Time from first commit to PR creation (coding time). | minutes |
| branch.time_to_review | p75, p50, avg | Time from PR creation to review start (pickup time). | minutes |
| branch.review_time | p75, p50, avg | Time spent in active review. | minutes |
| branch.time_to_prod | p75, p50, avg | Time from PR merge to deployment (time to production). | minutes |
| pr.merged.size | p75, p50, avg | Total size of merged PRs. | lines of code |
| pr.merged | — | Number of PRs that were merged. | count |
| pr.review_depth | — | Average review comments per PR. | comments per PR |
| commit.activity.new_work.count | — | Total new lines of code. | count |
| commit.total_changes | — | Total lines of code changed. | lines of code |
| commit.activity.refactor.count | — | Lines of code replaced that are older than a threshold (refactor work). | lines of code |
| commit.activity.rework.count | — | Lines of code replacing recent code (rework). | lines of code |
| pr.merged.without.review.count | — | Number of PRs merged without review. | count |
| commit.total.count | — | Total commit count. | count |
| pr.new | — | Number of opened PRs. | count |
| pr.reviews | — | Number of PR reviews. | count |
| releases.count | — | Number of releases. | count |
| commit.activity_days | — | Number of days with developer activity (commits, comments, PRs, reviews). | days |
| branch.state.computed.done | — | Number of branches that reached done state. | count |
| branch.state.active | — | Number of active branches. | count |
| pm.mttr | — | Mean time to repair. | minutes |
| pm.cfr.issues.done | — | Number of issues considered incidents that reached a done state. | count |
Examples
Create measurements report with custom time ranges
{
"group_by": "organization",
"roll_up": "custom",
"requested_metrics": [
{
"name": "branch.computed.cycle_time",
"agg": "p75"
},
{
"name": "releases.count"
}
],
"time_ranges": [
{ "after": "2022-05-27", "before": "2022-05-29" },
{ "after": "2022-05-30", "before": "2022-06-05" },
{ "after": "2022-06-06", "before": "2022-06-12" },
{ "after": "2022-06-13", "before": "2022-06-19" }
]
}
Create measurements report with a single time window
{
"group_by": "organization",
"roll_up": "1w",
"requested_metrics": [
{
"name": "branch.computed.cycle_time",
"agg": "p75"
},
{
"name": "branch.computed.cycle_time",
"agg": "avg"
}
],
"time_ranges": [
{
"after": "2022-05-27",
"before": "2022-06-29"
}
]
}
Create measurements report for specific repositories
{
"group_by": "organization",
"roll_up": "1d",
"repository_ids": [
456801317,
1235235
],
"requested_metrics": [
{
"name": "branch.time_to_prod",
"agg": "p50"
}
],
"time_ranges": [
{
"after": "2022-05-27",
"before": "2022-06-29"
}
]
}
Create measurements report for specific teams
{
"group_by": "team",
"roll_up": "1mo",
"team_ids": [
5273,
58
],
"requested_metrics": [
{
"name": "branch.time_to_prod",
"agg": "p50"
},
{
"name": "branch.time_to_pr",
"agg": "avg"
}
],
"time_ranges": [
{
"after": "2022-05-27",
"before": "2023-06-29"
}
]
}
Responses
200 — Successful response
Returns an array of time windows, each with its own metrics object.
[
{
"after": "2022-05-27",
"before": "2022-05-29",
"metrics": [
{
"organization_id": 1697464851,
"branch.computed.cycle_time:p75": 2872,
"releases.count": 8
}
]
},
{
"after": "2022-05-30",
"before": "2022-06-05",
"metrics": [
{
"organization_id": 1697464851,
"branch.computed.cycle_time:p75": 8048,
"releases.count": 35
}
]
}
]
Each metric key combines the metric name and its aggregation, separated by a colon (for example, branch.computed.cycle_time:p75). Metrics requested without an aggregation appear under the bare metric name (for example, releases.count).
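Because each key is the metric name plus an optional :agg suffix, a consumer can split response rows like this (a minimal sketch; parse_metric_key is this example's own helper, not part of the API):

```python
def parse_metric_key(key):
    """Split a response key like 'branch.computed.cycle_time:p75' into
    (name, agg); keys without an aggregation return agg=None."""
    name, sep, agg = key.partition(":")
    return name, (agg if sep else None)

# Sample row shaped like the 200 response above; *_id keys identify the group.
row = {
    "organization_id": 1697464851,
    "branch.computed.cycle_time:p75": 2872,
    "releases.count": 8,
}
metrics = {
    parse_metric_key(key): value
    for key, value in row.items()
    if not key.endswith("_id")  # skip grouping identifiers
}
```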
Other status codes
- 204 No Content – No data for the specified filters; no report is generated.
- 400 Bad Request – Invalid payload or missing required fields.
- 401 Unauthorized – Authentication failed or token missing.
- 405 Method Not Allowed – HTTP method not supported for this endpoint.
- 422 Validation Error – Field-level validation issues.
- 500 Internal Server Error – Unexpected server-side error.
2. Export measurements report (file)
Use this endpoint to export metrics as a CSV or JSON file. The API returns an S3 URL where you can download the generated report. Files are stored with an expiration policy and may be removed after a certain period.
HTTP request
POST https://public-api.linearb.io/api/v2/measurements/export
Query parameter
- file_format (string, optional) – defines the format of the exported file.
  - Valid values: "json", "csv".
  - Default: "json".
Body parameters
The request body has the same structure as the JSON report endpoint, with the following notes:
- requested_metrics (required, array of objects) – list of metrics to export.
- group_by (string) – field to group by.
  - Valid values: "contributor", "repository", "team", "organization".
- order_by (string) – field to sort by.
- contributor_ids (array of integers) – required if group_by = "contributor".
- repository_ids (array of integers) – required if group_by = "repository".
- team_ids (array of integers) – required if group_by = "team".
- service_ids (array of integers) – required if group_by = "service".
- time_ranges (array of objects) – each range must include both after and before; do not combine with separately defined before and after parameters.
- limit (integer) – maximum number of results; keep it at least as large as the number of IDs you request.
- offset (integer) – pagination offset.
- return_no_data (boolean) – if true, returns rows with null values instead of dropping them.
Responses
200 — Successful response
{
"report_url": "string",
"detail": "string"
}
report_url is the S3 URL where the exported file can be downloaded. The file may expire based on
S3 lifecycle policies.
Other status codes
- 202 Accepted – Report generation request accepted but not ready yet.
- 204 No Content – No content; the report does not exist or has expired.
- 400 Bad Request – Invalid request body or query parameters.
- 401 Unauthorized – Missing or invalid API token.
- 405 Method Not Allowed – HTTP method not supported.
- 422 Validation Error – Validation issues in the request payload.
- 500 Internal Server Error – Server-side error while creating the report.
- 504 Gateway Timeout – Timeout while generating the report.
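The documented status codes can be folded into a small client-side dispatch helper. This is a sketch of one possible handling strategy; the action names ("download", "retry", and so on) are this example's own, not part of the API:

```python
def handle_export_response(status, payload=None):
    """Map an export response to a client action, per the documented codes."""
    if status == 200 and payload and payload.get("report_url"):
        return ("download", payload["report_url"])  # fetch the file from S3
    if status == 202:
        return ("retry", None)        # accepted but not ready; poll again later
    if status == 204:
        return ("regenerate", None)   # report missing or expired; re-run export
    return ("error", None)            # 4xx/5xx: inspect the body, fix the request
```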
Verify & Troubleshooting
Verify a new measurements report
- Send a test request to POST /api/v2/measurements with a simple configuration (for example, one metric and a short time range).
- Confirm you receive a 200 response with at least one time window and metrics object.
- If you receive 204 No Content, relax filters (for example, a wider date range or fewer IDs) and try again.
- Optionally, repeat the same body with POST /api/v2/measurements/export and verify that a report_url is returned.
Quick fixes (most common issues)
- 204 No Content
  - The filters returned no data for the selected time ranges.
  - Try expanding the date range or removing some filters (IDs or labels).
- Partial or truncated results
  - limit may be too low when requesting multiple contributors, teams, or repositories.
  - Increase limit so it is at least as large as the number of IDs you pass.
- 401 Unauthorized
  - Verify your API token is active and belongs to the correct organization.
  - Confirm the Authorization: Bearer <your_api_token> header is present on every request.
- 422 Validation Error
  - Check that required fields like requested_metrics and time_ranges are provided.
  - Ensure each time range includes both after and before in a valid date format.
  - Confirm each metric name is one of the supported metrics.
Advanced troubleshooting (problem → cause → fix)
| Problem | Cause | Fix |
|---|---|---|
| Report returns no metrics | Time ranges or filters do not match any data (for example, before your organization started sending activity), or the requested metrics are not applicable for the selected scope. | Test with a broader time_ranges window and fewer filters. Start with a small set of well-known metrics like releases.count and branch.computed.cycle_time. |
| Unexpected grouping behavior | group_by set to a field that does not match the provided IDs (for example, grouping by team but filtering only by repository_ids), or label grouping used with more than the allowed number of repositories or labels. | Align group_by with the IDs you pass (team_ids, repository_ids, etc.). Respect the limit of 10 repositories and 3 labels when grouping by labels. |
| Export URL no longer works | The exported report has expired based on the S3 lifecycle policy, or the URL was used after its retention period. | Re-run POST /api/v2/measurements/export with the same filters to generate a new report and URL. |
| Gateway timeout (504) on export | A very large report or heavy filters cause the export request to exceed processing time. | Narrow the date range, reduce the number of IDs, or split the request into multiple smaller queries. If possible, start with the JSON report endpoint to validate filter behavior before exporting. |
FAQ
Does Measurements V2 include PM metrics?
No. Measurements V2 is limited to Git-derived metrics. PM metrics such as Velocity and Investment Profile are not available via this API.
What time format should I use?
Use ISO 8601 date strings (for example, 2022-05-27) for after and before values in time_ranges.
Can I get both JSON and file export?
Yes. You can call /api/v2/measurements for JSON and /api/v2/measurements/export with the same body to generate a downloadable file.
What happens if I request too many IDs?
If limit is smaller than the number of contributors, teams, or repositories you request, the response may be partial. Always set limit to at least the number of IDs you pass.
For additional reference, see the public API documentation: Measurements V2 API reference.
For additional technical support, please contact LinearB support.