Measurements V2

LinearB customers on Business or Enterprise plans can export metrics data that has already been processed by the platform and use it to:

  • Create custom reports
  • Export to other data visualization tools

Limitations

The LinearB Measurements API is currently limited to Git metrics and AI Tools metrics. PM metrics such as Velocity, Investment Profile, and Time Distribution are not supported at this point.

Usage

The Measurements API operates in two modes:

  1. Integration - The request generates a report and returns it in JSON format in the response.

  2. Export - The request generates a report and drops it as a CSV or JSON file in an Amazon S3 bucket. A link to the file is returned in the response.

Info

If the provided request filters match no data, the system will not generate a report and will respond with status code 204 (No Content).

Create measurements report (JSON)

Use this endpoint to instantly obtain metrics data from LinearB. It can retrieve all available metrics, and for metrics that support it, you can opt for aggregation methods such as p75, p50, and average.

HTTP request

POST  https://public-api.linearb.io/api/v2/measurements
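A minimal client sketch for this endpoint, using only the Python standard library. The request body fields are described below; the `x-api-key` header name is an assumption about the auth scheme (adjust to your account's settings), and the 204 handling mirrors the Info note above.

```python
import json
import urllib.request

API_URL = "https://public-api.linearb.io/api/v2/measurements"

def build_payload(metrics, after, before, **options):
    """Assemble a request body. requested_metrics and time_ranges are the
    only required fields; everything else is passed through as-is."""
    payload = {
        # Bare strings become {"name": ...}; dicts (with "agg") pass through.
        "requested_metrics": [{"name": m} if isinstance(m, str) else m for m in metrics],
        "time_ranges": [{"after": after, "before": before}],
    }
    payload.update(options)  # e.g. group_by, roll_up, team_ids, ...
    return payload

def post_measurements(api_key, payload):
    """POST the report request and return the parsed JSON report.
    NOTE: the x-api-key header name is an assumption, not confirmed by these docs."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "x-api-key": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        if resp.status == 204:  # filters matched no data: no report generated
            return []
        return json.loads(resp.read())

payload = build_payload(
    [{"name": "branch.computed.cycle_time", "agg": "p75"}],
    "2026-01-01", "2026-02-01",
    group_by="organization", roll_up="1w",
)
```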

Body parameters

Required Fields

| Field | Type | Description |
| --- | --- | --- |
| requested_metrics | array | Minimum 1 metric required |
| time_ranges | array | Minimum 1 time range required |

Optional Fields

| Field | Type | Description |
| --- | --- | --- |
| group_by | string | See Group By Values below (default: organization) |
| roll_up | string | 1d, 1w, 1mo, custom (default: custom) |
| order_by | string | Must exist in requested_metrics |
| order_dir | string | asc (default) or desc |
| limit | integer | Pagination limit (> 0) |
| offset | integer | Pagination offset (>= 0) |
| return_no_data | boolean | Return nullable values instead of dropping |
| with_direct_children | boolean | Include direct children in team hierarchy |

Filtering Fields

| Field | Type | Constraints |
| --- | --- | --- |
| team_ids | int[] | 1–50 items (required if group_by=team) |
| contributor_ids | int[] | 1–50 items |
| repository_ids | int[] | 1–50 items |
| service_ids | int[] | 1–50 items |
| labels | string[] | Max 3 items |

AI Filters

| Field | Type | Description |
| --- | --- | --- |
| dev_tool_ids | int[] | Filter specific AI tool IDs |

Workflow Grouping

Additional group_by values:

| Value | Description |
| --- | --- |
| coding_assistant | Group by AI coding tools (returns dev_tool_id) |
| ai_review | Group by AI review tools |
| agentic_pr | Group by AI branch/PR creation tools |

When using these values, response groups are:

  • integer → dev_tool_id

  • “manual” → non-AI work

Workflow Filters

Filter which AI workflow tools and/or manual work to include.

{
  "workflows_filters": {
    "coding_assistant": {
      "values": [26],
      "all": false,
      "manual": false
    }
  }
}

Allowed Keys
  • coding_assistant

  • ai_review

  • agentic_pr


Sub-Fields
| Field | Type | Description |
| --- | --- | --- |
| values | int[] | Specific dev_tool_ids (min 1 item if provided) |
| all | boolean | Include all tools of that workflow type |
| manual | boolean | Include manual (non-AI) bucket |

Manual Bucket Behavior

The manual bucket represents work without AI involvement for that workflow type.

Manual bucket IS included when:
  • No workflows_filters specified

  • manual: true specified

Manual bucket is NOT included when:
  • workflows_filters present but manual is not true
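The inclusion rules above can be sketched as a small predicate. This is a hypothetical client-side helper that mirrors the documented behavior, not part of the API itself:

```python
def manual_bucket_included(body: dict, workflow: str) -> bool:
    """Return True when the manual (non-AI) bucket will appear for the given
    workflow type ("coding_assistant", "ai_review", or "agentic_pr").

    Documented rules: the bucket is included when no workflows_filters are
    present at all, or when the filter explicitly sets manual: true.
    """
    filters = body.get("workflows_filters")
    if not filters:
        return True  # no workflows_filters specified: manual is included
    workflow_filter = filters.get(workflow, {})
    # workflows_filters present but manual is not true: excluded
    return workflow_filter.get("manual") is True
```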

Cross-Workflow Filtering (Intersection Logic)

You may filter by a workflow type different from the one grouped by.

Example:

{
  "group_by": "ai_review",
  "workflows_filters": {
    "coding_assistant": { "values": [11] }
  }
}

Meaning:

  • Group by AI review tools

  • Only include branches that also used coding assistant tool 11

Filters act as an AND intersection.


Additional Group By Settings (Top N)

Limit returned workflow groups:

{
  "additional_group_by_settings": {
    "order_by": "commit.total.count",
    "max_items": 5
  }
}

Rules:

  • order_by must exist in requested_metrics

  • max_items must be greater than 0

  • Manual bucket (if included) is always appended


Time Ranges

"time_ranges": [
  { "after": "2026-01-01", "before": "2026-02-01" }
]

Rules:

  • Format must be yyyy-mm-dd

  • before must be greater than after

  • Multiple ranges allowed only when roll_up = "custom"
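A client-side validator sketch for these rules (a hypothetical helper; the API performs its own validation and returns 422 on violations):

```python
from datetime import date

def validate_time_ranges(time_ranges, roll_up="custom"):
    """Check the documented time_ranges rules before sending a request.
    Raises ValueError on the first violation."""
    if len(time_ranges) > 1 and roll_up != "custom":
        raise ValueError("multiple time ranges are only allowed when roll_up='custom'")
    for tr in time_ranges:
        after = date.fromisoformat(tr["after"])    # enforces yyyy-mm-dd format
        before = date.fromisoformat(tr["before"])
        if before <= after:
            raise ValueError(f"'before' ({tr['before']}) must be greater than 'after' ({tr['after']})")
```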

Supported Metrics

Branch Metrics

| Name | Aggregation | Description | Units |
| --- | --- | --- | --- |
| branch.time_to_pr | p75, p50, p90, avg | Coding Time - Time from first commit on a branch to PR creation | Minutes |
| branch.time_to_approve | p75, p50, p90, avg | Time from PR creation to first approval | Minutes |
| branch.time_to_merge | p75, p50, p90, avg | Time from approval to merge | Minutes |
| branch.time_to_review | p75, p50, p90, avg | Pickup Time - Time from PR creation to first review | Minutes |
| branch.review_time | p75, p50, p90, avg | Review Time - Total time spent in review | Minutes |
| branch.time_to_prod | p75, p50, p90, avg | Deploy Time - Time from merge to production deployment | Minutes |
| branch.computed.cycle_time | p75, p50, p90, avg | Full cycle time (coding + pickup + review + production) | Minutes |
| branch.cycle_time.sum_of_components | p75, p50, p90, avg | Sum of all cycle time stage components | Minutes |
| branch.state.computed.done | | Number of branches that reached done state | Count |
| branch.state.active | | Number of active branches | Count |
| branch.commit.coauthor | | Branches containing co-authored commits | Count |
| branch.stage.time_to_* | p75, p50, p90, avg | Custom stage duration metric (e.g., branch.stage.time_to_staging) | Minutes |

PR Metrics

| Name | Aggregation | Description | Units |
| --- | --- | --- | --- |
| pr.merged | | Number of merged PRs | Count |
| pr.merged.size | p75, p50, avg | Size of merged PRs | Lines of Code |
| pr.new | | Number of opened PRs | Count |
| pr.review_depth | | Total review comments divided by total PRs | Ratio |
| pr.reviews | | Total number of PR reviews | Count |
| pr.merged.without.review.count | | PRs merged without review | Count |
| pr.maturity_ratio | | Ratio of mature PRs based on internal maturity definition | Ratio |
| pr.reviewed | | PRs that received at least one review | Count |

Commit Metrics

| Name | Aggregation | Description | Units |
| --- | --- | --- | --- |
| commit.total.count | | Total number of commits | Count |
| commit.activity.new_work.count | | Total lines of newly added code | Lines of Code |
| commit.activity.refactor.count | | Total lines of code replaced that are older than 25 days | Lines of Code |
| commit.activity.rework.count | | Total lines replacing code written within the last 25 days, but outside this branch | Lines of Code |
| commit.total_changes | | Total lines added and deleted | Lines of Code |
| commit.activity_days | | Number of days with commit activity | Days |
| commit.involved.repos.count | | Number of repositories with commits | Count |
| commit.code_churn.rework | | Code churn from rework activity | Lines of Code |
| commit.code_churn.refactor | | Code churn from refactoring activity | Lines of Code |
| contributor.coding_days | | Number of days a contributor committed code | Days |

GitStream / AI Metrics

| Name | Aggregation | Description | Units |
| --- | --- | --- | --- |
| commit.total_count.gitstream.suggestion | | Commits generated with gitStream suggestions | Count |
| commit.total_changes.gitstream.suggestion | | Code changes from gitStream suggestions | Lines of Code |
| gitstream.ai.review.total.count | | Total AI reviews generated | Count |
| gitstream.ai.review.pr.count | | PRs that received AI review | Count |
| gitstream.ai.review.security_issues.prs.count | | PRs where AI detected security issues | Count |
| gitstream.ai.review.bugs.prs.count | | PRs where AI detected bugs | Count |
| gitstream.ai.review.performance_issues.prs.count | | PRs where AI detected performance issues | Count |
| gitstream.ai.review.readability_issues.prs.count | | PRs where AI detected readability issues | Count |
| gitstream.ai.review.maintainability_issues.prs.count | | PRs where AI detected maintainability issues | Count |

Other Metrics

| Name | Aggregation | Description | Units |
| --- | --- | --- | --- |
| releases.count | | Number of releases | Count |
| pm.mttr | avg | Mean time to repair | Minutes |
| pm.cfr.issues.done | | Number of issues considered incidents that reached a done state | Count |

AI Dev Tools – ID to Name Mapping

| ID | Name |
| --- | --- |
| 1 | aider |
| 2 | aikido |
| 3 | amazonq |
| 4 | atlassian_code_reviewer |
| 5 | automation |
| 6 | bito |
| 7 | build |
| 8 | changeset-bot |
| 9 | ci |
| 10 | circleci |
| 11 | claude |
| 12 | cloudflare-pages |
| 13 | codacy |
| 14 | codeant |
| 15 | codeclimate |
| 16 | codecov |
| 17 | codefactor |
| 18 | codegen |
| 19 | coderabbit |
| 20 | codex |
| 21 | codota |
| 22 | copilot |
| 23 | coveralls |
| 24 | cubic.dev |
| 25 | currents |
| 26 | cursor |
| 27 | cypress |
| 28 | deepsource |
| 29 | dependabot |
| 30 | deployment |
| 31 | devin-ai |
| 32 | distiller |
| 33 | ellipsis |
| 34 | factory.ai |
| 35 | fine |
| 36 | gas |
| 37 | gemini |
| 38 | gitguardian |
| 39 | github_bot |
| 40 | github-actions |
| 41 | github-code-scanning |
| 42 | gitlab_duo |
| 43 | gitlab_security |
| 45 | google_jules |
| 46 | graphite |
| 47 | greptile |
| 48 | hiyabot |
| 49 | jazzberry |
| 50 | jenkins |
| 51 | jetbrains_junie |
| 52 | jit |
| 53 | korbit |
| 54 | linear.app |
| 55 | linearb_automation |
| 56 | mend |
| 57 | mergify |
| 58 | meticulous |
| 59 | netlify |
| 60 | notion |
| 61 | nx-cloud |
| 62 | opencode |
| 63 | orca |
| 64 | qlty |
| 65 | qodana |
| 66 | qodo |
| 67 | renovate |
| 68 | replit |
| 69 | rovo |
| 70 | runner |
| 71 | runway |
| 72 | semantic-release-bot |
| 73 | semgrep |
| 74 | sentry |
| 75 | shortcut |
| 76 | smartling |
| 77 | snyk |
| 78 | sonarcloud |
| 79 | sourcegraph |
| 80 | sourcery |
| 81 | swarmia |
| 82 | sweep-ai |
| 83 | swimm |
| 84 | tabnine |
| 85 | tusk |
| 86 | typo-app |
| 87 | ubuntu |
| 88 | vercel |
| 89 | what-the-diff |
| 90 | windsurf |
| 91 | firefly-app |
| 92 | linearb-AI |

Export Measurements Report

HTTP Request

POST https://public-api.linearb.io/api/v2/measurements/export

Query Parameter

| Parameter | Values | Default |
| --- | --- | --- |
| file_format | csv, json | csv |

Optional body addition:

| Field | Type | Description |
| --- | --- | --- |
| beautified | boolean | Format CSV output for readability |

Export Response

{
  "report_url": "https://...",
  "detail": "You can download your file using the link. TTL: 2 days"
}
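A sketch of requesting an export and downloading the generated file, using only the Python standard library. The `x-api-key` header name is an assumption about the auth scheme; note the returned link expires (TTL: 2 days), so download promptly.

```python
import json
import urllib.request

def export_url(file_format: str = "csv") -> str:
    """Build the export endpoint URL; file_format is a query parameter (csv or json)."""
    if file_format not in ("csv", "json"):
        raise ValueError("file_format must be 'csv' or 'json'")
    return f"https://public-api.linearb.io/api/v2/measurements/export?file_format={file_format}"

def download_report(api_key: str, body: dict, file_format: str = "csv",
                    dest: str = "report.csv") -> str:
    """Request an export, then fetch the generated file from the returned link.
    NOTE: the x-api-key header name is an assumption, not confirmed by these docs."""
    req = urllib.request.Request(
        export_url(file_format),
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json", "x-api-key": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        report_url = json.loads(resp.read())["report_url"]
    urllib.request.urlretrieve(report_url, dest)  # fetch the file from S3
    return dest
```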

Examples

Organization-level cycle time (weekly)

{
  "group_by": "organization",
  "roll_up": "1w",
  "requested_metrics": [
    { "name": "branch.computed.cycle_time", "agg": "p75" },
    { "name": "branch.computed.cycle_time", "agg": "avg" }
  ],
  "time_ranges": [
    { "after": "2026-01-01", "before": "2026-02-01" }
  ]
}

Team comparison (monthly)

{
  "group_by": "team",
  "roll_up": "1mo",
  "team_ids": [5273, 58],
  "requested_metrics": [
    { "name": "branch.time_to_prod", "agg": "p50" },
    { "name": "branch.time_to_pr", "agg": "avg" }
  ],
  "time_ranges": [
    { "after": "2026-01-01", "before": "2026-02-01" }
  ]
}

Repository-level metrics (daily)

{
  "group_by": "repository",
  "roll_up": "1d",
  "repository_ids": [456801317, 1235235],
  "requested_metrics": [
    { "name": "branch.time_to_prod", "agg": "p50" }
  ],
  "time_ranges": [
    { "after": "2026-01-01", "before": "2026-02-01" }
  ]
}

Multiple time ranges (custom rollup)

{
  "group_by": "organization",
  "roll_up": "custom",
  "requested_metrics": [
    { "name": "branch.computed.cycle_time", "agg": "p75" },
    { "name": "releases.count" }
  ],
  "time_ranges": [
    { "after": "2026-01-01", "before": "2026-01-15" },
    { "after": "2026-01-15", "before": "2026-02-01" }
  ]
}

Coding assistant breakdown with all tools + manual

{
  "group_by": "coding_assistant",
  "roll_up": "1mo",
  "team_ids": [100],
  "workflows_filters": {
    "coding_assistant": { "all": true, "manual": true }
  },
  "requested_metrics": [
    { "name": "commit.total.count", "agg": "avg" }
  ],
  "time_ranges": [
    { "after": "2026-01-01", "before": "2026-02-01" }
  ]
}

Top 5 coding assistant tools

{
  "group_by": "coding_assistant",
  "roll_up": "1mo",
  "team_ids": [100],
  "additional_group_by_settings": {
    "order_by": "commit.total.count",
    "max_items": 5
  },
  "requested_metrics": [
    { "name": "commit.total.count", "agg": "avg" }
  ],
  "time_ranges": [
    { "after": "2026-01-01", "before": "2026-02-01" }
  ]
}

AI review tools for branches that used a specific coding assistant

{
  "group_by": "ai_review",
  "roll_up": "1mo",
  "team_ids": [100],
  "workflows_filters": {
    "coding_assistant": { "values": [11] }
  },
  "requested_metrics": [
    { "name": "pr.reviews", "agg": "avg" }
  ],
  "time_ranges": [
    { "after": "2026-01-01", "before": "2026-02-01" }
  ]
}

Ordered results with pagination

{
  "group_by": "contributor",
  "roll_up": "1mo",
  "team_ids": [100],
  "order_by": "commit.total.count",
  "order_dir": "desc",
  "limit": 10,
  "offset": 0,
  "requested_metrics": [
    { "name": "commit.total.count", "agg": "avg" }
  ],
  "time_ranges": [
    { "after": "2026-01-01", "before": "2026-02-01" }
  ]
}

Create measurements report with time ranges

{
  "group_by": "organization",
  "roll_up": "custom",
  "requested_metrics": [
    {
      "name": "branch.computed.cycle_time",
      "agg": "p75"
    },
    {
      "name": "releases.count"
    }
  ],
  "time_ranges": [
    {
      "after": "2022-05-27",
      "before": "2022-05-29"
    },
    {
      "after": "2022-05-30",
      "before": "2022-06-05"
    },
    {
      "after": "2022-06-06",
      "before": "2022-06-12"
    },
    {
      "after": "2022-06-13",
      "before": "2022-06-19"
    }
  ]
}

Create measurements report before and after

{
  "group_by": "organization",
  "roll_up": "1w",
  "requested_metrics": [
    {
      "name": "branch.computed.cycle_time",
      "agg": "p75"
    },
    {
      "name": "branch.computed.cycle_time",
      "agg": "avg"
    }
  ],
  "time_ranges": [
    {
      "after": "2022-05-27",
      "before": "2022-06-29"
    }
  ]
}

Create measurements report for specific repositories

{
  "group_by": "organization",
  "roll_up": "1d",
  "repository_ids": [
    456801317,
    1235235
  ],
  "requested_metrics": [
    {
      "name": "branch.time_to_prod",
      "agg": "p50"
    }
  ],
  "time_ranges": [
    {
      "after": "2022-05-27",
      "before": "2022-06-29"
    }
  ]
}

Create measurements report for specific teams

{
  "group_by": "team",
  "roll_up": "1mo",
  "team_ids": [
    5273,
    58
  ],
  "requested_metrics": [
    {
      "name": "branch.time_to_prod",
      "agg": "p50"
    },
    {
      "name": "branch.time_to_pr",
      "agg": "avg"
    }
  ],
  "time_ranges": [
    {
      "after": "2022-05-27",
      "before": "2023-06-29"
    }
  ]
}

Specific Coding Assistant Tool

{
  "group_by": "coding_assistant",
  "roll_up": "1w",
  "time_ranges": [
    {
      "after": "2025-12-01",
      "before": "2026-01-31"
    }
  ],
  "with_direct_children": true,
  "workflows_filters": {
    "coding_assistant": {
      "values": [26],
      "all": false,
      "manual": false
    }
  },
  "requested_metrics": [
    { "name": "pr.new" }
  ]
}

Behavior:

  • Groups by coding assistant
  • Includes only dev_tool_id = 26
  • Manual bucket excluded
  • Weekly aggregation

Responses

200 - Successful Response

[
  {
    "after": "2022-05-27",
    "before": "2022-05-29",
    "metrics": [
      {
        "organization_id": 1697464851,
        "branch.computed.cycle_time:p75": 2872,
        "releases.count": 8
      }
    ]
  },
  {
    "after": "2022-05-30",
    "before": "2022-06-05",
    "metrics": [
      {
        "organization_id": 1697464851,
        "branch.computed.cycle_time:p75": 8048,
        "releases.count": 35
      }
    ]
  },
  {
    "after": "2022-06-06",
    "before": "2022-06-12",
    "metrics": [
      {
        "organization_id": 1697464851,
        "branch.computed.cycle_time:p75": 45333,
        "releases.count": 39
      }
    ]
  },
  {
    "after": "2022-06-13",
    "before": "2022-06-19",
    "metrics": [
      {
        "organization_id": 1697464851,
        "branch.computed.cycle_time:p75": 2857,
        "releases.count": 34
      }
    ]
  }
]
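The 200 response nests one metrics list per time range. A small sketch flattening it into one row per (time range, group), which is convenient before loading into a spreadsheet or dataframe (the sample below is the first period from the response above):

```python
def flatten_report(report: list) -> list:
    """Flatten the nested report into flat row dicts, one per
    (time range, metrics group) pair."""
    rows = []
    for period in report:
        for group in period["metrics"]:
            # Merge the time-range bounds into each group's metric values.
            rows.append({"after": period["after"], "before": period["before"], **group})
    return rows

sample = [
    {"after": "2022-05-27", "before": "2022-05-29",
     "metrics": [{"organization_id": 1697464851,
                  "branch.computed.cycle_time:p75": 2872,
                  "releases.count": 8}]},
]
```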

400 - Bad Request

401 - Unauthorized

405 - Method Not Allowed

422 - Validation Error

500 - Internal Server Error