
AI Tool Detection and Workflow Mapping


Updated by Steven Silverstone

LinearB detects AI-assisted development activity and classifies it into workflow categories to enable reporting, filtering, and analysis.

This article explains how AI tools are identified, how they are grouped into workflows, and how this impacts metrics and reporting across LinearB.

Important:
AI detection in LinearB does not rely solely on direct integrations. Detection is based on a combination of signals, internal mappings, and workflow classification.

How AI tools are detected

LinearB identifies AI tool usage through multiple mechanisms, including:

  • Git activity signals (commit metadata, PR patterns, co-authors, and comments)
  • Tool-specific integrations where available
  • Internal classification and mapping of development tools

These signals are combined to determine whether a change involved AI assistance and which tool was likely used.


Workflow categories

Detected AI activity is grouped into workflow categories to enable consistent reporting and filtering.

Coding Assistant (coding_assistant)

Tools that assist developers in writing or generating code.

  • Examples: Copilot, Cursor, Claude, Codex, Tabnine
  • Used for: code generation, inline suggestions, assisted development

AI Review (ai_review)

Tools that analyze pull requests and provide feedback, comments, or issue detection.

  • Examples: CodeRabbit, Codacy, DeepSource, SonarCloud
  • Used for: PR analysis, bug detection, performance and security insights

Agentic PR / Branch Creation (agentic_pr)

Tools that autonomously generate changes, branches, or pull requests.

  • Examples: Devin, Sweep AI, automation bots
  • Used for: automated PR creation, refactoring, workflow-driven changes

Manual (non-AI) work

LinearB also tracks work that does not involve AI tools.

  • Represented as a “manual” bucket in reporting
  • Used for comparison against AI-assisted activity

Manual work is included by default unless explicitly filtered out.


How workflow mapping affects reporting

Workflow classification is used across multiple areas of LinearB:

  • AI Insights: Analyze adoption and usage patterns by workflow type
  • Filtering: Filter metrics by coding assistants, review tools, or agentic tools
  • Grouping: Group results by AI workflow category or specific tools
  • API usage: Query metrics using workflow-based grouping and filters

Cross-workflow filtering

LinearB supports filtering across workflow types using intersection (AND) logic: a record must match every selected filter to be included.

Example:

  • Group by AI review tools
  • Filter to include only work that also used a specific coding assistant

This allows you to answer questions such as:

  • Which AI review tools are used alongside a specific coding assistant?
  • How does AI-assisted coding impact review behavior?
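The intersection logic above can be sketched over a toy set of PR records: group by AI review tool, but count only PRs that also used a given coding assistant. The PR data shape and tool sets here are illustrative assumptions, not LinearB's data model.

```python
# Toy PR records; "tools" is the set of AI tools detected on each PR.
prs = [
    {"id": 101, "tools": {"Copilot", "CodeRabbit"}},
    {"id": 102, "tools": {"Cursor", "CodeRabbit"}},
    {"id": 103, "tools": {"Copilot", "SonarCloud"}},
    {"id": 104, "tools": {"CodeRabbit"}},
]

REVIEW_TOOLS = {"CodeRabbit", "SonarCloud"}

def review_tools_used_with(coding_assistant: str) -> dict[str, int]:
    """Count AI review tools on PRs that also used the given coding assistant."""
    counts: dict[str, int] = {}
    for pr in prs:
        if coding_assistant in pr["tools"]:  # intersection condition
            for tool in pr["tools"] & REVIEW_TOOLS:
                counts[tool] = counts.get(tool, 0) + 1
    return counts

print(review_tools_used_with("Copilot"))  # {'CodeRabbit': 1, 'SonarCloud': 1}
```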

Tool identification

Each detected tool is mapped internally to a unique identifier (dev_tool_id).

These identifiers are used in:

  • Measurements API queries
  • Workflow filters
  • Exported reports
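As a rough illustration of how a dev_tool_id might appear in a query, the payload below sketches a measurements-style request body. The field names, grouping keys, and identifier values are placeholders and do not reflect LinearB's documented API schema; consult the Measurements API reference for the actual format.

```python
import json

# Hypothetical request payload; every key and value here is a placeholder.
payload = {
    "group_by": "ai_workflow",           # e.g. coding_assistant / ai_review / agentic_pr
    "filters": {
        "dev_tool_id": ["tool-123"],     # placeholder tool identifier
        "workflow": ["coding_assistant"],
    },
    "metrics": ["pr_count"],
}

body = json.dumps(payload)
print(body)
```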

Limitations

  • Detection is based on available signals and may not capture all AI usage
  • Some tools may be grouped under internal or generic classifications
  • Not all detected tools are configurable within LinearB
  • AI detection coverage may evolve over time as new tools emerge
