Using Team Iteration Summary in Pulse
The Team Iteration Summary in Pulse provides a concise, AI-generated overview of your team’s performance across a completed iteration. It highlights key accomplishments, delivery insights, and missed goals, combining real-time Git activity with project management data to generate clear metrics and actionable recommendations. This view helps teams quickly review completed tasks and achievements, identify areas for improvement, and share the summary and collect feedback to support continuous improvement.

Prerequisites
Before using the Team Iteration Summary feature, ensure the following:
- The Team must be connected to a Jira or Shortcut project board. This enables Pulse to map Git activity to PM issues and generate accurate iteration summaries.
- The Team board must be configured as a Scrum board. Pulse relies on Scrum board configurations to determine iteration boundaries and completion states.
- The Team Iteration Summary feature must be enabled.
- You must have the necessary permissions to view team data in the Pulse View.

Accessing the Team Iteration Summary
To access the Team Iteration Summary, follow these steps:
- Go to the Projects tab in the left-hand menu, and select Pulse.
- Select a past iteration from the top iteration dropdown (e.g., "25IT05 – Brasília").
- The Pulse View automatically switches to Retro View and shows the Delivery Breakdown View by default.


Delivery Breakdown View
The Delivery Breakdown View is the default panel displayed when opening the Team Iteration Summary in Pulse (in Retro View). This section presents a comprehensive breakdown of all the work completed, uncompleted, and added during the iteration — organized by category and linked directly to Git activity and PM issues.
Each line item provides visibility into the actual work done, not just what was planned. It helps teams assess whether they delivered what they committed to, spot unexpected work, and evaluate delivery consistency across contributors.
The Delivery Breakdown View displays the following details:
- Issue Groupings (illustrated in the sketch after this list):
  - Planned & Completed – PM issues that were scoped at the start of the iteration and successfully delivered.
  - Planned & Uncompleted – Issues that were scoped but not completed within the iteration.
  - Added Work & Completed – Issues that were not originally part of the iteration plan but were completed during the sprint.
  - Added Work & Uncompleted – Unplanned issues that were added mid-iteration but not completed.
- Assignee – The team member responsible for each task or issue.
- Story Points – Displays the assigned point value, helping track estimation accuracy and effort distribution.
- Issue ID & Title – Directly clickable, providing instant access to the full issue or pull request (e.g., via Jira or GitHub).
- PM System Icons – Shows whether the item originates from Jira or another PM system (e.g., Shortcut).
- Delivery Tags – Indicates if the item is a bug, story, task, or other issue type, based on metadata from your PM system.
- Filtering Options – The top of the panel allows filtering by issue type (Story, Task, Bug), helping users zero in on relevant work.
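The four groupings above come down to two questions about each issue: was it scoped when the iteration began, and was it completed before the iteration ended? The sketch below is a minimal illustration of that classification, using a hypothetical Issue type and illustrative dates; it is not LinearB's implementation.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Issue:
    key: str          # PM issue ID, e.g. a Jira key (hypothetical)
    added_on: date    # date the issue entered the iteration
    completed: bool   # completion state at the end of the iteration

def delivery_group(issue: Issue, iteration_start: date) -> str:
    """Classify an issue into one of the four Delivery Breakdown groupings."""
    planned = issue.added_on <= iteration_start   # scoped when the iteration began
    if planned and issue.completed:
        return "Planned & Completed"
    if planned:
        return "Planned & Uncompleted"
    if issue.completed:
        return "Added Work & Completed"
    return "Added Work & Uncompleted"

# Hypothetical issue added mid-iteration and finished before it ended
print(delivery_group(Issue("PROJ-101", date(2025, 3, 12), True), date(2025, 3, 3)))
# -> Added Work & Completed
```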


AI Iteration Summary View
The AI-generated Iteration Summary provides a structured, plain-language overview of your team’s sprint performance. It consolidates key delivery insights into a single panel, making it easy to share accomplishments, identify bottlenecks, and support retrospective discussions. Each section is tailored to highlight planning accuracy, delivery consistency, and workflow patterns—offering a holistic snapshot of how the iteration unfolded.

Accessing the AI Iteration Summary View
To access the AI Iteration Summary, click the View AI Iteration Summary button. This opens the summary panel that provides AI-generated insights based on the completed iteration.



Viewing the AI Iteration Summary
The AI Iteration Summary view is divided into structured segments that give both high-level and granular insight into your team’s delivery patterns:
Iteration Name and Dates
Located at the top of the panel, this shows the name of the selected iteration and its start/end dates.
Overview
A short narrative that summarizes the sprint’s delivery themes. This includes:
- The balance between planned and unplanned work.
- Capacity accuracy versus planning accuracy.
- Notable accomplishments or challenges that impacted delivery.
Example:
The team completed significant PM Connectors and Vision-related work with a Planning Accuracy of 33% and Capacity Accuracy of 64%. The combination of planned feature work and unplanned production fixes resulted in lower Planning Accuracy while maintaining reasonable capacity utilization.
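This article does not spell out how Planning Accuracy and Capacity Accuracy are calculated. The sketch below shows one plausible reading, treating Planning Accuracy as the share of planned story points that were delivered and Capacity Accuracy as the share of planned capacity consumed by all completed work (planned plus added); the formulas and story-point inputs are assumptions for illustration, not Pulse's actual definitions.

```python
def planning_accuracy(planned_completed_sp: float, planned_sp: float) -> float:
    """Share of the originally planned story points that were delivered."""
    return planned_completed_sp / planned_sp

def capacity_accuracy(total_completed_sp: float, planned_sp: float) -> float:
    """Share of the planned capacity consumed by all completed work,
    including unplanned items added mid-iteration."""
    return total_completed_sp / planned_sp

# Assumed inputs, chosen only to mirror the example figures above
planned_sp = 12.0            # story points scoped at iteration start
planned_completed_sp = 4.0   # planned points that were delivered
total_completed_sp = 7.7     # delivered points, planned plus added work

print(f"Planning Accuracy: {planning_accuracy(planned_completed_sp, planned_sp):.0%}")  # 33%
print(f"Capacity Accuracy: {capacity_accuracy(total_completed_sp, planned_sp):.0%}")    # 64%
```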
Key Accomplishments
Highlights up to 10 major deliverables, including:
- Successfully delivered features (with Story Point values).
- Unplanned improvements that added value mid-iteration.
- Infrastructure upgrades that supported team velocity.
Example:
- Delivered GitHubIssuesConnector functionality (1 SP)
- Implemented Azure PDT triggers (3 SP)
- Vision iteration summary monitoring (0.5 SP)
- Throttling Vision triggers (1 SP)
- TWA code analysis fixes (2 SP)
Planning Disruptions
Lists challenges that impacted predictability and scope, such as:
- Unplanned Work – Issues added mid-sprint and completed.
- Untracked Branches – Code pushed to branches without linked issues.
- Unexpected blockers or coordination issues.
Example:
10 items of unplanned work, including:
- GitHub Issues Client data collection (LINBEE-13097)
- TWA code change fixes (LINBEE-15316)
- Shortcut org failures in Databricks (LINBEE-15319)
- 10 untracked branches from 5 contributors
PR Flow Analysis
Analyzes the quality and flow of code review and merge activity. Includes:
- Pull requests with minimal review.
- Delays in merging.
- Large PRs merged without adequate vetting.
Example:
- PR pm-connectors/590: 460 lines merged in 83 minutes with only 2 minutes of review
- PR linta/1421: 138 lines, merged after 67 hours with a 1-minute review
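Flags like these can be derived from basic pull request metadata. The sketch below is an illustrative check with assumed thresholds and a hypothetical PullRequest type; it is not Pulse's actual analysis.

```python
from dataclasses import dataclass

@dataclass
class PullRequest:
    name: str
    lines_changed: int
    review_minutes: float        # total review time before merge
    open_to_merge_hours: float   # time from opening the PR to merging it

def pr_flow_flags(pr: PullRequest,
                  min_review_minutes: float = 10,
                  large_pr_lines: int = 400,
                  max_merge_hours: float = 48) -> list[str]:
    """Flag PRs with minimal review, long merge delays, or large changes
    merged without adequate vetting (thresholds are illustrative)."""
    flags = []
    if pr.review_minutes < min_review_minutes:
        flags.append("minimal review")
    if pr.open_to_merge_hours > max_merge_hours:
        flags.append("merge delay")
    if pr.lines_changed > large_pr_lines and pr.review_minutes < min_review_minutes:
        flags.append("large PR merged without adequate vetting")
    return flags

# The two PRs from the example above
print(pr_flow_flags(PullRequest("pm-connectors/590", 460, 2, 83 / 60)))
print(pr_flow_flags(PullRequest("linta/1421", 138, 1, 67)))
```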
AI-Generated Recommendations
Actionable guidance designed to improve planning and execution in future sprints:
- Reduce scope to increase delivery predictability.
- Set up automation to ensure all PRs are linked to a tracked issue.
- Identify focus areas like planning accuracy or untracked work for team retros.
Example:
- Reduce planned scope by 20% (approx. 4 SP) to improve forecast accuracy.
- Automate PR tracking to flag unlinked branches and reduce shadow work.
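The second recommendation can be approximated with a lightweight CI check. The sketch below is a minimal example that fails a pull request whose branch name and title contain no issue key; the LINBEE-style key pattern and the way the script is wired into CI are assumptions, not a built-in LinearB feature.

```python
import re
import sys

# Assumed issue-key pattern, matching IDs like LINBEE-13097
ISSUE_KEY = re.compile(r"\b[A-Z][A-Z0-9]+-\d+\b")

def is_linked(branch_name: str, pr_title: str) -> bool:
    """Return True if the branch name or PR title references a tracked issue."""
    return bool(ISSUE_KEY.search(branch_name) or ISSUE_KEY.search(pr_title))

if __name__ == "__main__":
    branch, title = sys.argv[1], sys.argv[2]
    if not is_linked(branch, title):
        print(f"Branch '{branch}' has no linked issue; add an issue key to the branch or PR title.")
        sys.exit(1)  # fail the check so the PR is flagged before merge
```

Run it from a pull-request workflow (passing the head branch and PR title as arguments) so untracked branches are surfaced before they become shadow work.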

Sharing the Iteration Summary
The iteration summary can be shared in two formats:
- Deep Link (Access Restricted by Role-Based Permissions) – Generates a shareable link for team members with the required permissions.
- Markdown Text – Copies the summary text directly to the clipboard for quick sharing or documentation purposes.

Providing Feedback
At the bottom of the summary panel, users can provide feedback to help improve the AI-generated summaries:
- 👍 / 👎 Like/Dislike buttons – Your feedback is recorded along with the iteration details, team information, and insights, and sent to LinearB to help improve the AI engine.

Related Links
Strategies to Identify and Reduce High Pickup Time
What do I do if my review time is high?