Scrum Master Guide: Team Benchmarks
Cross-team comparison for learning, not ranking
Team Benchmarks provide performance comparison data across teams. The purpose is learning, not competition: a team with lower velocity but higher code quality and fewer incidents may be healthier than a team shipping fast with high failure rates. Use benchmarks to start conversations about what's working and to identify where teams can learn from each other.
Metrics tracked:
- Velocity (story points)
- Cycle Time (days, lower is better)
- PR Review Time (hours, lower is better)
- Deployments / Week
- DevEx Score
All Teams: Benchmark Data
| Team | Velocity (pts) | Cycle Time (d) | PR Review (h) | Deploys/wk | DevEx |
|---|---|---|---|---|---|
| Platform | **62** | 3.2 | 6 | 4 | 78 |
| Mobile | 45 | 4.8 | 12 | 2 | 65 |
| Data | 38 | **2.1** | **4** | **6** | 72 |
| Frontend | 54 | 2.8 | 5 | 5 | **81** |

Bold marks the best value in each column.
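The "best in column" comparison behind the table is direction-aware: for cycle time and PR review time the lowest value wins, while for the other metrics the highest wins. A minimal sketch in Python, using the table's data (the dictionary layout and helper name are illustrative, not part of any tool):

```python
# Benchmark data from the table above, keyed by team.
BENCHMARKS = {
    "Platform": {"velocity": 62, "cycle_time": 3.2, "pr_review": 6, "deploys_wk": 4, "devex": 78},
    "Mobile":   {"velocity": 45, "cycle_time": 4.8, "pr_review": 12, "deploys_wk": 2, "devex": 65},
    "Data":     {"velocity": 38, "cycle_time": 2.1, "pr_review": 4, "deploys_wk": 6, "devex": 72},
    "Frontend": {"velocity": 54, "cycle_time": 2.8, "pr_review": 5, "deploys_wk": 5, "devex": 81},
}

# Metrics where a lower value is better.
LOWER_IS_BETTER = {"cycle_time", "pr_review"}

def best_team_per_metric(benchmarks):
    """Map each metric to its best-performing team, respecting metric direction."""
    metrics = next(iter(benchmarks.values())).keys()
    best = {}
    for metric in metrics:
        pick = min if metric in LOWER_IS_BETTER else max
        best[metric] = pick(benchmarks, key=lambda team: benchmarks[team][metric])
    return best

print(best_team_per_metric(BENCHMARKS))
# {'velocity': 'Platform', 'cycle_time': 'Data', 'pr_review': 'Data',
#  'deploys_wk': 'Data', 'devex': 'Frontend'}
```

This reproduces the bolded values in the table: Data leads three of the five metrics, which is what makes the Data-vs-Mobile deployment conversation in the next section a natural starting point.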
Facilitating Benchmark Conversations
Share one benchmark comparison per retro. Ask: "The Data team deploys 6x/week while Mobile deploys 2x — what can we learn from their process?" Avoid framing it as "Mobile needs to catch up."
Present benchmarks alongside context. High velocity combined with high cycle time may indicate large batches, which is not necessarily a problem but is worth understanding.
Use benchmarks to identify mentoring opportunities. A team excelling in PR review time can share their approach with teams where reviews are a bottleneck.