Intro
Most engineers resist metrics not because they fear accountability, but because they fear judgment without context. To build trust, you must demonstrate in practice which side of the line you are on: surveillance controls people; measurement improves systems. This guide covers how to write a "metrics charter," choose DORA signals over activity counts, and create safety through transparency.
1. Define the Boundary
"Surveillance" isn't about collecting data; it's about shifting the power dynamic so people optimize for not getting in trouble.
Research shows that monitoring without explicit purpose increases stress without improving performance.
The Core Rule:
Metrics should create learning loops, not stress loops.
Good Practice:
Write a "Metrics Charter" that defines purpose, scope, access, and prohibitions. If you can't publish it to the team, you are running a surveillance program, not a measurement program.
2. Decide What You Are Optimizing For
Many programs fail because they start with instrumentation and only ask afterward what the data means. You need an explicit answer to: "What decision will this metric change?"
If the answer is vague ("better performance"), you are building surveillance with nicer language.
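One way to enforce the decision test is to refuse to register any metric that arrives without a specific decision attached. A minimal sketch, assuming a hypothetical in-house registry (all names here are illustrative):

```python
# Answers that signal a metric has no decision behind it.
VAGUE_ANSWERS = {"better performance", "visibility", "accountability"}

registry: dict[str, str] = {}  # metric name -> decision it informs

def register_metric(name: str, decision: str) -> None:
    """Admit a metric only if it names the decision it will change."""
    if decision.strip().lower() in VAGUE_ANSWERS:
        raise ValueError(f"'{name}': name a specific decision, not '{decision}'")
    registry[name] = decision

register_metric("review turnaround", "whether to shrink batch size or add reviewers")
```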
DORA research shows that a generative culture correlates with 30% higher organizational performance.
The Trap:
If your measurement increases fear, you might improve a local metric while degrading the system that produces performance. Define success as a balance of throughput and stability, not just speed.
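To make "throughput and stability" concrete, report a speed signal and a stability signal as a pair so neither is read alone. A minimal sketch over a hypothetical deploy log; the record shape is an assumption about your pipeline:

```python
from datetime import date

# Each record: (deploy date, whether it caused an incident or rollback).
deploys = [
    (date(2024, 5, 1), False),
    (date(2024, 5, 2), True),
    (date(2024, 5, 6), False),
    (date(2024, 5, 8), False),
]

weeks = max((deploys[-1][0] - deploys[0][0]).days / 7, 1)
deploy_frequency = len(deploys) / weeks                                    # throughput
change_failure_rate = sum(failed for _, failed in deploys) / len(deploys) # stability

# Report the pair together: speed gained by shipping broken changes
# shows up immediately in the stability column.
print(f"{deploy_frequency:.1f} deploys/week, {change_failure_rate:.0%} change failure rate")
```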
3. Measure System Throughput
If you want measurement without cultural damage, measure the system, not the person.
Use throughput metrics that describe flow: tasks completed, reviews finished, and average cycle time. These signals tell you if work is moving smoothly from intake to release, not whether a specific human is "working hard."
Why this works:
- Individual output is hard to interpret in collaborative work.
- GitLab, for example, tracks Merge Request (MR) Rate as a team-level flow signal, not a surveillance tool.
The Fix:
Prefer metrics that locate bottlenecks: queue time, review turnaround, and batch size. Use them to ask "Where is work getting stuck?" rather than "Who is slow?"
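Bottleneck metrics fall out of timestamps you already collect. A minimal sketch computing queue time and cycle time from hypothetical work-item events; the field names are assumptions about your tracker, not a specific API:

```python
from datetime import datetime, timedelta

# Hypothetical per-item timestamps from an issue tracker or code host.
items = [
    {"opened": datetime(2024, 5, 1, 9),  "review_requested": datetime(2024, 5, 1, 15),
     "first_review": datetime(2024, 5, 3, 10), "merged": datetime(2024, 5, 3, 16)},
    {"opened": datetime(2024, 5, 2, 11), "review_requested": datetime(2024, 5, 2, 12),
     "first_review": datetime(2024, 5, 2, 17), "merged": datetime(2024, 5, 4, 9)},
]

def avg(deltas: list[timedelta]) -> timedelta:
    return sum(deltas, timedelta()) / len(deltas)

queue_time = avg([i["first_review"] - i["review_requested"] for i in items])
cycle_time = avg([i["merged"] - i["opened"] for i in items])

# These locate delays in the system ("reviews sit in queue"), not in a person.
print(f"avg queue time: {queue_time}, avg cycle time: {cycle_time}")
```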
4. Make Metrics Safe
Transparency is the antidote to fear. Research shows that transparent monitoring is associated with more positive worker attitudes.
The Safety Protocol:
- Transparency: Teams know what is measured and who sees it.
- Purpose: Every metric is attached to a learning loop.
- Non-Punitive: Metrics are not shortcuts for performance reviews.
Remember: Surveillance controls people; measurement improves systems.
Publish a "What We Won't Do" list: no leaderboards, no compensation ties, and no surprise changes.
5. Run the Operating Loop
Metrics create value only when they change behavior. If you collect data but never change a decision, you have built a monitoring system, not a measurement program.
The Loop:
- Review: Look at the data (e.g., code review speed).
- Investigate: Why is it slow? (Batch size? Unclear ownership?)
- Act: Change the working agreement or tooling.
- Repeat: See if the metric improves.
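The loop is mechanical enough to script. A minimal sketch, assuming a weekly snapshot function you would implement against your own tooling (all names here are hypothetical):

```python
# Hypothetical weekly snapshot: metric name -> value in hours.
def weekly_snapshot() -> dict[str, float]:
    return {"review_turnaround_h": 30.0, "queue_time_h": 22.0}

# Working agreements the team has set for itself.
TARGETS = {"review_turnaround_h": 24.0, "queue_time_h": 16.0}

def review_step(snapshot: dict[str, float]) -> list[str]:
    """Return the metrics that need an 'Investigate' conversation this week."""
    return [name for name, value in snapshot.items() if value > TARGETS[name]]

# Review -> Investigate: the output is a meeting agenda, not a verdict.
for metric in review_step(weekly_snapshot()):
    print(f"Investigate: why is {metric} above target? Batch size? Ownership?")
```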
Teams that connect metrics to resourcing decisions (staffing, tooling) build credibility because measurement leads to help.
6. Watch for Goodhart's Law
When a measure becomes a target, it stops being a good measure.
The Failure Modes:
- Proxy Worship: Treating PR count as the goal.
- Gaming: Splitting work to inflate numbers.
- Individualization: Drifting from team metrics to individual judgment.
Treat metrics as a multi-signal dashboard where no single number "wins." Require a narrative explanation for changes.
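One structural defense is a dashboard that refuses single-number readings: a metric renders only alongside its counter-signals and a narrative. A minimal sketch; the signal pairings are illustrative assumptions:

```python
# Each headline metric must be shown with signals that would expose gaming it.
COUNTER_SIGNALS = {
    "pr_count": ["change_failure_rate", "avg_pr_size"],
    "cycle_time": ["rework_rate"],
}

def report(metric: str, values: dict[str, float], narrative: str) -> str:
    """Render a metric only alongside its counter-signals and an explanation."""
    missing = [s for s in COUNTER_SIGNALS[metric] if s not in values]
    if missing or not narrative.strip():
        raise ValueError(f"refusing single-number report; missing {missing or 'narrative'}")
    signals = ", ".join(f"{k}={v}" for k, v in values.items())
    return f"{metric}: {signals} -- {narrative}"

print(report("pr_count",
             {"pr_count": 42, "change_failure_rate": 0.05, "avg_pr_size": 180},
             "PR count rose after we split a migration into reviewable steps."))
```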
Closing Thoughts
The difference between measurement and surveillance is intent and implementation.
If your metrics help teams understand constraints, they are tools. If they are used to judge individuals without context, they are weapons.
Surveillance controls people; measurement improves systems.
Do This Next: The Metric Safety Checklist
Audit your metrics program against these four items.
- The Charter Check: Have I published a document listing exactly what we will not measure?
- The Aggregation Rule: Are all metrics grouped by Team/System? (If "Person," delete it; see the sketch after this checklist.)
- The Decision Test: For every metric on the dashboard, can I name the specific decision it informs?
- The Access Audit: Is the dashboard visible to the team, or just to management? (Make it visible).
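For the Aggregation Rule, you can lint dashboard definitions so person-level groupings never ship. A minimal sketch over a hypothetical dashboard config:

```python
# Hypothetical dashboard definitions: metric -> the dimension it is grouped by.
dashboards = {
    "cycle_time": "team",
    "review_turnaround": "team",
    "commit_count": "person",  # violates the aggregation rule
}

violations = [metric for metric, dim in dashboards.items() if dim == "person"]
for metric in violations:
    print(f"Delete or re-aggregate: {metric} is grouped by person")
```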