
Measuring What Matters: The Design Team Performance Index (DTPI)

Updated: Jul 30


Context


Over the years, I’ve sat in countless conversations about “high-performing” design teams.


But first—what exactly do we mean by performance in a design team?


It’s not just about how fast a team delivers screens or prototypes. It’s about how effectively design contributes to solving real problems, aligning with business goals, supporting cross-functional collaboration, and ultimately creating value for users.


Performance, in this context, is the combination of speed, quality, clarity, creativity, and impact. It’s not only about what gets done, but how it gets done—and how it’s received by the people it’s meant to serve.


Measuring design team performance isn’t about reducing creativity to numbers—it’s about creating alignment, enabling reflection, and building the kind of clarity that lets great teams thrive.


Performance matters, but performance without trust leads to toxicity. The best teams are not the ones with the highest individual performers—they're the ones with the highest levels of trust. - Simon Sinek

This reminds us that design success is rarely the result of solo brilliance. It’s the outcome of how individuals work together—how they listen, align, and build trust across roles. That’s why team performance—not just individual output—deserves to be measured, nurtured, and celebrated.

Everyone agrees that high-performing design teams matter—but ask what “high-performing” actually means, and the answers start to blur. We hear things like, “The team feels strong,” or “Stakeholders are happy.”


That’s great—but it’s also vague.


We rely too often on gut feel, vibes, and loosely held impressions. The truth is, we don’t really measure design team performance in a consistent or objective way. And that’s a problem—because, as the saying goes: “You can’t improve what you can’t measure.”


So I started wondering: Can we be more objective? More quantitative? Without stripping away the creativity, humanity, and intuition that make design work so special—can we also make its performance more visible?


That question led me to a simple idea: the Design Team Performance Index (DTPI). It’s a practical, customizable scoring model that helps track design team performance across five areas that truly matter.





The Five Pillars of Performance



From research, conversations, and real-world practice, these five dimensions consistently emerged as both measurable and meaningful indicators of what makes design teams effective, respected, and impactful.

  1. Efficiency: How fast and effectively does the team deliver? Design teams operate within broader product timelines, and speed without burnout is essential. According to the Design Management Institute, top design-led companies outperformed the S&P 500 by 219% over 10 years—execution speed is a key factor.

  2. Design Quality: Is the output usable, clear, and well-crafted? Good design is invisible, but bad design is painfully obvious. Nielsen Norman Group notes that usability is the cornerstone of good UX—and metrics like task success and error rates are valid proxies for it.

  3. Stakeholder Satisfaction: Do your cross-functional partners trust the design team? Great design work that lacks stakeholder support rarely sees the light of day. Adobe’s State of Create report found that 73% of companies that foster collaboration and trust in design see higher revenue growth.

  4. Innovation: Is the team pushing boundaries, testing new ideas, and evolving its craft? IDEO, Frog, and other leading design firms have long pointed out that innovation is not a luxury but a discipline. It’s measured not just in “big ideas” but in a culture of experimentation and learning.

  5. Business Impact: Does the work move the needle on key product or business outcomes? McKinsey’s Business Value of Design report showed that companies with strong design practices saw 32 percentage points higher revenue growth and 56 points higher shareholder returns—highlighting the direct link between design excellence and business success.


These five pillars serve as a balanced scorecard for evaluating not just output—but value, alignment, and growth. Together, they form the foundation of the DTPI.


Each of these is scored from 0 to 10, then combined into a weighted average that gives us a single number: your DTPI score.
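
In formula form: DTPI = (w₁ × s₁) + (w₂ × s₂) + (w₃ × s₃) + (w₄ × s₄) + (w₅ × s₅), where each sᵢ is a pillar’s 0–10 score and each wᵢ is its weight, with the five weights summing to 1.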


It's not meant to be perfect or rigid—it’s meant to start a better conversation.





How to Use the DTPI – A Step-by-Step Guide


Step 1: Set Your Intention

Clarify why you’re measuring. Are you trying to track progress? Spot gaps? Demonstrate impact? Measure the effects of new tools like AI? Be clear about the “why” before you pick any numbers.





Step 2: Choose Your Metrics

Pick one or two relevant, quantifiable indicators for each performance dimension; a simple way to record the selection is sketched after this list. For example:

  • Efficiency – Design throughput, time to delivery

  • Design Quality – SUS score, task success rate

  • Stakeholder Satisfaction – Post-project ratings from PMs/devs

  • Innovation – Number of concepts tested, design-led experiments

  • Business Impact – Conversion lift, retention change
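
If it helps to keep the selection explicit, a plain mapping is enough. A minimal sketch in Python; the metric labels are illustrative, not prescriptive:

```python
# A minimal sketch of a metric registry: 1-2 indicators per DTPI pillar.
# The metric labels are illustrative examples, not a fixed list.
metrics = {
    "Efficiency": ["design throughput", "time to delivery"],
    "Design Quality": ["SUS score", "task success rate"],
    "Stakeholder Satisfaction": ["post-project PM/dev rating"],
    "Innovation": ["concepts tested", "design-led experiments"],
    "Business Impact": ["conversion lift", "retention change"],
}
```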





Step 3: Gather the Data

Pull data from tools your team already uses—PM tools, usability tests, surveys, and analytics. Chances are, you’re already collecting some of it; a small example of consolidating an export follows this list. Use tools like:

  • Jira, Linear (efficiency)

  • Maze, Hotjar, surveys (quality)

  • Typeform, Google Forms (satisfaction)

  • Notion, Figma (innovation logs)

  • Analytics tools (business impact)
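
Most of these tools can export CSV, which keeps consolidation simple. Here’s a minimal sketch using only the Python standard library; the file name and the “rating” column are hypothetical placeholders:

```python
# A minimal sketch: average a numeric column from a CSV export
# (e.g., a stakeholder-satisfaction survey). The file name and
# column name are hypothetical placeholders.
import csv

def average_column(path: str, column: str) -> float:
    """Return the mean of a numeric column in a CSV file."""
    with open(path, newline="") as f:
        values = [float(row[column]) for row in csv.DictReader(f)]
    return sum(values) / len(values)

# e.g., a Typeform/Google Forms export with a 1-5 "rating" column
print(average_column("stakeholder_survey.csv", "rating"))  # e.g., 3.8
```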





Step 4: Normalize Scores (0–10 scale)

Convert each metric into a score between 0 and 10 using consistent benchmarks.

Normalization makes different metrics comparable on a common scale, but it’s important to revisit these scales periodically (one way to implement them is sketched below). Example:

  • SUS Score: 90+ = 10, 80 = 8, 70 = 6, etc.

  • Throughput: 12+ projects/month = 10, 10 = 8, and so on


Be transparent about how you assign scores.
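
One transparent way to implement the scale is piecewise-linear interpolation between published benchmark points. A minimal sketch, assuming the SUS benchmarks above; the function and thresholds are illustrative:

```python
# A minimal sketch of benchmark-based normalization via piecewise-linear
# interpolation. Benchmarks are (raw value, score) pairs sorted ascending
# by raw value; values outside the range are clamped to the end scores.
def normalize(value: float, benchmarks: list[tuple[float, float]]) -> float:
    if value <= benchmarks[0][0]:
        return benchmarks[0][1]
    if value >= benchmarks[-1][0]:
        return benchmarks[-1][1]
    for (lo_v, lo_s), (hi_v, hi_s) in zip(benchmarks, benchmarks[1:]):
        if lo_v <= value <= hi_v:
            fraction = (value - lo_v) / (hi_v - lo_v)
            return lo_s + fraction * (hi_s - lo_s)

# The SUS benchmarks from above: 70 -> 6, 80 -> 8, 90+ -> 10
sus_benchmarks = [(70.0, 6.0), (80.0, 8.0), (90.0, 10.0)]
print(normalize(78, sus_benchmarks))  # 7.6 (round as your team prefers)
```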





Step 5: Assign Weights

Decide how much each dimension contributes to the overall score, based on your team’s or organization’s priorities. For instance:

  • Efficiency – 25%

  • Quality – 25%

  • Satisfaction – 20%

  • Innovation – 15%

  • Impact – 15%




Step 6: Calculate the DTPI Score

Multiply each score by its weight and sum the results to get a single, trackable score out of 10. Here’s how it might look:

Dimension | Actual Metric | Normalized Score (0–10) | Weight | Weighted Score
Efficiency | Throughput: 11 designs/month | 8 | 0.25 | 2.00
Design Quality | SUS Score: 78 | 7 | 0.25 | 1.75
Stakeholder Satisfaction | Stakeholder survey avg: 3.8/5 | 6 | 0.20 | 1.20
Innovation | 2 new concepts tested | 5 | 0.15 | 0.75
Business Impact | Conversion uplift: 4% | 6 | 0.15 | 0.90
Total DTPI Score | | | | 6.60 / 10

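The same calculation in code. A minimal sketch that reproduces the table above; holding the numbers in dictionaries is just one convenient option:

```python
# A minimal sketch of the DTPI calculation: multiply each normalized
# score by its weight, then sum. Values match the example table above.
scores = {
    "Efficiency": 8,
    "Design Quality": 7,
    "Stakeholder Satisfaction": 6,
    "Innovation": 5,
    "Business Impact": 6,
}
weights = {
    "Efficiency": 0.25,
    "Design Quality": 0.25,
    "Stakeholder Satisfaction": 0.20,
    "Innovation": 0.15,
    "Business Impact": 0.15,
}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1

dtpi = sum(scores[d] * weights[d] for d in scores)
print(f"Total DTPI Score: {dtpi:.2f} / 10")  # Total DTPI Score: 6.60 / 10
```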




Step 7: Track Over Time

Record scores monthly or quarterly to observe trends and measure the impact of changes.

Create a dashboard using Google Sheets, Notion, Airtable, or Power BI. Here's an example of what a tracking sheet might look like over time:

Quarter | Efficiency | Quality | Satisfaction | Innovation | Impact | Total DTPI
Q1 2025 | 7 | 6 | 6 | 4 | 5 | 5.80
Q2 2025 | 8 | 7 | 6 | 5 | 5 | 6.45
Q3 2025 | 8 | 8 | 7 | 6 | 6 | 7.20
Q4 2025 | 7 | 7 | 7 | 5 | 7 | 6.70

Figure: Graph tracking DTPI over quarters
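
If you’d rather keep the history in code than in a spreadsheet, a dictionary and a loop are enough. A minimal sketch using the quarterly totals above:

```python
# A minimal sketch of trend tracking: store each period's total DTPI
# and print the quarter-over-quarter change. Values match the table above.
history = {
    "Q1 2025": 5.80,
    "Q2 2025": 6.45,
    "Q3 2025": 7.20,
    "Q4 2025": 6.70,
}

quarters = list(history)
for prev, curr in zip(quarters, quarters[1:]):
    delta = history[curr] - history[prev]
    print(f"{prev} -> {curr}: {delta:+.2f}")
# Q1 2025 -> Q2 2025: +0.65
# Q2 2025 -> Q3 2025: +0.75
# Q3 2025 -> Q4 2025: -0.50
```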




What to Revisit—and Why

To keep the DTPI relevant and useful, revisit the following elements regularly:

  • Step 2: Metric Selection – As your team’s focus shifts or new tools surface, some metrics may become less meaningful. For example, a metric useful during early-stage work might not apply to scaled or mature teams. Replace, update, or refine metrics to reflect what matters most now.

  • Step 3: Data Sources & Collection – Teams may adopt new tools, automate processes, or uncover more reliable data channels. Update your sources and ensure you’re gathering data in the easiest, most accurate way possible.

  • Step 4: Normalization Rules – What counts as a “10” in one quarter might become the new average in the next. As your team grows or improves its throughput or quality, reset benchmarks to ensure scores remain meaningful and motivating.

  • Step 5: Weighting – Organizational priorities change. If the focus shifts from speed to innovation, or from output to business impact, update the weights to reflect those shifts.


Revisit these quarterly or biannually, or anytime your team experiences a significant change in tools, org structure, or strategic priorities. Regular tuning ensures your performance model remains aligned with both where your team is—and where it's going.





A Trailing, Not Leading Metric


The DTPI is a trailing metric: it reflects how the team has been performing, not a lever you pull directly. It isn’t trying to make design mechanical. It’s not about spreadsheets for the sake of control. It’s about visibility. Alignment. Having a common language to talk about performance without losing the nuance of design craft.

It helps us:

  • Spot what's working and what’s not

  • Justify where we invest

  • Give designers clarity and feedback

  • Track changes when tools or processes evolve





Let’s Build It Together


This is a starting point, not a final answer. I’d love to know: How do you measure design team performance? What’s worked in your context? What would you add or tweak? Please share in the comments.


Let’s build something better—together.















ⓒ Rajib Ghosh. 2024 - 2025. All rights reserved.
