Forecasting and Metrics Newsletter by Troy Magennis
Six Dimensions of Team Performance

In this newsletter:

  1. Coming workshops and events you might be interested in.
  2. Article: Six Dimensions of Team Performance
  3. Myth of the Month: Our data isn't clean enough
  4. Tool of the Month: Team Dashboard spreadsheet
  5. About Focused Objective and Troy Magennis

Thanks for subscribing, and if you haven't yet, Subscribe here.
If you like this content, please forward it to a colleague. 
If you didn't like this content, please email me and tell me why: Troy Magennis.

Got Metric or Forecasting Questions? Contact Me

Coming Workshops and Events

San Francisco Forecasting + Metrics 6-7th February
Atlanta Forecasting + Metrics 25-26th February
Atlanta Flight Levels Architecture March 12th
Toronto Forecasting + Metrics 26-27th March
Early-bird (45 days) and group discounts available
Free monthly Zoom call: Ask Me Anything on Metrics & Forecasting
Training schedule: all upcoming workshops on Agile Forecasting and Metrics
In-house private workshops are popular - your premises, your data, our expertise

Article: Six Dimensions of Team Performance

Just increase team velocity. Just do ten times the work in half the time. These things alone won't guarantee success. Organizations need more than just "more, faster" to succeed. They need the right things; they need these things to work; they need to know they can continue to do those things. In short, it isn't as easy as measuring "more." 

Many problems occur because we view "good" or "bad" using a single measure. The real world is more complex, requiring people to solve dilemmas. The problem with individual performance metrics is that they obscure the price paid by improving that one measure. We need to help people see these hidden costs. 

This article lists six dimensions that compete with each other in flow-based systems aiming to deliver the highest value to customers. For each measurement dimension, challenge teams not only to improve one metric but also to watch for the warning signs of improving that metric too fast or too much. You can focus too much or too little on each dimension; it's about balance. For each dimension, I've listed some starting points for spotting over- and under-focus in a system when pushing to improve that metric. 

My tip for getting a balanced performance dashboard is to find one measure in each dimension. Don't worry about the perfect metric; worry about getting one metric in each dimension, even if it isn't perfectly captured or clean.

"Do Lots" - How Much/Many?

Measure how much raw work product is flowing - ideally not just through development but all the way to customers. This measure isn't about customer value; it is evidence that the system is moving items (has flow). It is also useful for forecasting future delivery when in balance with the other five dimensions.

Too little focus or capability

  • Dissatisfied customers or stakeholders not getting what they need. 
  • Demand > supply, but you don't know it.

Too much focus

  • Declining quality causing defects and re-dos (not "really" done yet)
  • Less valuable "easy" features delivered rather than most needed

Typical metrics in this category

  • Throughput - count of items or tickets per day/week/sprint
  • Velocity - sum story points per sprint
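As a minimal sketch (in Python, using hypothetical completion dates), throughput is just a count of finished items per period:

```python
from collections import Counter
from datetime import date

def weekly_throughput(completed_dates):
    """Count finished items per ISO (year, week) -- a simple throughput metric."""
    return Counter(d.isocalendar()[:2] for d in completed_dates)

# Hypothetical completion dates for illustration
done = [date(2020, 1, 6), date(2020, 1, 8), date(2020, 1, 15)]
print(weekly_throughput(done))  # 2 items in ISO week 2, 1 in week 3
```

The same count per sprint (instead of per week) gives a sprint throughput that can stand in for velocity when story points aren't used.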


"Do it Fast" - How fast?

Respond and deliver things quickly, given their complexity and novelty. The easiest way to improve this measure is to finish something in progress before starting something else.


Too little focus or capability

  • Customers frustrated by how long it takes to get changes

Too much focus

  • Declining quality causing defects and re-dos (not "really" done yet)
  • Less valuable "easy" features delivered rather than most needed

Typical metrics in this category

  • Time in State - the time an item spends within a "state," for example, "In Development"
  • Cycle time - the time from start to finish between some boundaries in your system
  • Lead time - the time from commitment to delivery (to the person the commitment was made to)
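Cycle time is simple to compute once start and finish dates are captured. A sketch in Python, assuming items arrive as hypothetical (start, finish) date pairs:

```python
from datetime import date

def cycle_times(items):
    """Cycle time in days: finish minus start for each (start, finish) pair."""
    return [(finish - start).days for start, finish in items]

# Hypothetical items with started and finished dates
work = [(date(2020, 2, 3), date(2020, 2, 10)),
        (date(2020, 2, 4), date(2020, 2, 6))]
print(cycle_times(work))  # [7, 2]
```

Lead time is the same subtraction with the commitment date substituted for the start date.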


"Do it Predictably" - How consistent is the delivery of value?

Delivery occurs at a consistent pace rather than feast or famine of delivered value to customers - for example, the variance of the "Do Lots" pace. This dimension helps surface shorter-term process instability (the sustainability dimension, coming up soon, measures longer-term system stability).

Too little focus or capability

  • Periods of rapid progress alternating with periods of little delivered value to customers.

Too much focus

  • Less risky "known" features delivered rather than most valuable or needed
  • Little incentive to push process improvements in case they cause a temporary decline

Typical metrics in this category

  • Variability of throughput or velocity
  • Variability of the delivered customer value
  • Net Process Flow: Things Delivered - Things Started. This measure shows balance through the system with variability represented as a higher or lower peak, with the desired state hovering around zero.
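Net Process Flow takes only a few lines to compute. A sketch in Python, with hypothetical weekly counts; values hovering near zero indicate a balanced system:

```python
def net_process_flow(started, finished):
    """Finished minus started per period; near zero means work enters
    and leaves the system at a balanced rate."""
    return [f - s for s, f in zip(started, finished)]

# Hypothetical weekly counts of started and finished items
print(net_process_flow([5, 6, 4], [4, 6, 5]))  # [-1, 0, 1]
```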

Tip: For a variability measure, consider using the coefficient of variation (Standard Deviation / Mean) rather than the Standard Deviation alone. Higher values naturally have a higher Standard Deviation for the same percentage change; dividing by the mean normalizes that.
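A sketch of that tip in Python: two hypothetical teams with very different averages but the same relative spread get the same coefficient of variation, even though their raw Standard Deviations differ tenfold:

```python
from statistics import mean, pstdev

def coefficient_of_variation(values):
    """Standard deviation divided by the mean, so variability is
    comparable across teams with different average throughput."""
    return pstdev(values) / mean(values)

low_volume = [10, 12, 8, 10]       # mean 10
high_volume = [100, 120, 80, 100]  # mean 100, same relative spread
print(coefficient_of_variation(low_volume))   # ~0.141
print(coefficient_of_variation(high_volume))  # ~0.141 -- same ratio
```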


"Do it Well" - How good was the quality versus expectations?

A measure of how well delivered items solve a problem or need. Often this measure is called Quality and is one of the hardest to get a handle on. The goal isn't purely quality; it serves as an early warning sign that a system is being pushed to deliver beyond its capability. 

Too little focus or capability

  • Rework. What is delivered needs to be corrected
  • Customer dissatisfaction. 
  • Production issues. 

Too much focus

  • Little or no delivery of value or flow of items due to "just a little more testing."
  • Slow feedback if the wrong thing is built (albeit perfectly functioning)

Typical metrics in this category

  • Escaped defects. Defects found outside of the development and delivery team
  • Customer satisfaction. Customers don't like what you built and tell you
  • Production rollbacks. Second and third releases to get a stable, working system
  • Unplanned downtime. Issues in production outside of planned change windows


"Do Valuable Stuff" - How valuable was it to the customer?

A measure of how much value customers derive from released features or projects. The goal isn't purely customer value; it serves as an early warning sign that a system is being pushed to focus on work output rather than an outcome. 

Too little focus or capability

  • Rework. What is delivered needs to be revisited to deliver "more" of this feature
  • Customer dissatisfaction. Internal feeling that work is flowing well, but the customers aren't feeling the value.

Too much focus

  • Increasing technical debt. Teams consistently skip technical debt reduction items for supposedly higher value items.
  • Lack of prioritization for strategic work that is mid to longer-term (current customers happy, but declining entry into new markets or targets).

Typical metrics in this category

  • Cost of delay. An economic view of the cost of NOT doing work to the customer and organization.
  • Alignment to strategy. Prioritized work allocation matches a planned strategic allocation
  • Customer satisfaction. Customer feedback confirms what was delivered solved a problem with high satisfaction.


"Keep Doing It" - How sustainable is the delivery system (and people)?

A measure of how likely the current performance of the development and delivery system is to continue in the future. Often called the "happiness" metric, but it's more important than that label suggests. When teams push hard on improving the other metrics, it sometimes takes a toll, causing a decline in the future. The goal of this metric is to be an early warning indicator of that gloomy future performance.

Too little focus or capability

  • The current performance measures aren't maintained.
  • The collapse of delivery.

Too much focus

  • Stagnant performance improvement over time. The other metrics stay flat.

Typical metrics in this category

  • Team health via survey or team retrospective (honest answers to "are we able to continue at this pace?")
  • The aggregate of the other five performance metrics (D1 to D5)

History and credit:

Larry Maccherone came up with the first four dimensions in his Software Development Performance Index (SDPI). He combined measures of Productivity, Predictability, Quality, and Performance to test pieces of Agile folklore (best team size, sprint length, co-location, for example) against 10,000 projects for his employer, Rally Software, with help from Carnegie Mellon's SEI. He noted that it was possible to do all four and suggested adding the Happiness metric, which is coded here as Sustainability. I added the Value metric based on the observation that we could deliver lots of un-useful stuff, and I wanted a balance against increasing velocity or throughput with no real customer impact (I'm sure others also noted that omission). I also took liberties renaming them after teaching them in training. I think the Agile community owes a massive debt of gratitude to Larry's work; I certainly do.

Myth of the Month - "Our data isn't clean enough"

"Our data isn't clean enough" - I hear this a lot when I set up a dashboard initially inside organizations. Its generally the case that the data is fine, but the psychological safety is degraded within the organization. I reword it, "your worried it might show a decline where there isn't any?" And they are. They can describe how data has been used to chastise them in the past. Here is my response -

1. I make the data anonymous (no team or individual names)
2. I remove the data and just leave the trend-lines. They quickly see that most teams follow a similar trend
3. I mention that until data is shown, there is little incentive for anyone to cleanly capture that data
4. I let them see that most problems are system problems, and even impure data, once trended, highlights the issues EVERYONE is facing.

Tool of the Month - Team Dashboard

From just Start Date, Completed Date, and Type data, it produces the four flow metrics and leaves space for you to create charts for Value and Sustainability. A spreadsheet with NO macros - all plain Excel formulas.
Download the Team Dashboard Spreadsheet (it's free)
See all of the free tools and resources

About Focused Objective and Troy Magennis
I offer training and consulting on Forecasting and Metrics related to Agile planning. Come along to a training workshop or schedule a call to discuss how a little bit of mathematical and data magic might improve your product delivery flow.
See all of my workshops and free tools on the Focused Objective website.

Got Metric or Forecasting Questions? Contact Me
Copyright © 2020 Focused Objective LLC, All rights reserved.

Want to change how you receive these emails?
You can update your preferences or unsubscribe from this list.