A Cautionary Tale
Years ago I worked for an established Fortune 500 company. At the beginning of each year, executives set goals by which they would be evaluated. These objectives were numerical and SMART: Specific, Measurable, Achievable, Relevant, and Time-Boxed. To make sure no one did long-term damage in order to meet a single goal, the company required at least two goals, a "balanced scorecard." They were also somewhat enlightened in that, within ethical limits, how you met the goal mattered much less than whether or not you met it. The company was split into independent sub-units, each with its own profit and loss statement, a wonderful source of hard metrics.
If you think about it, the way that company was run, metrics and all, is very similar to one particular point of view for IS Management. The argument goes that if we could only take those ideas and adapt them to software engineering, all would be salt and light.
To answer that argument, I would like to tell you a story.
In the late 1990s, one of those independent units had a Vice President of Sales, whom I will call Joe. Well-deserving of the job, the man was seriously brilliant. Taking over from the last sales VP, Joe reorganized the way sales was done, focusing on selling products that cost less to produce and sold for more. He also expanded the client base, selling into markets that saw more value in the product or had deeper pockets, thus making sales easier.
By the tenth month of the fiscal year, Joe's sales team had booked more profit than the "exceeds expectations" goal set at the beginning of the year. Why, with the incentives already earned and nothing more to gain, the team would be just as well off taking two months off and starting again in January, right?
Well, of course not. The company could always use more money, and besides, Joe was measured with a balanced scorecard. Because he sold more profitable items, he had met his profit target but not his gross sales target. Unless he could hit the gross sales goal, there would be no bonus and no big raise. Yet late in the year most customers had already spent their available budgets; very little remained.
Dilemma. What’s an ambitious business genius to do?
You probably guessed it: Joe had his team go back to the old products and sell them at a loss in order to hit the gross sales goal. This is dysfunction, hitting all the metrics but missing the point. As the anonymous philosopher once said, "Be careful what you measure, because you are going to get it."
In this story we have an established, mature organization trying to get the metrics right and do the right metrics. They used established accounting and business administration principles, which, compared to software engineering, seem wise and time-tested. Why, Mark Twain popularized the remark that there are three good ways to mislead: lies, damned lies, and statistics. If metrics dysfunction can find its way into a mature field like business administration, we must realize it is a very real risk for software engineering.
Playing "pick a number" at the beginning of the year and "managing to the numbers" may be easy, but that doesn't make it right. Numbers can provide information or evidence to help lead to a conclusion, but without the context, we're likely to make a mistake. We might abuse the metrics, like the example above, or misinterpret them, like the new-defects trend line that seems to shrink right around spring break.
Twenty years ago, Tom DeMarco wrote that "You can't control what you can't measure." Jorge Amanda points out on his blog [1] that without measurements it would be very hard to control weight loss, blood pressure, or cholesterol. Yet he also asks this question:
When was the last time you measured the length of your hair? So how can you control the length of your hair?
Now take that observation, add a little general systems thinking, apply it to software, and I have two wonderful words to describe the result:
Management and Leadership.
References:
1. http://catenary.wordpress.com/2007/01/11/controlling-what-you-cant-measure/