Are your metrics and reporting telling you what you need to know, or what you want to hear?
Not long ago, as I was reviewing another Excel file containing row after row of “KPIs”, the playlist I had running in the background kicked off a personal favorite, which opens with the words “Lie to me, and tell me everything is alright….”
I found myself nodding as it occurred to me how fitting these lyrics from Jonny Lang’s platinum blues hit Lie to Me were for some of the items I was reviewing. I knew the time would soon arrive when I would be discussing them with those who created them, explaining why some of these definitions and calculations were not going to provide (or were not providing) what they were intended to. I knew exactly how to approach that conversation, however, since it is one I have had after virtually every such list I have ever reviewed, whether still in development or in production use for years.
Some of you are likely thinking “That can’t be right… how could that not be noticed?” But this is actually often part of the problem. It isn’t that these inaccurate metric or reporting elements go unnoticed, but rather that they are not recognized as inaccurate – and those are not the same thing at all.
The Wheels On the Bus…
By analogy, consider the tires on your vehicle. You notice them every day; they are right there when you approach the vehicle, after all. Noticing them does not, however, allow you to recognize whether one has a design defect or is low on air pressure (even dangerously low, with modern tires); you simply cannot always tell just by looking. You can travel a long way on an unsafe tire without being aware it is less safe or effective than it should be – until it fails. As a result, many vehicles now include sensors that alert us to conditions such as low tire pressure; but what if those sensors stop functioning, or, as bad or worse, are calibrated with improper thresholds, providing information that is comforting but not correct? Just because “the wheels on the bus go round and round” does not mean they are safe.
These same pitfalls apply to metrics and reports. Even with the best of intentions, they can be designed with errors, owing to a lack of clarity about actual requirements or data sources. They can produce conclusions that seem reasonable but do not represent actual conditions, because the wrong data was selected or an inappropriate tool was used to measure it. Their thresholds can be set too high or too low when ‘baselines’ are taken from something like a number in an SLA rather than from actual measurement, capability, and requirement. Another often overlooked need is regular adjustment to account for changing conditions and requirements (over time you have to put air in the tires, right?).
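To make the baseline pitfall concrete, here is a minimal sketch (all numbers invented) of the difference between a threshold copied from an SLA and one derived from what the service actually demonstrates it can do:

```python
import statistics

# Hypothetical response-time samples in milliseconds (invented for illustration).
samples = [212, 198, 230, 205, 250, 215, 199, 241, 208, 222]

# Threshold taken from a number in an SLA, not from measurement.
sla_threshold_ms = 500

# Threshold derived from the measured baseline: mean plus three standard deviations.
mean = statistics.mean(samples)
stdev = statistics.stdev(samples)
measured_threshold_ms = mean + 3 * stdev  # roughly 271 ms for this data

# A 450 ms response looks "fine" against the SLA number, yet it is far outside
# what this service normally delivers - a real degradation the SLA check hides.
observation = 450
print(observation <= sla_threshold_ms)       # True  (SLA check passes)
print(observation <= measured_threshold_ms)  # False (measured baseline fails)
```

The point is not the specific formula; a percentile would work as well. It is that the comforting threshold and the measured one disagree, and only one of them reflects reality.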
Notice I said metrics and reports – this is because I have often seen accurate metrics turned into inaccurate information by the way a report collects and processes the data being measured. This means your reports have to be designed and maintained with the same care as your measurements.
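As a sketch of how this can happen, consider two hypothetical ticket queues whose individual success rates are both perfectly accurate, yet a report that averages the averages misstates the overall picture (the queue names and counts are invented):

```python
# Two accurate per-queue metrics: queue A resolved 95 of 100 tickets,
# queue B resolved 1 of 2.
resolved = {"A": 95, "B": 1}
total = {"A": 100, "B": 2}

# Each per-queue rate is correct on its own: A = 0.95, B = 0.50.
rate = {q: resolved[q] / total[q] for q in total}

# A report that averages the averages claims a 72.5% success rate...
report_rate = sum(rate.values()) / len(rate)

# ...while the true overall rate, weighted by volume, is 96 / 102 (about 94.1%).
true_rate = sum(resolved.values()) / sum(total.values())
```

Both inputs were accurate; the report's processing step is what produced the misleading number.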
Anyone can hit a target…
By this time you are hopefully considering taking a fresh look at whatever metrics and reports you are designing, using or relying on – which is of course the point.
In the business space, “metrics” is really a generic label; what we are usually referring to goes by the popular name performance metrics, or sometimes statistics, but the first term is too long and the second too scary. Whichever term you use, however, they all encompass the same topic: measurement.
Obvious? One might think so. But time after time I have found that in practice many people do not think of a metric first (if at all) as a measurement, but as a target – a number they have to “hit” or bad things happen. The problem with thinking this way, however, is that given the right conditions (or wrong measurements), anyone can hit (or miss) a target. Failing to recognize this contributes to many of the mistakes I see in the use of metrics.
If you have ever had any training in measurement, you were likely introduced to the terms ‘precision’ and ‘accuracy’; these concepts appear in such training because so many people do not recognize the difference between the two. I will not cover that here, as a wealth of material on it already exists – if you have not seen it, I encourage you to search “accurate vs. precise” or similar, as it is valuable to understand. But that distinction only reinforces the tendency to view metrics as targets, and so it is not enough. Precision and accuracy are results; what I want you to consider is whether those results represent the truth or a (hopefully unintentional) untruth.
It has been a long time since I saw GIGO mentioned, but in the distant past (when I was first getting into IT) “Garbage In – Garbage Out” was omnipresent in IT discussions. That I have not seen or heard it mentioned for so long seems to support my belief that people have forgotten it is still as relevant now as it ever was.
I cannot tell you how many times I have reviewed a metric-supported report and found that it only appeared “accurate and precise” because the target was in the wrong position, or was larger or smaller than it was supposed to be, or was hit using the wrong type of “ammunition” (data / tool). This is why you must ensure your inputs – the data you use, the source of that data, and how you process it – are verified to reflect reality, not simply expectations.
Metrics (like statistics) are only as good as the data used to produce them – regardless of how you choose to collate them. Put another way: the needs of your business and your customers should guide you to the correct metrics to use – but no metric will help you if your data is inaccurate or is not the proper data to produce the view you require.
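One minimal way to put that into practice is to check a data feed before it is allowed to produce a metric at all. The `validate_samples` helper below is an invented illustration, not a standard API; the field names and thresholds are assumptions:

```python
def validate_samples(samples, expected_count, lo, hi):
    """Return a list of problems found in a data feed; an empty list means
    the feed may proceed to metric calculation."""
    problems = []
    if len(samples) != expected_count:
        problems.append(f"expected {expected_count} samples, got {len(samples)}")
    for i, s in enumerate(samples):
        if s is None:
            problems.append(f"sample {i} is missing")
        elif not (lo <= s <= hi):
            problems.append(f"sample {i} ({s}) outside plausible range [{lo}, {hi}]")
    return problems

# A feed with a gap and a physically impossible value: garbage in,
# so no metric should come out.
feed = [210, None, 215, -40, 230]
issues = validate_samples(feed, expected_count=5, lo=0, hi=10_000)
```

Refusing to compute a number at all is often better than publishing one built on a feed you know is broken.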
On ‘paper’, mainly virtual now to be sure, many organizations can appear to be functioning very well – or conversely, very poorly – when in fact the opposite may be true, simply because the data used to produce the measurements the organization relies on is incomplete, faulty, invalid for the intended purpose, or processed incorrectly. Every organization should take the time to review – on a regular basis – what they are measuring and how they are measuring it. Reality can be good, bad, or ugly – but it is what you need if you (and your customers) are to be properly informed.
Mark Twain is famously credited with popularizing the phrase “Lies, damned lies, and statistics”. It is worth considering that he could as well have said “and metrics / KPIs”….
Points to remember
I will close by offering a few items to keep in mind whenever metrics or reporting become a topic for you:
- Don’t assume you are looking at reality – validate it is real
- Don’t assume anyone can produce a good metric just because you present them with a ‘good’ KPI
- For every output you are concerned with, know every input that provides it
- GIGO is always true (except maybe in recycling)
- A poor report can destroy a great metric
- Don’t be content – change occurs every day, and you need to regularly ensure everything you are measuring is still being measured as it should be (or if it should still be measured at all)
It is all too easy to accept what we want to hear as reality; in business (and in general), you must not allow this tendency to bite you. In the end, it is up to you to ensure your organization is not unintentionally saying “Lie to me”. I am not saying that there are not times when we would rather hear a little lie to make us feel better – just that it should not happen in your metrics and reporting.