Evaluating evaluation

Why did evaluation get into ITIL?

In 1998 Pfizer’s share price doubled – all because of an unexpected side-effect in a drug being developed to treat heart problems. I was reminded of this when asked on a recent ITIL course to explain why on earth ITIL had Change Evaluation as something apparently separate from change testing, verification and review – they clearly thought that little lot should be enough to base decisions on.

Now, most people delivering an ITIL course can avoid that question by simply disassociating themselves from the eccentricities the ITIL authors have bequeathed them, and steering the course back to topics that are examinable. I couldn’t do that, for two reasons:

  • A decision that our internal courses should add real value by deliberately allowing time for discussions to stray slightly off the mainstream, so that we deliver education as well as examinable knowledge. This is the ‘sounds good, if somewhat pretentious’ reason
  • More prosaically, I was co-author of the Service Transition book that introduced Evaluation as a process, so can’t really duck the issue!

Actually, I was happy to discuss our intentions in what Evaluation should talk about, because I think there is an important idea underneath it: a focus on establishing the net actual value that a new or changed service would deliver. Like most ITSM processes that is something we are used to in everyday life, but somehow find difficult to sort out in a work context.

So, let’s think about real life first. Imagine you plan your holiday – two weeks relaxing by the pool in glorious sunshine on the coast, sipping martinis. What actually happens is that the resort is closed by political strife, nothing similar is available, and the travel company delivers you instead to a mountain village surrounded by good walking and historic castles, with great local wine. And they give you a 25% refund. After you get home you evaluate the value to you of the holiday – you are delighted, enjoyed it immensely, found you love things that you didn’t know you would – like walking and history. You liked wine before, but now you really like some kinds. But judge the holiday against your pre-determined criteria and it fails in every respect. That, to me, is the difference: change is mostly about ‘did we do what we expected?’ Evaluation is about ‘now we’ve done this, do we want it?’

Service management lesson from a little blue pill?

And so back to Pfizer: judge Viagra against the original intention and it was a disaster, judge it by the actual results and it made the company’s future safe.
What about ITSM and business environments, then: how common is it to judge the actual results, good and bad, of a new or changed service? It isn’t always easy to look at what is really happening when you have spent so long focused on what is supposed to happen. That really means that Evaluation, even more than testing, needs an objective and therefore non-designer/developer perspective. And if we are seeking actual business value then, logically, it requires a business perspective – and one capable of seeing overall cost and value, not just the lower-level behaviour of a service.

It probably also needs objective assessment from the Operational side of the house. Of course, these days that might be your house, a supplier’s house or just confirmation that your cloud supplier can make it work – in a house somewhere.

So, it is already getting to that level of complexity and effort that leads many organisations to dismiss it as ‘just too difficult’. Nice theoretical idea but not practical for us because looking for side-effects – good and bad – means a range of observation and involvement that would intimidate many an organisation. We can probably all think of an occasion where a new or changed service was popular, but not with the intended users or for the expected purpose. But the odds are that was not established formally or recorded within the project plan – that tends to stop, unfortunately, at go-live!

But I do think it is worth recognising the value behind the evaluation idea. Capturing serendipity and less happy accidents is a seriously useful input into future design and planning, and into other things like organisational structure, shared services and much more. Think back to the holiday example: that good experience will influence future holiday planning.

Pragmatism in all things

But like all ITIL processes, maybe we shouldn’t take things too literally or, dare I say, too seriously. Evaluation doesn’t rely on absolute precision to deliver value. Just like the other processes, even those that sound technical, like configuration and capacity, the judgement should be: ‘do they deliver more than they cost?’ So … this would also mean evaluating the evaluation process. ☺

Actually, I don’t think the spirit of judging real results needs to be linked to a formal ITSM process, but it is at its most interesting when things don’t go as expected. Viagra isn’t the only good example of unexpected benefits making a real difference. What I love about the principle, though, is that, almost uniquely, it is an idea that sits above, or outside, the strict parameters of service design – instead it allows reality and pragmatism a space in the planned universe that is theoretical ITIL, COBIT, ISO 20000 and the rest. A spill-over from a more Agile-oriented philosophy perhaps, or just a window for common sense and human variation to have its say?

Why do I feel this is so worth writing and thinking about? Well, if we really have a customer-oriented attitude (as we should!) then we are probably already doing this, formally or informally, as part of change or through SLM and BRM. If the idea of additionally judging results independently of formal expectations feels strange and somehow ‘wrong’ to you, then maybe you need to re-examine your degree of customer perspective, because the traditional IT – or, more generically, engineering – trap is too easy to fall into. Instead of just working hard to get better at delivering what we think is wanted, look at how things are actually used.

Intuition, not training, rules

These days, people don’t expect to need training, or to have to read instructions before they start using things. That is something that has crept over us in most parts of life. Just as one example, when did rental car companies stop showing you the car and start just giving you keys and a parking bay number? Time was, you needed to be shown each car; now, once you know how to drive, the variations, such as they are, are broadly intuitive. And now that everything is browser-based, most people expect to be able to get on with software and learn as they go.

So, we should expect more innovation in the use of services from people who are less likely than their parents to ask the question ‘what are you supposed to do?’ and expect an answer. Now it is more ‘what happens when you do this?’ This kind of experiential learning is here to stay, and not just with the software and services we use at home, but at work too. That means applying the ideas we learn at home. “Do I actually like this, now I have it and am used to it?” is a question to bring to work too.

