It’s customary nowadays to make a news story topical by linking it to the credit crunch. In this case, however, that’s something I’ll pick up later: my starting point lies elsewhere.
Through December 2008 the UK Parliament’s Public Administration Select Committee (PASC) was collecting evidence for a report it’s preparing on ‘Good Government’. When one of the expert witnesses the Committee had called was asked whether government would ever get to a situation where it genuinely became a learning organisation, and therefore ought to be able to develop and deliver policy more effectively, the witness’s response was that his ‘…experience from talking to people in Whitehall is that it suffers from organisational amnesia, not organisational learning’. He went on to point out that this was not simply a problem of Whitehall, but appeared to afflict other parts of government and public services too. Another expert added that he doubted whether this phenomenon was unique to government and (correctly) cited recent disasters in the banking sector as evidence to support this.
Houses of Parliament.
Even more damning for the Government, perhaps, was the observation, supported by a number of the experts, that despite New Labour’s commitment to evidence-based policy, and guidelines produced in 2000/01 by the Cabinet Office and National Audit Office (NAO) as to how this might be developed,
‘everybody quietly forgot about it, and as far as I’m aware, there has been no attempt to go back and evaluate at all whether or not that has been thoroughly implemented, which I do not think it has.’
Reluctance to evaluate policies, programmes and projects – particularly where IT is a central feature – is a well-recognised issue. A lack of evaluation necessarily creates problems for organisational learning, which in turn fuels, and probably even encourages, a tendency towards organisational amnesia. Where these three elements combine, it’s unsurprising that we end up with projects and programmes, and the policies that underpin them, that lack historical grounding and supporting evidence – whether from the evaluation of previous initiatives or from related developments elsewhere.
This situation is of particular relevance where IT is a central feature of an initiative, because it creates a catalyst for the well-known phenomenon of a ‘solution looking for a problem’. Or, to put it another way, organisational amnesia appears to give the vendors of IT systems the opportunity to exploit untested assumptions, bias, ignorance and self-interest, amongst other things, to sell inappropriate solutions to real or perceived problems.
Too often this then leads organisations to adopt technological solutions to what are primarily human-centred problems, with limited success; to replace existing mixed socio-technical systems that have worked well with untried IT systems that subsequently perform worse; or to implement IT systems that the vendor claims have generic usage, or can easily be adapted to different settings, when that claim is unproven or deliberately misleading. In addition, each of these scenarios often creates a requirement for ongoing support from vendors or other ‘experts’, usually at additional cost.
It’s at this point that I’ll introduce the credit crunch, because of its impact on what appears to be one of the most costly examples of a solution looking for a problem – the NHS’s Connecting for Health (CfH) programme. This massive investment in IT/IS – reportedly now costing more than £13 billion – has been well documented elsewhere (see Wikipedia, for example). However, since its inception in 2003, official recognition of the problems and cost of the programme has always been muted or non-existent. As 2008 came to a close things changed, with the Chief Executive of the NHS admitting to the Health Select Committee that a number of the core systems within the programme were not really fit for purpose. Consequently, with public funds tight due to the credit crunch (and the recession), some form of rethink of how the programme moves forward might be on the cards.
Given what I’ve said above, I’d suggest that one of the most important elements of any rethink is an independent evaluation of the various projects within CfH. From that we should be able to gain important insights into the causes of the failures and successes of the programme so far, promote learning within the NHS and other organisations associated with CfH, and, hopefully, begin to address the level of organisational amnesia that appears to have afflicted this initiative since its inception. Not to do so would be yet another example of how little interest there really is in evidence-based policy making.