The England football team are subject to ‘blame and punish’ as a widespread attitude to error on the football field – but just how helpful is this in helping any team to perform better in the future? If it doesn’t work in sport, why do the managers of doctors believe it will work in the clinic? Consultant Psychiatrist Dr Raj Persaud explores the thorny issue of what to do about underperformance, both on and off the substitutes’ bench.
Why physicians fumble the ball – and should they be shown the red card?
by Dr Raj Persaud, Consultant Psychiatrist
In the field of ‘error analysis’, the opprobrium handed out by the nation to England goalkeeper Robert Green, after his World Cup fumble against the USA, follows a well-known approach to human fallibility widely referred to as ‘blame and punish’. ‘Blame and punish’ is popular not just amongst tabloid newspapers; it also characterises the approach to medical error in our Health Service.
‘Blame and punish’ requires that any clinical boob must always be pursued for a scapegoat – sorry, culprit. This is almost always a doctor. The assumptions are that errors are rare, exceptional and always avoidable. Were it not for the feckless few, so the manager’s rant goes, we would glory in an all-conquering, fault-free medical team. ‘Blame and punish’ is credited with producing competitive footballers, and doctors who wouldn’t be left on the substitutes’ bench. But ‘blame and punish’ may ultimately be depriving us of better performance. A classic own goal!
A litany of past World Cup soccer blunders, along with high rates of clinical error, raises the inconvenient question of whether deploying ‘blame and punish’ pragmatically advances our cause. It may cathartically release mob frustrations, but is there any evidence base for ‘blame and punish’ being genuinely effective in the long-term elimination, or even reduction, of clinical or sporting slip-ups? All it appears to succeed at is driving error underground. Doctors simply get better at burying their mistakes, and footballers’ ‘mea culpa’ post-cock-up press conferences get slicker.
Taking the lead from the aviation industry
The British Journal of Medical and Surgical Urology reports that about 10% of hospitalised patients suffer an adverse event as a result of the care that is delivered to them. William McIlvaine, an American anaesthetist, in a paper entitled ‘Human error and its impact on anesthesiology’ claims that medical errors kill more people than automobiles. Reported error amongst healthcare professionals is not some kind of isolated, rare event.
The aviation industry is deeply interested in understanding why well-trained and conscientious professionals relentlessly continue to commit howlers, despite all the best efforts of technology, ergonomics and back-up systems to prevent them. Key Dismukes, Head of Human Factors Research at NASA, points out that other professions, such as medicine, in which error can have calamitous consequences, might have much to learn from the way thinking about human error has evolved within aviation. More resource is devoted to investigating air professionals’ mistakes than to those of any other field of human endeavour.
In contrast to medicine and sport, airline regulators have long abandoned ‘blame and punish’, and sought instead to locate the source of error as arising out of a complex interplay between factors, in which the professional is but one slipping cog in a set of grinding gears. Accident investigators frequently found themselves clambering over wreckage where it’s difficult enough to recover the black box, far less the crew, in order to castigate them. So they have moved away from personalised blame (it’s pointless when everyone is dead), and focused instead on what can be learnt in order to prevent future error. If you personalise mishap, what have the rest of us to learn from your stupidity?
In the case of Harold Shipman, where hundreds died along with the responsible ‘pilot’, this kind of medical calamity can begin to look a lot like a major air crash. But do healthcare and football learn from their mistakes, in the way aviation now does, and consider changes which render similar events significantly less likely in the future? For example, we could locate the ‘error’ not in goalkeeper Robert Green’s butter fingers, but bring in a variety of other factors for consideration. The manager had alerted Green that he was selected just two hours before the match. Was this playing on his mind? Did this contribute to a lack of focus? And would we be obsessing about Green’s performance if England’s other players had achieved a scoreline of 3-1, as might be expected against supposedly inferior opposition? That Green even had to collect a ball from a US striker suggests at least some failings among his defenders.
But when doctors turn out to play, there is frequently no defence provided, which we merrily accept. The managerial class charged with examining mistakes very rarely locates fault within their own ranks. Days after the fumble, more searching questions about the manager’s role arise. Fabio Capello has pointed out that the national team suffers from a diminishing pool of players to choose from, because the English Premiership is increasingly dominated by foreigners. And so the ripples from a momentary mishap spread out to implicate distant parts of the system. If the manager of England has become used to facing tough questions about decisions following a disaster, how come NHS managers relentlessly escape similar probing?
The ‘Swiss cheese’ model
Doctors, in contrast to players, represent a convenient villain, able to distract from wider political inconveniences, such as resource and managerial issues. NHS managers have constantly dodged accountability, in a way that the communities they serve wouldn’t tolerate for their local football team. Doctors might need to develop their passing game. ‘Blame and punish’ distracts from a more nuanced understanding of why clinical errors occur, which would call into question, amongst other matters, the way the systems in which we work are organised and managed. The aviation industry, in contrast, prefers what is described as the ‘Swiss cheese’ model. In a slice of Swiss cheese, while there are many holes, it takes a particular coincidence where a series of apertures lines up, before the unlikely event of an opening through the entire portion occurs. Therefore, it’s not one particular event that leads to catastrophe, but, rather, a series of mishaps that unfortunately fall into place.
A wing flap may become stuck, but the warning signal in the cockpit is misinterpreted, followed by the crew making a dire judgement call, leading to the plane crashing soon after take-off. Aviation accident analysts had to look wider – at what were common occurrences on aircraft that didn’t crash. They found, to their horror, that the kind of apparent crew mistake, which seemed to be such an obvious cause of a particular accident, was in fact often common behaviour on other flight decks. It was just that all the holes didn’t align on most occasions.
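The arithmetic behind the ‘Swiss cheese’ model – slips at individual layers are common, but catastrophe needs every hole to align at once – can be illustrated with a toy simulation. The four ‘layers’ and their failure rates below are purely hypothetical assumptions chosen for illustration, not aviation data:

```python
import random

def flight(layer_failure_rates, rng):
    """One flight: return a 'hole' (True) or intact defence (False) per layer."""
    return [rng.random() < p for p in layer_failure_rates]

def simulate(n_flights=100_000, seed=42):
    rng = random.Random(seed)
    # Hypothetical defensive layers: equipment, warning systems,
    # crew judgement, oversight - with assumed per-flight failure rates.
    rates = [0.05, 0.04, 0.03, 0.02]
    some_error = 0   # flights with at least one slip
    catastrophe = 0  # flights where every hole lined up
    for _ in range(n_flights):
        holes = flight(rates, rng)
        if any(holes):
            some_error += 1
        if all(holes):
            catastrophe += 1
    return some_error / n_flights, catastrophe / n_flights
```

With these assumed rates the analytic probabilities are roughly 13% for at least one slip per flight, but only about one in a million for all four holes aligning – which is the model’s point: the ‘crew mistake’ seen in a crash is routine behaviour on flights that land safely.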
Clinical errors, as illuminated under the harsh spotlight of a serious untoward incident inquiry, can only be properly and fairly understood from a wider understanding of everyday practice. Ascertaining the ‘denominator’ underpinning a particular failure becomes essential. Yet inquiries into physician mishaps seem to mobilise only when misfortune has been reported, rather than seeking insight into everyday practice, so they remain hopelessly biased towards ‘blame and punish’.
‘Blame and punish’, furthermore, generates an attitude to performance which isn’t helpful. How can we expect our footballers to deliver their best if a single fumble means their children back home are targeted in the playground? Will doctors take the necessary risks in the best interests of patients, if apparent mishaps are going to result in inquiries that last years and threaten careers?
Encouraging an open culture
Finally, and most damning of all, errors occur all the time, and only a small proportion are picked up by any detection system. This means that in order to fully understand mishap we need the collaboration of the so-called ‘culprits’, to come forward and openly discuss mistakes which would otherwise have gone completely unnoticed. Error of a catastrophic nature very rarely strikes out of the blue – like a USA goal.
Instead, the individuals and the system involved have usually erred systematically in the past, in ways approximating any new big boob, but somehow got away with it. If error were more sympathetically and realistically appreciated by ‘the system’, we would work in a culture where we could afford to be more honest about our mistakes, and so better understand and ameliorate them. It is as unrealistic to expect error to be completely eliminated from medicine as from an England football team.
Accident investigators in the aviation industry now appreciate that the average air crew makes several errors on any one flight. These remain minor, given the thousand and one other tasks that are done correctly. Errors are not in themselves the problem – in medicine, on aircraft, or at the World Cup – no matter what ‘blame and punish’ would have you believe. The question is whether we learn enough from our errors so as not to perseverate. Dismukes argues that errors would be better understood not as aberrations, but as products of the very same factors that drive effective performance.
As we witness Robert Green fumbling the ball into the back of the England net, instead of vilifying him we might consider how lucky we are that, so far, the holes have not lined up for us. The psychoanalysts argue that our need to locate ‘badness’ in others is a psychological mechanism by which we cleanse ourselves of contamination. But we are all fallible. And we are better doctors for accepting that and then trying to improve. However, we need help, and the health service currently feels more like a tabloid newspaper reporter when we turn to it for assistance. It may appear outwardly friendly, but it is mostly keen for a self-incriminating quote, or a snapshot of us caught in a compromising position.
The issue isn’t so much that Robert Green is a hopeless goalkeeper. Like most doctors, he is unlikely to have got to where he is while being as terrible as his current appraisal suggests. The key questions are: can he learn to improve? Can the whole team? And can we support them in that endeavour?
Dr Raj Persaud is a Consultant Psychiatrist and Emeritus Visiting Gresham Professor for Public Understanding of Psychiatry.
Declaration of Interest: My greatest error (to date) led to my being suspended by the GMC for journalistic plagiarism for 3 months ending in October 2008.
- Dismukes RK. Understanding and Analyzing Human Error in Real-World Operations in Human Factors in Aviation (Second Edition) 2010:335-374
- Undre S, Arora S, Sevdalis N. Surgical performance, human error and patient safety in urological surgery. Br J Med & Surg Urology 2009;2:2-10
- McIlvaine WB. Human error and its impact on anesthesiology. Semin Anesth Periop Med & Pain 2006;25:172-179