When adverse events occur, our first instinct is to discover who or what is to blame. Staff may be reprimanded, and potentially removed, which in itself warns others not to make similar mistakes. Governing authorities may review related policies to prevent such errors in the future.
This has been the response to the Bankstown gas pipeline tragedy (see here). Taken in isolation, the recommendations made through the Root Cause Analysis (RCA) appear reasonable. The RCA team approaches these cases searching for ways the errors could have been prevented.
Unfortunately this retrospective, presumptive approach does a disservice to the complexity of the issue. Further, shortcomings in reporting systems make the Bankstown tragedy seem extremely rare, almost a one-off event. The recommended solutions reflect this: ‘if we could only get rid of a few bad apples we could stop this happening again’.
Managed in this way, adverse events are merely displaced in time and place; they recur with relentless frequency. Accidental pipeline crossovers are not uncommon and are well reported in the literature. (please click here)
While it’s essential that pipeline gas is appropriately analysed post-installation, several cases would not have been identified by a single post-installation assessment. (Please click here to review an extensive list)
‘When faced with a human error problem, you may be tempted to ask “Why didn’t they watch out better? How could they not have noticed?” You think you can solve your human error problem by telling people to be more careful, by reprimanding the miscreants, by issuing a new rule or procedure. These are all expressions of “The Bad Apple Theory”, where you believe your system is basically safe if it were not for those few unreliable people in it. This old view of human error is increasingly outdated and will lead you nowhere.’ – Sidney Dekker
Accidental pipeline mix-ups will continue even after the RCA recommendations are adopted.
We’re at an impasse.
We can persist with a ‘name, blame, and train’ culture, but patients will be left at unnecessary risk.
Alternatively we can choose to learn from our mistakes.
We need to acknowledge the complexity of our work environments and realise that it is ‘Human to Err’.
The challenge is to accept these errors will continue to occur and yet prevent them from leading to adverse outcomes.
‘While most of us assume that “errors = accidents”, aviation knows that “errors – safeguards = accidents”. By obsessing over error reduction (via root-cause analysis), we miss opportunities to increase safeguards.’ – Nick Argall
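Argall’s equation can be made concrete with a toy ‘defence-in-depth’ model. All numbers and names here are illustrative assumptions, not real incident data: suppose errors occur at some rate, and each independent safeguard catches a fixed fraction of the errors that reach it. Accidents are whatever slips past every layer.

```python
def accidents(errors: float, catch_rates: list[float]) -> float:
    """Expected accidents: errors multiplied by each safeguard's miss rate.

    Assumes each safeguard acts independently and catches a fixed
    fraction (catch_rate) of the errors that reach it.
    """
    missed = errors
    for rate in catch_rates:
        missed *= (1 - rate)  # only the uncaught errors pass through
    return missed

# With one safeguard catching 90% of errors:
print(round(accidents(100, [0.9]), 2))       # 10.0 accidents
# Halving the error rate halves the accidents:
print(round(accidents(50, [0.9]), 2))        # 5.0 accidents
# Keeping the error rate but adding a second safeguard does far better:
print(round(accidents(100, [0.9, 0.9]), 2))  # 1.0 accident
```

Under these assumed numbers, adding a second independent safeguard cuts accidents ten-fold, while halving the error rate only halves them – which is Argall’s point about favouring safeguards over error reduction alone.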
Ultimately we’re accountable for our own actions; however, where work environments can be effectively improved in the interest of patient safety, they must be. This is a ‘learn’ culture.
Effective solutions minimise the impact on front-line staff (see here).
If the neonates at Bankstown Hospital had been resuscitated via an anaesthetic machine, it would have alarmed, indicating a hypoxic (oxygen-deficient) gas. All anaesthetic machines have analysers that continuously assess gas delivery.
If gas analysers were mandated on neonatal resuscitators, the staff at Bankstown would have known oxygen wasn’t being delivered. Introducing gas analysers to neonatal resuscitation trolleys may be costly, but the cost is insignificant compared to liability payments for similar pipeline deaths (one US example, from 1992 alone: US$7,440,000).
Australia has a tremendous healthcare system that has developed numerous safety innovations (see here). We’re well placed to generate a ‘learn’ culture and pioneer a safety system for the rest of the world to emulate.
Let’s not miss out on this opportunity…
For a fantastic insight into how a human factors approach could greatly benefit healthcare safety, please read Black Box Thinking by Matthew Syed.