Patient safety suffers from a lack of transparency in our incident reporting systems.
In healthcare, front line staff can’t access the valuable information held in their error databases. This lack of transparency leaves us flying blind: without seeing our errors we can’t improve. The repercussions are dreadful for patients, families and front line staff.
Recently we came across a fantastic database – SkyBrary (see here) – a central repository of information about aviation errors. These errors don’t stop people flying, and they aren’t used as a source of individual blame. Instead they’re viewed as learning opportunities that enable continual improvements in aviation safety.
The inability of front line staff to see their errors is perhaps the greatest impediment to patient safety.
From Black Box Thinking by Matthew Syed:
Lack of transparency reinforces a culture of blame – ‘if errors are too bad to reveal then someone must have done something wrong’.
It decreases the likelihood of error reporting – front line staff rarely see effective change.
Staff will be more accepting of proposed solutions when they’re aware of the frequency and severity of these events.
In Australia there are numerous separate incident reporting systems. Most staff are aware of their state’s public hospital reporting system.
Perhaps the most important is IRIS, which belongs to the TGA. It’s important because the most effective way to improve safety is through a ‘forcing function’ – basically engineering out a hazard. Often this means refining the medical devices we work with.
Unfortunately most healthcare staff are completely unaware of IRIS. Worryingly, none of the other incident reporting systems feed into it. In fact all the reporting systems remain disconnected from one another – this profoundly restricts our ability to identify error clusters.
Currently our error reports are primarily used for crude statistical feedback. Governing bodies with privileged access to the reports may occasionally develop an alert, policy or education framework – these processes rarely translate into front line safety improvements.
While effective safety solutions may appear simple, introducing and mandating them in the workplace is not (see – central lines with moulded valves, distinct chlorhexidine, beveled Draeger APL valves). It requires persistence, stamina, discipline, and perhaps courage to challenge the status quo. Governing bodies may be too far removed from the front line or lack the emotional drive required – patients don’t die at their hands, and they won’t suffer the blame.
Front line staff working with human factors experts are best placed to conceive effective safety solutions. We need access to de-identified error reports for this to happen.
Recently we learnt of a patient’s death from an air embolus – air had entered their intravenous fluid bag when it was disconnected, and the bag was then re-spiked. We have no idea how often similar events have been reported – how could we? The healthcare response has been education and a policy prohibiting IV fluid bag re-spiking. While this is advisable, re-spiking will predictably continue, as will deaths from air emboli – the next staff member involved will either cover up their error or face the music. Everyone loses.
IV fluid bags which don’t entrain air on disconnection already exist and are used in many institutions without problem. If we had access to error reports we’d have the evidence to implement this more effective solution.
We’re licensed to look after the ill – it’s time to stop flying blind.
Image (altered) by David Gamble
We’re working with healthcare bodies to allow access to de-identified error reports.