“If I had a Hammer…I’d Hammer out Justice…”*
In this series of blogs I have explored the dimensions of what has become known as the “Just Culture,” or the structural paradigm to assess clinicians’ actions when things go wrong and the corrective steps necessary to prevent recurrences. This final commentary deals with an extreme circumstance of negligence, where opportunities for good outcomes were missed due to professional behaviour well below the acceptable norms for our sacred profession.
A “Just Culture” is all about improving within a context where incidents are reported for the purpose of learning. Hazards are identified proactively to avoid future situations where errors occur and patients are harmed. The role of leadership is to provide the system-wide infrastructure necessary to deliver safe and effective healthcare. When the system itself is the source of failure, then what should be the just outcome?
Just Culture for Reckless Bullies with Sauce Béarnaise and New Potatoes
A distinguished professor, who has greatly influenced me during my career, once told me the reason we call our profession the “practice of medicine” is because we are always practising. In an inclusive sense, the art of medicine requires that we should practise improving our behaviour. Healthcare is enormously challenging, and those who commit to working in this profession need to experience joy and meaning in their work. Collegiality and respect for colleagues and patients should be our modus operandi.
The Urgent Case for Cruise Control
Healthcare professionals are committed to providing safe care. We don’t intend to harm patients. Unfortunately, however, our shortcomings may lead to harm. We are sometimes complacent when we should be “on guard.” Most of us demonstrate at-risk behaviour from time to time. We get lulled into a sense of complacency because usually no one gets hurt, and thus we normalize at-risk behaviour.
Ever exceed the speed limit when you are driving? The fact is that many of us speed every day, but since no one gets hurt and we do not ordinarily get speeding tickets, we normalize this behaviour, even though safety experts have told us it is dangerous and it is against the law. Most of us do not use cruise control on motorways even though we know it would prevent us from speeding and engaging in at-risk behaviour. Where does at-risk behaviour become reckless behaviour? If driving 75 miles per hour on a motorway is at-risk behaviour, when does this behaviour become reckless—80 mph, 85 mph, 90 mph?
Healthcare professionals all demonstrate some at-risk behaviour. We take shortcuts, we use workarounds, we reach conclusions before all the evidence is in, and more often than not no one is harmed. We normalize our deviation and may even note that "everyone does it"; it's human nature. How should at-risk behaviour be addressed in a Just Culture?
The “Substitution Test” as a Lifeboat for a Perfect Storm
When patients have been harmed, one logical explanation is that someone has done something wrong. Unfortunately, incident investigations have all too often focused on identifying who was responsible, who had a lapse in judgement, who messed up, who can be held accountable, blamed, and… punished. For years this paradigm encumbered the process of learning from incidents. Incidents were not reported, certainly not by clinicians who recognized their own errors or those of colleagues. If the consequence of reporting was likely to be punitive and professionally compromising, then what was the incentive for reporting?
When the patient safety movement was just beginning, a commendable goal was to increase reporting of incidents and near-misses, all within a non-attribution model with a focus on learning. Staff would be encouraged to report because they would not be punished and all could benefit.
Why a “Just Culture” really matters!
The identification and analysis of patient safety incidents is a quintessential component of a robust patient safety program and the culture that sustains such a program. Safety improvements arise from identification of incident-related causes and contributing factors, and front-line staff must be fully engaged in these efforts if we are to improve patient safety.
It is a common perception that physicians (the term is used here to include all physicians and surgeons, i.e. "doctors") are the healthcare professionals least likely to report incidents or safety concerns, or to be included in incident analysis. As key members of the front-line staff, enhancing their involvement should be an important goal in sustaining a robust patient safety culture.
It has been said that "near misses are the gold dust of patient safety." If our profession is to become highly reliable, then learning from errors with the potential to cause harm, before they actually cause harm, is the essential outcome of striving for high reliability. Institutions that fail to analyze near misses are, at best, on a slower road to high reliability.
In 1848 gold was discovered at Sutter's Mill, Coloma, California. Gold flakes were found floating in the American River, and beginning in 1849 thousands of people flocked to California in what has become known as the California Gold Rush. The prospectors became known as the "49ers." Though most Gold Rush "49ers" did not become wealthy, the myth that the discovery of gold would lead to vast riches became entrenched in the collective experience of history. Everyone knew that the dust came from veins within the ground, the gold mines.
The fifth characteristic of high reliability organizations (HROs) is Resilience – leaders and staff responding when systems fail, collaborating to overcome challenges. For HROs, resilience means dealing with emergencies, preventing these mishaps from translating into harm for patients, and instituting corrective actions. How does your hospital measure up?
Case Study: A 2-year-old with rhabdomyosarcoma is receiving actinomycin-D and doxorubicin chemotherapy. The child's liver function studies, though abnormal, are not reviewed prior to ordering doxorubicin. She receives a dose that is three times normal, develops bone marrow suppression and fulminant sepsis, and nearly dies. Her subsequent therapies are delayed, and her prognosis worsens.
The National Advisory Group on the Safety of Patients in England was commissioned to formulate recommendations for improvements in safety arising from problems identified in the Mid Staffordshire Public Inquiry¹ and has just released its report². This report was written by thoughtful people who have long admired and/or worked in the NHS and who recognize the high quality of care that has been, and can continue to be, provided to the citizens of England.
The report identified compelling challenges confronting NHS England, most notably a less-than-perfect patient-centric focus and a bureaucracy so encumbered that responsibilities were dispersed to the point that it was unclear who was accountable, and for what. "When responsibility is diffused, it is not clearly owned: with too many in charge, no-one is."
The fourth characteristic of high reliability organizations is Deference to Expertise – leaders listening to, and seeking advice from, front-line staff who know how processes really work and where risks arise. If you want to understand how the machine works, you should ask the nuts and bolts and gears, because without them the larger bits and pieces can't work efficiently and will predictably fail, repeatedly!
Standard wire diagrams, with boxes and connecting wires, are often used to portray the relationships of authority and responsibility in hospitals. Typically missing from these diagrams are the people who sit in the white spaces beneath the labelled boxes or adjacent to the wires: the front-line people who are continually crossing boundaries and communicating, who carry out the most pragmatic aspects of work in the hospital, and who touch the patients and hold the hands of family members.