By Dr Adam Hill, Specialist Anaesthetist (FANZCA) and Instrument-Rated Pilot, NSW, Australia
A pilot files an airspace incursion report. They crossed into controlled airspace without clearance, a genuine safety event. The response from Airservices Australia is a phone call. Not an accusation. A conversation. What happened, what were the contributing factors, what could the system do differently to prevent it. No sanction. No record that follows the pilot's career. The information feeds into a database that makes the system safer for everyone.
A doctor reports an adverse event. The response is an investigation. Sometimes a formal complaint. Sometimes a tribunal hearing that takes years. The information does not feed quietly into a learning system. It feeds into a process designed to determine fault and assign consequences. The doctor's name may end up in a public finding.
These two systems exist in parallel, and they produce entirely predictable results.
Aviation learned this lesson decades ago
In the 1970s and 1980s, aviation was where medicine is now. Accidents were attributed to "pilot error" and the response was punishment or retraining of the individual. The problem was that this approach did not reduce accident rates. Pilots stopped reporting near misses because reporting meant blame. The system lost its ability to see the problems building before they became catastrophes.
James Reason, the British psychologist who died in 2025, changed this. His Swiss cheese model of accident causation showed that disasters do not result from a single individual's failure. They occur when multiple layers of defence, each with its own gaps, happen to align: a fatigued crew, a confusing checklist, a poorly designed instrument panel, an unusual weather pattern. Remove any one of those factors and the accident does not happen. But the old model only looked at the person holding the controls at the moment things went wrong.
Aviation adopted this thinking and built systems around it. Confidential incident reporting. Flight data monitoring. Line operations safety audits that found, on average, two threats and two errors per routine flight. Not because pilots are careless. Because humans operating in complex systems make errors. The question is whether the system catches them or whether it waits for a catastrophe and then looks for someone to blame.
Australia, as it happens, was the first country in the world to formally apply Reason's model in accident investigation.
Medicine's approach is the opposite
The medical profession runs on individual accountability. When something goes wrong, the system asks: who was responsible? The answer is almost always a person, not a process. Helmreich, writing in the BMJ in 2000, found that 30 per cent of doctors and nurses working in intensive care units denied ever committing an error. Not that they rarely made errors. That they never had.
That number is not a reflection of exceptional competence. It is a reflection of a culture where admitting error is professionally dangerous. And it produces exactly the outcome you would expect: underreporting of adverse events is estimated at 50 to 96 per cent in healthcare settings. The system cannot learn from events it does not know about.
The Institute of Medicine estimated that medical errors cause between 44,000 and 98,000 deaths per year in the United States alone. Those numbers are now more than two decades old, and subsequent research has suggested the true figure is significantly higher. The response to this has been more regulation, more mandatory reporting, more individual accountability. Not less.
The problem is not bad doctors
Aviation understood something that medicine still resists: the problem is almost never a bad individual. It is a system that allows predictable human limitations to produce harm. Fatigue, distraction, cognitive overload, poor communication, ambiguous protocols. These are not character defects. They are features of how humans function under pressure, and they are manageable if the system is designed to manage them.
In operating theatres, the closest equivalent to aviation's crew resource management is the surgical safety checklist. It works. The evidence is clear. But its adoption has been inconsistent, and in many hospitals it is treated as a bureaucratic formality rather than a genuine safety intervention. The checklist is filled in. The culture behind it has not changed.
Anaesthesia is perhaps the medical specialty closest to understanding aviation's approach. The parallels between the cockpit and the operating theatre are obvious: monitoring systems, checklists, managing emergencies in real time, operating in an environment where small errors compound quickly. The specialty has some of the lowest complication rates in medicine, and that is not a coincidence.
But anaesthetists still work within a medical system that responds to adverse events by asking who, not why.
What just culture actually looks like
Reason followed the Swiss cheese model with the concept of "just culture", and the name is important. It is not a no-blame culture. Recklessness, substance impairment, deliberate violations: these warrant individual consequences. Just culture distinguishes human error (system response) from at-risk behaviour (coaching) and reckless behaviour (discipline).
In aviation, this distinction is codified. A pilot who reports an honest error is protected. A pilot who was intoxicated is not. The line is clear, and because it is clear, pilots report freely. The data flows. The system improves.
Medicine has no equivalent framework that is consistently applied. The same error, in different hospitals, under different investigators, can result in outcomes ranging from a learning conversation to a career-ending finding. This inconsistency is itself a systemic failure, and it produces exactly the silence the reporting system is supposed to prevent.
What would need to change
The answer is not complicated. It is the implementation that is hard.
Confidential, non-punitive incident reporting systems that are genuinely trusted by clinicians. Not suggestion boxes. Not systems where the report goes to the same management structure that conducts disciplinary proceedings. Separation of learning from punishment, the same way aviation separates its safety investigation body from its regulatory enforcement arm: in Australia, the ATSB investigates while CASA enforces.
Training in human factors and error management that starts in medical school, not as an afterthought in a hospital orientation. Aviation cadets learn this from day one. Medical students learn anatomy.
And a willingness, at the institutional level, to accept that the doctor who made the error is usually the second victim, not the first cause.
None of this is new. Helmreich wrote about it in 2000. Reason wrote about it in the 1990s. Aviation implemented it decades ago. The evidence base exists. The frameworks exist. What does not exist, in most of medicine, is the institutional courage to stop looking for someone to blame and start building systems that make errors less likely to happen in the first place.
Dr Adam Hill is a specialist anaesthetist (FANZCA) and instrument-rated pilot practising in Sydney and regional NSW. He flies his own aircraft to deliver specialist anaesthetic services to regional communities. Read more about his work in The Flying Anaesthetist.