Health Systems Action

It’s not your fault! The cognitive basis of medical error

A reflection on human attention and patient safety

At the October 2025 American Society of Anesthesiologists (ASA) meeting, Dr Joyce Wahr delivered the prestigious John W. Severinghaus Lecture with a message for all of healthcare: most errors are not the result of carelessness, incompetence or insufficient effort. They are predictable outcomes of how the human brain works.

A quick test

Before we go further, let’s try an experiment.

Below are images from chest CT scans – cross-sectional x-ray “slices” through the chest. In lung-cancer screening, radiologists scroll through hundreds of these for each patient.

You’ll do a much shorter version of that task.

To simulate what a radiologist does, you will have 12 seconds total – roughly 2–3 seconds per slice – to look through the five CT images below.

Your goal is simple:

Look for lung nodules – small, roundish, white spots within the darker lung fields.

Most are benign; some can be early cancer.

Ready? Set a timer if you like.

Image 1 of 5:

Image 2 of 5:

Image 3 of 5:

Image 4 of 5:

Image 5 of 5:

What you might not have seen

In Image 5, look closely at the right-hand side of the image (the patient’s left lung).

Did you spot it?

It’s a gorilla – a small, digitally inserted black silhouette with a white outline.

This peculiar exercise was the basis of a 2013 study in which twenty-four experienced radiologists were asked to examine five separate multi-slice CTs[1], each containing an average of 10 nodules, and to click with the computer mouse on each nodule they identified. The gorilla was present in a sequence of five slices in the last set of images.

83% of the radiologists reported nothing unusual – no gorilla – even though eye-tracking showed that most of them looked directly at it, and despite the lost primate being about 50 times larger than the average lung nodule.

Here’s the same scan with the gorilla highlighted:

Here’s the eye-tracking software in action, showing that the radiologist focused, at least for a moment, on the location of the gorilla. The blue circles represent eye positions recorded at 1-millisecond intervals:

This is a demonstration of how attention really works, and of why system-level safety design matters as much as, or more than, personal vigilance.

Why we miss the obvious

Human perception is not a camera. It’s a filter, constantly discarding what the brain decides is irrelevant.

When given a specific task (“find lung nodules”), attention narrows to that goal. Anything unexpected – even something absurd and obviously out of place – may never reach conscious awareness. This phenomenon is known as inattentional blindness.

It’s powerful enough that:

  • People counting basketball passes fail to notice a person in a gorilla suit.
  • People recall the time on a bedside alarm clock – even when the clock hands are missing.
  • Eyewitnesses to the same event often give strikingly different descriptions.

These aren’t failures of discipline or intelligence.

They are normal functions of human cognition.

Effort is not a safety strategy

Common responses to clinical error often sound like this:

  • “Be more careful next time.”
  • “Read the label.”
  • “Double-check.”
  • “Slow down.”

But these rely on a false assumption – that attention is unlimited.

In reality:

  • System 2 thinking (deliberate, analytical) is slow, limited, and cognitively exhausting.
  • System 1 thinking (fast, automatic, pattern-based) handles most daily work and can be tricked by expectations.

You can’t “try harder” to override these limitations.

This is why Dr Wahr titled her lecture: “It’s Not Your Fault!”

When cognitive limits become clinical hazards

One of the most devastating examples Dr Wahr discussed in her lecture involved cases in which tranexamic acid (TXA), an intravenous drug used to slow or prevent bleeding, was injected into the spinal canal instead of bupivacaine, the most commonly used spinal local anaesthetic. TXA is highly toxic in that part of the body; most patients die.

TXA and bupivacaine vials can look similar.

They may sit close together on a crowded procedure tray.

They appear in predictable workflows.

In many tragic cases, clinicians insisted they had read the label.

System 1, however, has already supplied the mental model:

“During a spinal anaesthetic, the drug in my hand is bupivacaine.”

When reality contradicts expectation, System 1 tends to reshape perception to fit.

This is not negligence; it is neurobiology.

The strongest safety barrier: forcing functions

If we can’t rewire the human brain, we must design systems that prevent harm.

These are forcing functions – interventions that make the wrong action impossible or extremely unlikely:

  • Supplying TXA only in IV mini-bags, which can’t be connected to a spinal needle
  • Non-interchangeable connectors that prevent wrong-route administration
  • Pre-filled syringes to eliminate error-prone preparation steps
  • Standardised medication trays
  • Storing TXA vials in separate carts outside the operating room, or in locked bins that require barcode scanning before a vial can be removed

These strategies work regardless of fatigue, workload, stress or distraction.

They are safety by design.

Helping the brain succeed: standardisation and cognitive aids

Standardisation reduces unpredictability, but it must be universal across the clinical work environment. Partial standardisation – for example, standard drug trays in the main operating theatre complex but different versions in other areas, such as the obstetric suite or interventional radiology – can paradoxically increase risk.

Cognitive aids such as emergency manuals support clinicians in crisis situations when analytic thinking under pressure becomes difficult and System 1 tends to dominate.

What the gorilla teaches us

If highly trained radiologists can miss a gorilla in a CT scan, then all of us miss things every day – often without realising.

The lesson is simple:

Errors are not moral failures.

But failing to design systems that anticipate human limitations is.

Healthcare doesn’t become safer by asking clinicians to “try harder.”

It becomes safer by:

  • redesigning workflows
  • reducing reliance on memory
  • introducing forcing functions
  • building environments that support human beings as they are.

That’s the cognitive basis of error and the foundation of modern patient safety.

Readings

Drew T, Võ ML-H, Wolfe JM. The invisible gorilla strikes again: sustained inattentional blindness in expert observers. Psychological Science. 2013;24(9):1848-53.

Simons DJ, Chabris CF. Gorillas in our midst: sustained inattentional blindness for dynamic events. Perception. 1999;28(9):1059-74.

Kahneman D. Thinking, fast and slow. New York: Farrar, Straus and Giroux; 2011.


[1] Experienced radiologists typically spend 3-10 minutes interpreting a screening chest CT “stack” of between 100 and 500 slices. In the study, the gorilla-containing stack had 239 slices.
