
A Tale of Accidents & Swiss Cheese

What is the most typical thing for Switzerland? One might think of the Alps, the to-die-for chocolate, or – right – Swiss cheese!

I spent the last weekend in Zurich, Switzerland. Unfortunately, my flight was delayed by more than four hours, which gave me time to work on this post covering a traditional Human Factors theme (accidents) with a Swiss touch!

Today, I want to introduce you to the Swiss Cheese Model of Accidents. This model or metaphor was coined by James Reason in his 1990 book “Human Error”.

When accidents happen – for example in aviation – people often want immediate answers about how the accident happened and who is responsible. We love straightforward explanations, as well as blaming and shaming. Human Factors experts would label this behavior a person approach, in which we analyze the errors of operators at the sharp end.

The truth is, most accidents have diverse and complex contributing factors, and errors are oftentimes consequences of underlying conditions. The so-called systems approach takes this into account, along with the facts that to err is human and that errors must be anticipated in a safe system. Following a systems approach, we concentrate on conditions rather than individuals and build defenses – which is exactly what the Swiss Cheese Model by James Reason demands.

“We cannot change the human condition, but we can change the conditions under which humans work.”

James Reason

Reason’s model consists of several slices of cheese lined up horizontally. The cheese slices represent different system layers which act as barriers or defenses against failure and protect from hazards. In some variations of the model, the layers are specified as organizational influences, supervision, preconditions, and specific acts.

But why is Reason talking about Swiss cheese? Well, our Swiss cheese slices, representing the layers of a system, have holes or imperfections.

Those weaknesses in the system can be short-term gaps created by front-line operators (active errors) or long-lasting systemic weaknesses caused by designers, maintainers, or decision makers (latent errors). Latent errors can lie dormant in the background for a long time.

Losses or accidents occur when the holes in the different slices line up, forming a trajectory along which the hazard can pass through all the slices. That means that an operator at the sharp end performing an unsafe act is only the hole in the final layer: standing alone, it might not result in a failure, but combined with a series of previous failures, an accident occurs.

Swiss Cheese Model by James Reason (2000)

I’ll give you an example from passenger aviation (remember me waiting for my delayed flight? 😉 ): When there is an accident, people rush to blame the operator at the sharp end (the pilot) for a specific act, such as improper communication between the pilot and the co-pilot or between the pilot and the tower. This behavior is then a hole in the “specific acts” cheese slice.

But we also have to consider the other slices. Holes in the “preconditions” slice could be a stressed or tired crew, or an incorrect pairing of the crew. More distal still is the “supervision” layer, which could have holes representing insufficient training. Last, we have to consider “organizational influences”: holes in this cheese slice could, for instance, be due to cost cutting or a focus on the airline’s growth.
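The alignment mechanism can be sketched as a toy Monte Carlo simulation. This is purely my own illustrative sketch, not part of Reason’s model: the per-layer hole probabilities and the assumption that layers fail independently are hypothetical simplifications.

```python
import random

def hazard_passes(layers, rng):
    """Return True if the hazard slips through a hole in every layer.

    `layers` is a list of probabilities, one per cheese slice, that a
    hole happens to be in the hazard's path (a hypothetical parameter).
    """
    return all(rng.random() < hole_prob for hole_prob in layers)

def accident_rate(layers, trials=100_000, seed=42):
    """Estimate how often a hazard passes through all defensive layers."""
    rng = random.Random(seed)
    hits = sum(hazard_passes(layers, rng) for _ in range(trials))
    return hits / trials

# Four layers (organizational influences, supervision, preconditions,
# specific acts), each with an assumed 10% chance of a hole lining up:
print(accident_rate([0.1, 0.1, 0.1, 0.1]))  # rare – on the order of 0.0001
```

Under these assumptions, each individual weakness is fairly common (10%), yet the full trajectory through all four slices is rare – a small numeric illustration of the idea that no single failure is enough to cause an accident.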

I also found a cool infographic applying the Swiss Cheese Model to aviation.

The Swiss Cheese Model was rapidly adopted by practitioners and is widely used as a heuristic communication device. As a generic tool, it can be applied to many areas, such as healthcare and aviation. The simple metaphor makes it easy to remember and easy to pass on. It has been successful in spreading the idea that no single failure is enough to cause an accident.

Still, the model has its limitations and one would be well-advised to understand the Swiss Cheese Model as a metaphor rather than a model that can be used for specific predictions or analyses.

Reason, J., Hollnagel, E., & Pariès, J. (2006). Revisiting the “Swiss Cheese” model of accidents. Journal of Clinical Engineering, 27, 110–115.
Reason, J. (2000). Human error: models and management. The Western Journal of Medicine, 172(6), 393–396. doi:10.1136/ewjm.172.6.393

This is me in the cockpit of an A319 on my way to Zurich 🙂