
A Boy Who Cried Wolf: Lack of Trust in Automation

Automation is everywhere: oftentimes we are passive beneficiaries of automation, e.g. we consume energy produced in highly automated power plants or benefit from automated heating and air conditioning systems in our homes. We also have more direct interactions with automation, e.g. when using the cruise control in our car.

In direct interactions, the human operator and the automation work together as a team. Trust in and reliance on the abilities of the system are key for successful interactions. If the operator does not rely on the system, its benefits are lost. If the operator overestimates the capabilities of the system, the consequences can be even worse.

Lack of Trust

In this article, I want to focus on the former – disuse – or a lack of trust in the system. Problems of a lack of trust occur in situations where the human operator can choose whether to use the automation and follow its decisions. A good example is an alarm system such as a smoke detector. If the smoke detector goes off, you can choose whether to grab the fire extinguisher, alert the fire brigade, or not react at all. Studies have shown that trust decreases rapidly once the automation produces obvious mistakes. This loss of trust shows up in two aspects of human behavior:

Reliance

Reliance is about putting trust in the system's ability to detect critical situations and warn the human. Reliance on a system has a major benefit: it enables the operator to focus on other tasks. There is no need to worry, because the alert will be triggered if something happens.

But if you experienced a fire in your apartment and your smoke detector missed it, you won't have a good night's sleep, since you can't trust the automation anymore. What if the smoke detector misses the fire next time? As a consequence, you may decide not to rely on the automation anymore and monitor for incidents yourself.

Compliance

Compliance means acting on the alerts the system produces. An operator is compliant if they immediately take action once an alert has been triggered, because they trust the system's ability to correctly indicate an incident. If you trust your smoke detector, you call the fire brigade pronto.

Two Types of Error

Unfortunately, even the best alarm system can't be right all the time. That's why designers need to decide on a criterion that defines when the alarm goes off. This decision is a trade-off between two types of error: misses and false alarms. A miss is a failure of the automated system to detect a critical event (there is a fire, but your smoke detector doesn't indicate it). A false alarm is an incorrect indication of a critical event (there is no fire, but your smoke detector indicates a fire).
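
To make this trade-off concrete, here is a small, purely illustrative Python sketch (my own example, not taken from the sources): a noisy, hypothetical smoke sensor is compared against an alarm threshold, and we count how often each threshold produces misses and false alarms.

```python
# Illustrative sketch (not from the article): how an alarm threshold trades
# misses against false alarms. We simulate a noisy "smoke level" reading and
# count both error types for different alarm criteria.
import random

random.seed(42)

def simulate(threshold, n=10_000):
    """Return (miss_rate, false_alarm_rate) for a given alarm threshold."""
    misses = false_alarms = fires = no_fires = 0
    for _ in range(n):
        fire = random.random() < 0.05            # assume 5% of trials contain a real fire
        # hypothetical sensor reading: higher on average when there is a fire, always noisy
        reading = random.gauss(2.0 if fire else 0.0, 1.0)
        alarm = reading > threshold
        if fire:
            fires += 1
            if not alarm:
                misses += 1                      # fire, but no alarm
        else:
            no_fires += 1
            if alarm:
                false_alarms += 1                # alarm, but no fire
    return misses / fires, false_alarms / no_fires

for t in (0.5, 1.5, 2.5):
    miss, fa = simulate(t)
    print(f"threshold={t}: miss rate={miss:.2%}, false alarm rate={fa:.2%}")
```

Lowering the threshold reduces misses at the cost of more false alarms; raising it does the opposite. That is exactly the "play it safe" design choice discussed below.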

2018 Hawaii False Missile Alert

Another prominent example of a false alarm is the 2018 Hawaii false missile alert, which led to "Panic in Paradise", as the media put it. The alert, issued via TV, radio, and cellphones, warned of an incoming ballistic missile threat and was later identified as a false alarm due to "human error".

CBS Evening News on the 2018 Hawaii false missile alert

Oftentimes, misses have more serious consequences than false alarms, so designers opt for a sensitive system that plays it safe: better safe than sorry. Unfortunately, this can lead to a negative side effect, the so-called cry wolf effect.

The Cry Wolf Effect

In short, the cry wolf effect is about non-compliance: an operator doesn't react adequately to an alarm due to a lack of trust in the alarm system.

The name is derived from Aesop's fable "The Boy Who Cried Wolf", which tells the story of a shepherd boy who fooled the villagers by crying "Wolf! Wolf!" and asking them to come help. The villagers ran up the hill, but there was no wolf. When the boy later cried "Wolf! Wolf!" again, no one believed him, so no one came to help. When the villagers finally found out that the wolf had eaten all the sheep, one of them concluded: "Nobody believes a liar … even when he is telling the truth!"

If your smoke detector produces false alarms all the time, you probably won’t trust it anymore…

Overcoming the Cry Wolf Effect

Researchers at the University of Washington, Seattle found cry wolf effects in weather-related decisions (e.g. non-compliance with evacuation orders during extreme weather). They also examined how to increase compliance and reduce the cry wolf effect. Their findings indicate that probabilistic uncertainty estimates (such as a 70% probability of freezing) may improve compliance as well as decision quality. Maybe that's why most of our weather apps show probabilities of rain?
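
One way to see why a probability can improve decision quality: it lets the operator weigh the forecast against the costs of each error. Here is a minimal sketch (my own illustration with made-up costs, not the method used in the study), where the decision is whichever action has the lower expected cost.

```python
# Illustrative sketch with made-up numbers: using a probabilistic forecast to
# choose the action with the lower expected cost, instead of reacting to a
# binary freeze/no-freeze alert.
def expected_costs(p_freeze, cost_salting=100, cost_ice_damage=1000):
    """Expected cost of salting the road vs. doing nothing, given P(freeze)."""
    salt = cost_salting                       # salting costs the same either way
    do_nothing = p_freeze * cost_ice_damage   # damage only occurs if it actually freezes
    return salt, do_nothing

for p in (0.05, 0.30, 0.70):
    salt, nothing = expected_costs(p)
    action = "salt the road" if salt < nothing else "do nothing"
    print(f"P(freeze)={p:.0%}: salt={salt}, do nothing={nothing:.0f} -> {action}")
```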

Just in case you were curious: it actually rained 😉

Sources:
LeClerc, J., & Joslyn, S. (2015). The cry wolf effect and weather-related decision making. Risk Analysis, 35(3), 385-395.
Manzey, D. (2012). Systemgestaltung und Automatisierung [System design and automation]. In Human Factors (pp. 333-352). Springer, Berlin, Heidelberg.
Sheridan, T. B., & Parasuraman, R. (2005). Human-automation interaction. Reviews of Human Factors and Ergonomics, 1(1), 89-129.