
Avoiding Cognitive Biases to Stay Safe
By Josh Williams, Ph.D.
Cognitive biases are systematic errors in thinking that occur as we process and interpret the world around us. In our increasingly fast-paced world, mental "speed" is necessary; in fact, it's an advantage and a sign of intelligence. However, it also causes problems because our big brains have limitations.1 We make up to 10,000 decisions every day, and our brains rely on shortcuts to avoid being overwhelmed. We also make mistakes when we're in a hurry. So we fall back on what has worked well in the past and make quick decisions that may not be ideal in the present.2
Here are a handful of cognitive biases that may compromise our safety. Many of these are drawn from Daniel Kahneman and Amos Tversky who were pioneers in studying cognitive biases in the domains of behavioral economics and psychology.3
- Confirmation bias is the tendency to seek out, interpret, and pay attention to information that confirms your own preconceptions. In politics, the news channels people watch reinforce their existing belief system. Basically, new information is slotted into existing opinions without much mental effort. This can also happen with safety. Leaders may view new information through their existing lenses.
For instance, if a newer employee is struggling on the job, a leader may think “this new generation doesn’t have the same work ethic we had” without digging deeper into that person’s challenges (training gaps, poor supervision). Confirmation bias hinders improvement because new ideas and methods are dismissed as not fitting in with the existing mental paradigm.
- Conservatism bias is similar to confirmation bias. In this case, people fail to adjust their current beliefs even when they are presented with contrary evidence. As an example, a famous basketball player (from Duke University) recently suggested that the earth might be flat. New information, like photos of the earth from space (or the lack of drone footage showing "the edge" of the earth), hasn't changed his opinion. In safety, people may be reluctant to change or try new techniques because "this is how we've always done it." This stifles growth and innovation.
- Escalation of commitment occurs when people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting the decision was probably wrong. This can play out in several ways that affect safety. New equipment may not be right for the job, but leaders may be reluctant to change it out because of the money already spent. Subpar safety performers may be retained because investments have already been made in their development. New programs or software may be kept despite vocal complaints from the people affected. Changing course is sometimes required, even when it's painful, to avoid escalating commitment to bad ideas.
- Fundamental attribution error is the tendency to overemphasize person-based explanations for others' subpar performance while attributing our own subpar performance to outside, situational factors. Here's a golf example: if you're on the driving range and see someone hitting the ball poorly, the attribution is likely to be "they stink." However, similarly poor shots of your own may lead to situational attributions like "the wind is throwing me off," "these mats are worn out," and "my clubs are terrible."
This has implications for incident analysis in safety. If we're injured, we quickly note system factors like time pressure or insufficient personnel. However, when someone else suffers the same injury, we're likely to conclude "they're dangerous" or "they don't pay attention." When leaders conduct incident analyses, this bias makes them more likely to blame the employee instead of fully considering the system factors that contributed to the incident.
- Hyperbolic discounting describes the tendency to prefer immediate payoffs over later ones. This leads people to make choices today that their future selves would have avoided. We often take safety shortcuts because they are easier, faster, more convenient, and more comfortable. Standing on a chair instead of a ladder to hang a picture is a simple example. Discussions with people who have been seriously injured on the job reinforce hyperbolic discounting. If these individuals could go back in time, they would have made the extra effort to work safely even though it was less convenient in the moment.
- In-group bias reflects the perception that your team alone has the right answers. Echo chambers often form because groups of people convince themselves that their ideas are correct and that dissenting opinions are wrong. Compounding this is the false consensus bias, where people overestimate the degree to which others agree with them. For safety, leaders need to spend more time in the field interacting with employees with an open, curious mindset. This helps them better understand employees' suggestions and concerns that may not be fully aligned with their current beliefs (and employees gain a better understanding of the leader's perspective).
- Risk compensation is the tendency to take greater risks when perceived safety protections increase. Drivers with 4-wheel drive may speed in icy conditions because they feel protected. Wearing FR clothing may provide a false sense of security about the choices people make doing electrical work. Leadership messaging needs to reinforce the importance of safe work practices even when other protections are in place.
- Reactance describes the urge to do the opposite of what someone wants you to do so that you can maintain your sense of freedom. Teenagers who rebel may do so, in part, to exert their own freedom of choice. Workers who feel micromanaged may intentionally bypass procedures because they don’t want to be “told what to do.”
Leaders need to hold people accountable for safety. However, they should provide more freedom of choice by getting significant employee input when creating safety rules and making decisions that impact safety. We did a NIOSH-funded study years ago showing that employees in an organization who created their own behavioral safety checklist used it seven times more often than another group (in the same company) that was given an existing checklist and told to fill it out.
- Normalcy bias leads people to minimize threat warnings. Considerable hazards, or even close calls, may fail to move people to make changes. People simply struggle to visualize the potentially serious consequences of certain actions: "We've always done it this way." Leaders often bring in public speakers who've suffered serious injuries to try to "wake people up" to the potential ramifications of human error. Leaders should find creative ways to discuss risk potential (videos, testimonials) beyond traditional safety training or sharing company injury statistics.
Understanding and mitigating these cognitive biases is critical for maintaining a safe work environment. Pay attention to your own “self-talk” and make sure these biases don’t creep into your decision-making. Also, it’s important to seek out dissenting opinions (outside of your inner circle) and encourage more open dialogue in the field to combat these mental mistakes.
References
1. https://www.verywellmind.com/what-is-a-cognitive-bias-2794963
2. https://www.safetyandhealthmagazine.com/articles/11176-safety-leadership-removing-cognitive-bias-from-safety-decisions-three-steps