Cognitive Bias in IT Decision Making

Logic and objective thinking are hallmarks of any engineering field. IT design and troubleshooting are no exceptions. Computers and networks are systems of logic, so we, as humans, have to think in such terms to effectively design and manage these systems. The problem is that the human brain isn’t exactly the most efficient logical processing engine out there. Our logic is often skewed by what are called cognitive biases. These biases take many forms, but ultimately they distort our interpretation of information in one way or another. That leaves us believing we are approaching a problem logically when, in fact, we are operating on a distorted view of reality.

What am I talking about? Below are some common examples of cognitive biases that I see all the time as a consultant in enterprise environments. This is by no means a comprehensive list. If you want to dig in further, Wikipedia has a great landing page with brief descriptions and links to more comprehensive entries on each.

Anchoring: Anchoring is our tendency to treat the information we learn first as the most important, giving less weight or value to what we learn afterward. This is common in troubleshooting, where we often see a subset of symptoms before understanding the whole problem. Unless you can weigh your initial information against subsequent evidence, you’re likely to spin your wheels trying to figure out why something isn’t working as intended.

Backfire effect: The backfire effect is what happens when someone invests further in an original idea or hypothesis even after learning new evidence that disproves it. Some might call this pride, but ultimately no one wants to be wrong, even when being wrong is justifiable because not all the evidence was available when the original opinion was formed. I’ve seen this clearly demonstrated in organizations with a blame-first culture. Nobody wants to be left holding the bag, so there is more incentive to be right than to solve the problem.

Outcome bias: This bias is our predisposition to judge a decision by its outcome rather than by how sound the decision was at the time it was made. I see this regularly from insecure managers looking for reasons why things went wrong, and it plays a big part in blame culture. When we are judged by outcomes we can’t control, rather than by how methodically we worked through an unknown root cause, the result can be decision paralysis.

Confirmation bias: With confirmation bias, we search for, and ultimately give more weight to, evidence that supports our original hypothesis or belief about the way things should be. This is incredibly common in all areas of life, including IT decision making. It reflects our emotional need to be right more than any intentionally negative trait.

Reactive devaluation: This bias is when we devalue or dismiss an opinion not on its merit, but because it came from an adversary or someone we don’t like. I’m sure you’ve seen this one, too. It’s hard to admit when someone you don’t respect is right, but by refusing to do so, you may be dismissing relevant information in your decision-making process.

Triviality/Bikeshedding: This occurs when extraordinary attention is applied to an insignificant detail in order to avoid dealing with a larger, more complex, or more challenging issue. By engaging deeply in a triviality, we feel like we’re providing real value to the conversation. In reality, we’re expending energy on things that ultimately don’t need that level of detail.

Normalcy bias: This is a refusal to plan for or acknowledge the possibility of outcomes that haven’t happened before. It’s common in disaster recovery and business continuity (DR/BC) planning because we often can’t imagine or process events that have never occurred. Our brains immediately work to fill in the gaps based on our past experiences, leaving us blind to potential outcomes.

I point out the above examples just to demonstrate some of the many cognitive biases that exist in our collective way of processing information. I’m confident that you’ve seen many of them demonstrated yourself, but ultimately, they persist because of the most challenging bias of them all:

Bias blind spot: This is our tendency to see others as more biased than ourselves and our inability to recognize as many cognitive biases in our own actions and decision making. It’s the main reason many of these biases persist even after we learn about them. They are often easy to identify when others demonstrate them, but hard to see when they’re skewing our own thinking. The only way to identify our own biases is through an honest, self-reflective post-mortem of our decision making, looking specifically for the places where bias distorted our view of reality.

Final Thoughts

Even in a world dominated by objectivity and logical thinking, cognitive biases can be found everywhere. It’s just one of the oddities of the human condition. And bias affects everyone, regardless of intent. If you’ve read the list above and recognized a bias that you’ve fallen for, there’s nothing to be ashamed of. The best minds in the world have the same flaws. The only way to overcome these biases is to learn about them, identify which ones you typically fall prey to, and actively work against them when trying to approach a subject objectively. It’s not the mere presence of bias that is the problem. Rather, it’s the lack of awareness of bias that leads people to poor decisions.
