Your Obsession With Cognitive Biases Is Probably Making You Dumber


Thanks to the popularity of books like Nudge, as well as the Nobel Committee's recognition of behavioral scientists Daniel Kahneman and Richard Thaler, cognitive biases are having a bit of a moment. Articles round them up, dedicated rationalists vow to chase them from their thinking, and companies and governments are even using the latest findings in the field to design incentives and shape policy.

What could be wrong with that? Given how far we are from the rational actors of classical economics, won't trying to understand our mental quirks lead to smarter choices?

You'd think the answer would be yes, but according to a fascinating recent article by Koen Smets in Behavioral Scientist, when it comes to cognitive biases a little knowledge can be a dangerous thing. Snappy popular write-ups lead people to misunderstand biases, he insists, which can result in spectacularly dumb decisions.

What happens when people think they understand behavioral science 

The prime example of a hastily designed and ultimately disastrous program built on half-baked knowledge of cognitive biases is United Airlines' brief, much-maligned decision to swap small, guaranteed employee bonuses for a lottery in which a few workers would win big prizes but most would get nothing.

The brass at United made the switch based on their understanding of a well-known finding from behavioral science: we tend to overweight small probabilities, which is why people often prefer a 1-in-100 chance at $100 to a guaranteed windfall of one dollar, even though the two options have exactly the same expected value. That's the theory. But when United actually put the idea into practice, employees revolted, and the company was forced to quickly revert to its original scheme.
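If you want to see the arithmetic behind that claim, here is a minimal Python sketch. It uses the standard Tversky-Kahneman (1992) probability weighting function from prospect theory; the gamma of 0.61 is their published estimate for gains, and the dollar figures are illustrative stand-ins, not United's actual numbers.

    # Why a 1-in-100 shot at $100 can "feel" better than a sure $1,
    # even though both have the same expected value. The weighting
    # function is the standard Tversky-Kahneman (1992) form; gamma=0.61
    # is their estimate for gains, used here purely for illustration.

    def expected_value(p: float, payout: float) -> float:
        """Plain expected value: probability times payout."""
        return p * payout

    def decision_weight(p: float, gamma: float = 0.61) -> float:
        """Prospect-theory probability weighting: small p gets inflated."""
        return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

    p, payout = 0.01, 100.0
    print(f"Expected value of the lottery:  ${expected_value(p, payout):.2f}")    # $1.00
    print(f"Decision weight on a 1% chance: {decision_weight(p):.3f}")            # ~0.055
    print(f"'Felt' value of the lottery:    ${decision_weight(p) * payout:.2f}")  # ~$5.52

Run it and the sure thing and the lottery come out to an identical expected value of one dollar, but the inflated decision weight makes the lottery "feel" worth roughly $5.50. That gap is exactly the overweighting United was counting on.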

Cognitive biases are more complicated than you think

The bias United was leveraging is real, so what went wrong? Smets offers a handful of reasons that nudges based on behavioral science regularly fail in the real world. The first is simply that the science is in flux: an effect found in one study often fails to replicate when scientists retest it, and what works in one context doesn't work in another.

Some of that can be explained by poor study design, but a lot of the inconsistency comes down to the fact that this is a young field working on an immensely complicated subject -- the intricacies of the human mind. Biases can conflict with one another, so which one dominates often depends on the person and the situation.

"When we make a choice, are we influenced by what we saw first (priming or anchoring) or what we saw last (recency)? Are we influenced by what we know and are familiar with (status quo) or by what is new, shiny, and different (novelty)?," asks Smets to illustrate. "Will people pursue a new initiative with great enthusiasm and perhaps obliviousness of the potential downsides (optimism), will they be held back by the perceived downsides (risk and loss aversion), or will they follow what their colleagues do (social proof)?" 

Furthermore, some biases aren't really all that irrational after all. You may have heard of the "paradox of choice": while a perfectly rational actor would always prefer more options, real-life people often find many alternatives mentally taxing and avoid making a decision at all. Studies on simple goods like jam back this up: offer people more choices and they buy less. But try the same experiment with cars or vacations and things get more complicated.

"People prefer fewer choices for utilitarian purchases and more choices for hedonic purchases. When we buy something only for its functional utility, we don't want to spend much time comparing various options--whatever does the job is good enough. When we are looking for something that will give us pleasure, in contrast, our preferences are more specific and pronounced, and this makes us more demanding," explains Smets (if you've ever spent hours on hotel review sites this shouldn't surprise you). "A little reflection suggests that none of this is particularly irrational." Similarly, in situations where resources are scarce, so-called "loss aversion" makes a ton of sense.

A plea for humility

The bottom line: cognitive biases are important and fascinating, but this is an area of science where popular overconfidence is particularly widespread and destructive. As an antidote, Smets offers a plea for more intellectual humility.

"Beware of oversimplification. Learning the names of musical notes and of the various signs on a staff doesn't mean you're capable of composing a symphony. Likewise, learning a concise definition of a selection of cognitive effects, or having a diagram that lists them on your wall, does not magically give you the ability to analyze and diagnose a particular behavioral issue or to formulate and implement an effective intervention," he sternly lectures amateur biases enthusiasts.

So go ahead and share that infographic of cognitive biases, and feel free to tinker with different incentive schemes at your business, as long as you watch for real-world feedback and adjust accordingly. But don't think that merely knowing about cognitive biases will translate into an instant rationality boost. Sorry, humans are way too weird and complicated for that.
 

Jessica Stillman
