Confirmation, Availability and Additivity: Biases That Can Impact Us All
CONSULTANT TO AVANTIS INVESTORS®
As I’m sure many of you have heard, Nobel Prize-winning and pioneering psychologist Daniel Kahneman, Ph.D., died in late March 2024. He, along with Amos Tversky, is largely credited with founding and popularizing the field of behavioral economics, which has had a huge influence on diverse fields ranging from psychology to public policy to law to finance.
So, I thought I’d use this space to reflect on three biases that I’ve been thinking a lot about lately that have their roots in the work that Kahneman and his colleagues started. Two of them are classic — confirmation bias and availability bias. One is relatively new — let’s call it the “additive bias” — and has its grounding in the field that Kahneman helped found.
In all three cases, I love thinking about how the central insights apply to modern-day interactions between advisers and clients and how we might improve our lives (and the lives of our clients) if we attend to them more deeply.
Confirmation Bias: Supporting Existing Beliefs vs. Challenging Them
You’re sitting in a classroom and the professor poses a riddle: “I’ll give you three numbers, and you need to guess the rule that explains the sequence. The numbers are … 2, 4 and 6.” Once you’ve thought of a possible rule, you can check it with the prof by asking whether another sequence of numbers fits the rule.
What numbers would you ask about? If you’re like most people, you’re probably thinking, “Even numbers that increase by 2,” and you might say, “Hey, prof, do 6, 8 and 10 fit the rule?” And the prof would respond, “Yes, in fact, they do!” Then you might confidently say, “OK, I’m ready to guess … is the rule ‘even numbers’?”
If this is the scenario you imagined, you’d, unfortunately, have guessed incorrectly because the correct rule in this case is “any series of increasing numbers.” Notice that you could come to this conclusion only if you had asked about a sequence of numbers that differed from your original hypothesis rather than one that was in line with your original hypothesis.
In other words, if you had said to yourself, “OK, I think the rule is even numbers increasing by 2, but just in case I’m wrong, I’ll ask about the numbers 5, 6 and 8,” only then would you have found out that your original guess was wrong.
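For readers who like seeing the logic spelled out, here is a minimal sketch of the 2-4-6 task in Python. The rule and hypothesis functions and the test sequences are illustrative choices of mine, not code from any study; the point is simply that a confirming test can never falsify your hypothesis, while a disconfirming test can.

```python
# A toy version of the 2-4-6 task: the professor's hidden rule vs. the
# guesser's hypothesis. Names and sequences here are illustrative.

def true_rule(seq):
    """The professor's hidden rule: any strictly increasing sequence."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def my_hypothesis(seq):
    """The guesser's hypothesis: even numbers increasing by 2."""
    return (all(n % 2 == 0 for n in seq)
            and all(b - a == 2 for a, b in zip(seq, seq[1:])))

# A confirming test, chosen because it fits my hypothesis.
confirming = (6, 8, 10)
# A disconfirming test, chosen because it violates my hypothesis.
disconfirming = (5, 6, 8)

# The confirming test teaches me nothing: both rules say "yes."
assert true_rule(confirming) and my_hypothesis(confirming)

# The disconfirming test is informative: the professor says "yes"
# even though my hypothesis says "no," so my hypothesis must be wrong.
assert true_rule(disconfirming) and not my_hypothesis(disconfirming)
```

Only the second test separates the two rules — which is exactly why asking “what would prove me wrong?” beats piling up confirmations.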
Except, that’s not how many of us go about our everyday lives. We regularly generate hypotheses to understand the world and come up with assumptions — things like that person thinks I’m funny, or this restaurant is a good spot to go to on a Friday night, or this neighborhood will be a good one for my family to move to. But in testing these hypotheses, we often seek information that confirms our initial hypothesis rather than information that contradicts that hypothesis.
We might only pay attention when our friend laughs at our jokes and fail to consider the times when they don’t; we might ask others if they had a lot of fun at a given restaurant on a Friday night, leading them to answer in the affirmative; or we might overly attend to and overweight the positive aspects of a new neighborhood and fail to deeply consider the negatives.
This so-called confirmation bias was first documented almost 65 years ago, but it seems as relevant today as it was then.1 How often, I wonder, have you sought information that confirms your assumptions rather than information that might act in opposition to a given idea?
How often have you found that your clients do the same? In my own life, I’ve recently tried to counteract the confirmation bias by asking myself and others why a given path of action would be a bad idea rather than a good one.
Availability Bias: Relying on Memory vs. Assessing Reality
Quick: What do you think there are more of on the highways — blue cars or orange cars? If you guessed “blue cars,” you’d be right. OK, now here’s another one: Do you think more people have been killed using a selfie stick or trying to climb Mount Everest? If you guessed Mount Everest, you’d be wrong!
These numbers are admittedly rough ones, but by one estimate, 379 people were killed in selfie stick-related incidents from 2011 to 2018; by another, 338 people have died trying to climb Mount Everest.2 The tendency to think that Mount Everest has resulted in more deaths, which is at least what I might have said if you asked me this question, results from the same machinery that leads us to correctly guess that blue cars are more prevalent than orange ones.
For the question about cars, I’m betting that you didn’t need to consult DMV records or ask ChatGPT. Rather, you probably did a quick scan of your memory in terms of what you’ve seen recently and then guessed from there. In the same way, when assessing the selfie stick versus climbing question, you might have also consulted your memory — thinking about the number of times you’ve heard reports or read news stories about people who have died on the mountain versus the number of times you’ve heard about someone falling off a cliff while trying to capture the perfect shot.
But while the tendency to draw on what’s available in our memories can often lead us to guess correctly and efficiently, you can most likely call to mind many times when it hasn’t. I’m curious: When do you think you (or your clients) demonstrate this availability bias in other, more consequential contexts?
To my mind, every time I read about the newest investing trend or fad, I think this bias may be the prime culprit.3 The tendency, of course, is to call to mind the most recent news about some trend, and then overly weight that information when making a guess about a future state of the world. (The buying patterns surrounding cryptocurrencies come to mind as a prime example!)
One possible antidote: In making decisions that have consequences in the present and the future, before making a choice, try to zoom out and take the 10,000-foot view when analyzing trends and the likelihood that they will continue.
Additive Bias: Adding Elements vs. Subtracting Them
My daughter is 8, and my son is 4 (sorry, I should say four AND A HALF), and while the older one has learned to ride a bike already, my son is still getting used to riding a balance bike. If you haven’t spent much time around kids lately, then you might not know what a balance bike is.
Essentially, it’s a little bike without pedals. Kids sit on the seat and kind of walk themselves along until they learn how to balance, eventually upgrading to a bigger bike with pedals. Think about how this model differs from the way that most of us probably learned how to ride a bike.
I, for one, had training wheels, which eventually got taken away before I had to figure out how to pedal and balance all at the same time. The thinking back then was: If you want to teach a kid how to ride, add some training wheels. The thinking now, by contrast, is to subtract the pedals, which teaches a kid how to balance, and then eventually add back the pedals.
What’s striking to me about this innovation is not only how effective it is at transforming kids into cyclists but also how long it took for it to be introduced. As demonstrated by Gabrielle Adams, Leidy Klotz and their colleagues, when thinking about ways we can make improvements for our organizations or our lives, we often gravitate toward additive changes and, in their words, systematically overlook subtractive changes.4
We naturally tend to think about the things we can add to our lives and fail to consider the ones we can remove when, in reality, additions aren’t necessarily superior to subtractions. (Klotz has an excellent book on this topic, aptly titled “Subtract”).
This additive bias, as I like to refer to it, has only been recognized by academics in the last few years. Ever since reading about it, I have seen it pop up again and again in my life and the lives of my students and colleagues. I’d bet that you’ll notice it now, too, both for you and for your clients.
Just think about how many times you (or a client) have tried to make an improvement by way of addition rather than subtraction. To rebalance things, I’ve been trying to at least notice my tendency to engage in this sort of thinking and then come up with some things I might remove from a given situation to make it better.
My description of these three biases only scratches the surface of their nuance. My hope, though, is that by spotlighting them, I might prompt deeper consideration of when they rear their heads and how they might be counteracted. If you’ve noticed them in your life or the lives of your clients, and if you’ve come up with clever solutions, please share them with us.
1 P.C. Wason, “On the Failure to Eliminate Hypotheses in a Conceptual Task,” Quarterly Journal of Experimental Psychology 12, No. 3 (1960): 129-140.
2 Sydney Morning Herald, “Selfie-Taking Tourist Rescued After Falling Into Mount Vesuvius, Near Pompeii,” July 13, 2022; Climbing Kilimanjaro, “Everest Dead Bodies: How Many People Have Died on Mount Everest,” accessed March 27, 2024.
3 Gunjan Banerji, “How I Got Hooked on the Hottest Trade in Markets – and Bagged a 2,000% Return,” Wall Street Journal, March 16, 2024.
4 Gabrielle S. Adams, Benjamin A. Converse, Andrew H. Hales, and Leidy E. Klotz, “People Systematically Overlook Subtractive Changes,” Nature 592 (2021): 258-261.
The opinions expressed are not necessarily those of Avantis® Investors. This information is for educational purposes only and is not intended as investment advice.