An intuitive sense of doubt, even when we err

Written by Eric D. Johnson
Edited by Gaëlle Vallée-Tourangeau


You’re pretty smart, right?

Well, if you’re keeping up with popular literature, it can be discouraging to find top news outlets and bestselling books ranting about the inherent limitations of our minds. Titles like “We’re all a little biased, even if we don’t know it”, “15 cognitive biases that mess with your money”, and “Why smart people are so stupid” routinely make their way into places like The New York Times, Business Insider, and The New Yorker.

Need an example? Quickly answer the following question:

A bat and a ball together cost $1.10.
The bat costs $1 more than the ball.
How much does the ball cost?

It’s 10 cents, right? Not so fast! If the ball were 10 cents, and the bat cost one dollar more, the bat would cost $1.10. Add that to the 10-cent ball and you’ve got $1.20. Oops.
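
For the record, the correct answer is 5 cents. Writing the ball’s price as b and the bat’s price as B, the two conditions work out with a little algebra:

\begin{align*}
b + B &= 1.10 \\
B &= b + 1.00 \\
b + (b + 1.00) &= 1.10 \\
2b &= 0.10 \\
b &= 0.05
\end{align*}

A 5-cent ball plus a $1.05 bat gives $1.10 in total, with the bat costing exactly one dollar more than the ball.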

If your intuition was off here, don’t worry. The vast majority of students from top universities like MIT and Harvard get this bat-and-ball problem wrong too. And I doubt we need to throw ourselves into a panic about MIT students’ math skills.

But what is going on here? One of the most popular accounts of these errors is that people simply fail to realize they’re making a mistake. Much to the contrary, our research shows that people do sense when they’re making logical mistakes, and that this doubt stems from effortless intuition.

Of two minds: Blame it on the mental miser

The most famous explanation for these failures of human thinking comes from Nobel Prize winner Daniel Kahneman. According to his popular dual process view, there are two main types of human thinking—a fast and intuitive System 1, and a slow and lazy System 2.

As Kahneman puts it in his bestseller Thinking, Fast and Slow:

“The mental shotgun [System 1] makes it easy to generate quick answers to difficult questions without imposing much hard work on your lazy System 2” (p. 99).

As we go about our days, System 1 thinking automatically provides quick intuitions to problems we encounter. With routine tasks, this quick thinking works well. But thinking fast can lead us astray in other situations, as we saw with the bat-and-ball problem above. In these situations, a mentally demanding System 2 is responsible for monitoring the System 1 answer and making any necessary corrections. The problem, according to Kahneman and other dual process proponents, is that System 2 is often quite lazy, and can let the System 1 answer go unchecked.

So, it’s not that we are unable to understand the logic of the problem. Once you saw that $1.10 + 10 cents doesn’t equal $1.10, I doubt you objected. The problem is that people don’t realize that they’ve made an error in the first place.

But are we really so blind?

Recent studies on decision making have begun to probe people’s “lazy” responses. One way to do this is to ask how confident people are in their response to problems like the bat-and-ball. First, here’s another one for you:

A magazine and a banana together cost $2.90.
The magazine costs $2.
How much does the banana cost?

If a quick feeling told you it’s 90 cents, this time your intuition was correct. And if you ask people how confident they are that this is the correct answer, most will say they are completely confident.

Now go back to the initial bat-and-ball problem. Curiously, people who give the wrong answer on that problem say they are less confident in their response. It’s as if they know something’s not quite right, even though they’re not sure exactly what it is. So people are not quite as blind to their mistakes as popular views would have you think.

In our recent research, we took this one step further by asking where this signal of doubt originates. According to one view, the resource-demanding System 2 is responsible for monitoring intuitive responses and detecting any mistakes. On this account, the reduced confidence expressed by people who answer incorrectly would originate from System 2. Another possibility, however, is that these feelings of doubt occur automatically, as the result of an intuitive System 1 process.

To test this, we wanted to “knock out” System 2 and force people to rely on System 1. So we gave people either the “difficult” bat-and-ball problem or the “easy” magazine-and-banana problem while they simultaneously memorized a complex pattern of dots. We used three levels of memorization difficulty to make sure we overloaded the effort-demanding System 2, thus forcing people to rely on quick intuitions. After answering, they had to say how confident they were in the response they gave.
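
To make the design concrete, here is a minimal sketch of what a single trial could look like in code. It is purely illustrative: the grid size, the dot counts per load level, and every function name are assumptions made for the sketch, not our actual experimental materials.

import random
from dataclasses import dataclass, field

# A sketch of one trial in the dual-task ("dot memorization") design.
# All names and numbers here are illustrative assumptions, not the
# authors' actual materials.

GRID_SIZE = 4  # assume dots appear in a 4x4 grid
LOAD_LEVELS = {"low": 2, "medium": 4, "high": 6}  # dots to memorize per level

@dataclass
class Trial:
    problem: str   # "bat-and-ball" (tricky) or "magazine-banana" (easy)
    load: str      # which memorization load accompanies the problem
    dot_pattern: set = field(default_factory=set)

def make_dot_pattern(load: str) -> set:
    """Pick random grid cells the participant must hold in memory."""
    cells = [(row, col) for row in range(GRID_SIZE) for col in range(GRID_SIZE)]
    return set(random.sample(cells, LOAD_LEVELS[load]))

def run_trial(trial: Trial) -> dict:
    """Trial flow: show dots, pose the problem, rate confidence, recall dots."""
    trial.dot_pattern = make_dot_pattern(trial.load)
    # 1. Briefly present the dot pattern, then remove it.
    # 2. Present the word problem; record the answer and response time.
    # 3. Ask for a confidence rating in the answer (e.g., 0-100%).
    # 4. Test recall of the dot pattern to verify the load was maintained.
    return {
        "problem": trial.problem,
        "load": trial.load,
        "answer": None,          # to be filled in by the participant
        "confidence": None,      # to be filled in by the participant
        "recall_correct": None,  # to be filled in by the recall check
    }

# Cross the two problem types with the three load levels and shuffle.
trials = [Trial(p, l) for p in ("bat-and-ball", "magazine-banana")
          for l in LOAD_LEVELS]
random.shuffle(trials)
print(run_trial(trials[0]))

Crossing problem type with memorization load in this way is what allows one to check whether the confidence drop on the tricky problem survives even the heaviest load.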

So what did we find? First, under the most complex dot memorization, almost no one answered the tricky bat-and-ball problem correctly. This means the memorization task effectively knocked out their System 2 resources. Second, even with System 2 disabled, people still said they were less confident in their incorrect answer to the bat-and-ball problem. They were also slower to give both their answer and their confidence rating, indicating additional hesitation. That is, the signal of doubt appeared even when System 2 was essentially rendered useless by the cognitive load. Together, these findings show that people were detecting their error quite automatically, suggesting that it resulted from a System 1 process.

Moving forward

A popular belief is that humans are poor reasoners because we blindly accept our intuition without question. Contrary to this idea, our work shows that not only do we question our erroneous intuitions, but this doubt arises quite automatically.

While our work establishes that this System 1 monitoring is automatic, it doesn’t speak to the specific nature of the cues that trigger it. Figuring this out will be work for future studies. Furthermore, we want to be clear that even though people have these doubts, most still answer the problems incorrectly, so simply doubting is not sufficient to trigger more deliberative processing of the solution.

In short, although we often fail to correct our mistakes through the necessary hard thinking, detecting our errors appears to be relatively intuitive. By signaling its doubt along with a biased intuition, it appears that System 1 is smarter than traditionally assumed.
