Human reasoning is biased and illogical. At least that's what a huge body of psychological research seems to show. But now a psychological scientist from the University of Toulouse has come up with a new theory: that logical and probabilistic thinking is an intuitive part of decision making, but that its conclusions often lose out to heuristic considerations.
The traditional view is based on studies showing that when it comes to certain types of decisions, people blatantly flout the maths in favour of stereotypes. Consider, for example, the following question:
A psychologist wrote thumbnail descriptions of a sample of 1000 participants consisting of 995 females and 5 males. The description below was chosen at random from the 1000 available descriptions:
Jo is 23 years old and is finishing a degree in engineering. On Friday nights, Jo likes to go out cruising with friends while listening to loud music and drinking beer.
Which one of the following two statements is most likely?
a) Jo is a man
b) Jo is a woman
You only need a basic grasp of probability to see that b) is the correct answer. Yet many studies have shown that even educated people tend to go with the stereotype and choose the wrong answer to this type of question. The obvious conclusion is that people simply don't use logic or probabilistic thinking when making certain kinds of decisions.
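The arithmetic behind the "correct" answer can be made explicit. A minimal sketch, using only the base rates given in the question (the likelihood ratio of 10 below is a hypothetical figure for illustration, not one from any study):

```python
# Base rates from the question: 995 women and 5 men in a sample of 1000.
n_women, n_men = 995, 5
total = n_women + n_men

p_woman = n_women / total  # 0.995
p_man = n_men / total      # 0.005

# Even granting the stereotype real evidential weight -- suppose a man
# were ten times more likely than a woman to match the description (a
# hypothetical likelihood ratio) -- Bayes' rule still favours "woman":
likelihood_ratio = 10  # hypothetical
posterior_odds_man = (n_men / n_women) * likelihood_ratio
p_man_posterior = posterior_odds_man / (1 + posterior_odds_man)

print(p_woman)           # 0.995
print(p_man_posterior)   # roughly 0.048, still far below 0.5
```

The point of the sketch is that the base rate is so lopsided that even a strongly stereotype-consistent description shouldn't flip the answer.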
Babies have been shown to have a sense for proportionality.
However, Wim De Neys, the Toulouse researcher behind the new theory, suggests that when reading questions like this one people do access a logical gut feeling which tells them that something about their heuristic response isn't quite right. "People will be aware that there is something fishy about their heuristic response, but they will not be able to put their finger on it and explain why their response is questionable," he writes in a paper published in Perspectives on Psychological Science.
De Neys bases his suggestion on research that contrasted questions like the one above with versions in which there's no conflict between the heuristic and the correct response. For example, if in the question above the sample consisted of 995 men and 5 women, then stereotype and maths would both give you the same answer. Studies have shown that people presented with the conflicting versions react differently than those looking at the non-conflicting ones. They take longer to answer the questions, spend more time inspecting them visually and also activate a part of their brain that deals with conflict resolution. This shows that they sense that there's conflict, even if they don't act on it.
A possible explanation is that people engage in some elaborate analytical thinking alongside their heuristic response. If that were the case, you'd expect them to be aware of it to some extent, and you'd expect individual differences: if conflict detection depends on reasoning, then better reasoners should be better at detecting the conflict. However, there's evidence that neither is the case. When asked to think aloud while making their decisions, people didn't tend to mention things that would be crucial in elaborate reasoning. And even the most biased thinkers seemed to detect the conflict.
De Neys argues that this, as well as other evidence, suggests that the logical response is intuitive. Its role might be to signal when there's a problem with the heuristic response and that more elaborate thought is necessary. De Neys points to independent research which supports the idea of an intuitive mathematical sense. For example, in one study 8-month old babies were shown a person drawing red or white balls from a box, with the contents of the box being revealed afterwards. The babies appeared surprised when a red ball was drawn even though there were more white balls than red ones in the box, suggesting that young infants already have an understanding of proportionality.
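The infant study turns on a simple proportionality fact: drawing a red ball from a box that is mostly white is a rare event. A quick simulation sketch, with hypothetical counts (the article doesn't give the exact composition used in the study):

```python
import random

# Hypothetical box composition: mostly white balls, a few red ones.
box = ["white"] * 70 + ["red"] * 4

random.seed(0)  # fixed seed so the simulation is repeatable
draws = [random.choice(box) for _ in range(10_000)]
p_red = draws.count("red") / len(draws)

# A red draw comes up only about 4/74 (~5%) of the time, which is what
# would make it surprising to an observer who tracks proportions:
print(p_red)
```

Under the hypothesis that infants track these proportions intuitively, the surprise at a red draw is exactly what a base-rate-sensitive observer should show.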
More research will be needed to test De Neys' ideas and to find out why the logical intuition, if we do indeed have it, so often loses out to heuristics. Ultimately, the results may help us find ways to help people make better decisions.
You can read De Neys' paper here.