I’d gamble that Buddha’s revelation was accompanied by a howl of, “What in the heck was I thinking?” Our worst stupidities tend to be chaperoned by confidence. Our clearest enlightenment seems to arise from dead ideas. Certitude is foolish. Probably. Bias is everywhere. That’s what I suspect, at least.
From the University of Toronto Magazine I hear the beat of a familiar drum. Yes, Kurt Kleiner writes on a subject I’ve covered before: our natural disinclination towards rationalism or, as Prof Keith Stanovich terms it, “dysrationalia”…
We are all “cognitive misers” who try to avoid thinking too much. This makes sense from an evolutionary point of view. Thinking is time-consuming, resource intensive and sometimes counterproductive. If the problem at hand is avoiding the charging sabre-toothed tiger, you don’t want to spend more than a split second deciding whether to jump into the river or climb a tree.
So we’ve developed a whole set of heuristics and biases to limit the amount of brainpower we bear on a problem. These techniques provide rough and ready answers that are right a lot of the time – but not always.
We look for evidence that confirms our beliefs and discount evidence that discredits them (confirmation bias). We evaluate situations from our own perspective without considering the other side (“myside” bias). We’re influenced more by a vivid anecdote than by statistics. We are overconfident about how much we know. We think we’re above average. We’re certain that we’re not affected by biases the way others are.
This has little to do, it’s claimed, with our intelligence. (One’s mind flits irresistibly to the gifted but thick-as-concrete-porridge Apprentice contestants.)
In 1994, Stanovich began comparing people’s scores on rationality tests with their scores on conventional intelligence tests. What he found is that they don’t have a lot to do with one another. On some tasks, there is almost a complete dissociation between rational thinking and intelligence.
Being sceptical, it seems, is damned hard work…
Fortunately, rational thinking can be taught, and Stanovich thinks the school system should expend more effort on it. Teaching basic statistical and scientific thinking helps. And so does teaching more general thinking strategies. Studies show that a good way to improve critical thinking is to think of the opposite. Once this habit becomes ingrained, it helps you to not only consider alternative hypotheses, but to avoid traps such as anchoring, confirmation and myside bias.
This is why I feel the loose “skeptics movement” should encourage the method of scepticism rather than adopting pet scapegoats. Firstly, because this would avoid a sneaking groupthink: deconstructing smugness and challenging the assumptions that have been allowed to fester. Secondly, because it would encourage people to take note of their own biases and rebuild their worldviews – as far as one can be expected to – with a more reliable set of tools. Hopefully this would mean some fairly independent, relatively self-aware thinkers would latch onto various topics, problems and, yes, bullshit artists of their own accord.
Stanovich goes on to propose a “rationality quotient”, but even if one could construct such a measure I’d advise against it. Nothing would be more effective at encouraging smugness, prejudice and confirmation bias than telling someone they’re rational.