In this post we’ll consider three of the most widespread (and misleading) of our evolved mental modules. We’ll look at ‘selective abstraction’, ‘arbitrary inference’ and ‘confirmation bias’. Each of these is related in its own way to pattern recognition as described in part 16.
What’s most interesting from an evolutionary perspective is that these three aspects of human psychology, although universal, may not be advantageous in themselves. They may, in fact, be no more than evolutionary by-products of pattern recognition.
There are many examples of by-products, both physical and psychological. Certain genes seem to confer a variety of traits, as though some evolutionary advantages cannot exist without other less positive or neutral correlates. The trade-off between sickle cell anaemia and protection from malaria discussed in part 9 is an excellent example. Evolution isn’t perfect and so neither is the human body – or the human mind.
Sometimes these extra ‘add-on’ characteristics can fool us. They look like the evolved characteristic that was favoured by natural selection but they’re not – they’re just the baggage that comes along with it. They’re what Stephen Jay Gould described as evolutionary ‘spandrels’.
‘Spandrels’ are the triangular blank spaces we see at the top of arches. They serve no structural purpose but in an arched building, a cathedral for example, they are inevitable by-products of construction. However, some spandrels, structurally useless by-products or not, are so finely decorated that it’s easy to convince ourselves that they were the builders’ main focus and not the arch. So it is with evolutionary spandrels. They’re purely coincidental but they have the appearance of selected traits.
The tendency to see patterns and to concentrate only upon what is relevant has real advantages. Work on expertise and efficiency shows us the need to weed out irrelevancies from attention but along with that undeniably helpful tendency we find a few ‘spandrels’. As we consider these three ‘less than helpful’ psychological traits it’s worth remembering that ‘evolved’ doesn’t necessarily mean ‘inevitable’. It does seem to be possible to resist these traits, however hard-wired they may be. As ever, knowledge is power.
As we know, experts and other efficient problem-solvers develop a consistent ability to see relevant patterns without being distracted by irrelevancies. For most of us though there seems to be an equally consistent problem, at least until we gain experience in understanding just what is and is not relevant. We tend to focus as much upon irrelevancies as we do upon the relevant.
Experts rely upon experience to recognise which aspects of the current situation are relevant. They use memory of previous events to find meaningful patterns in the present. They notice similarities and take action accordingly. As do we all.
The problem comes when non-experts (that means all of us most of the time) try to identify meaningful patterns. We tend to notice what fits our existing assumptions and worldview. We also tend to ignore or under-emphasise any information that doesn’t fit. In short, we selectively abstract the evidence that we already agree with and filter out everything else.
This means that whatever we first believe tends to stick (even if it’s wrong). That’s why no amount of contrary evidence will dissuade the racist, the religious extremist or the political zealot from their chosen opinion. They literally discount all the information that doesn’t support their preconceptions. It’s also why humans are so easy to manipulate. All the manipulator has to do is set up a worldview and we do the rest.
Psychological priming is a common sales and persuasion trick. Prime the prospect to think of a certain price range so that whatever is offered is seen through that particular lens.
In politics priming is just as easy. For example the current UK government has successfully primed much of the population to believe that disabled people are ‘scroungers’ and that unemployed people are lazy. Neither of these assertions is objectively true but that doesn’t matter. Selective abstraction means that these ideas are difficult to shake.
The same is true of labelling. For example, tell someone how generous or how clever they are often enough and there’s a good chance that they’ll look for opportunities to prove it (providing, of course, that they like the label you give them). Even our self-concept is open to manipulation by those who know how to use selective abstraction to full effect.
Selective abstraction is a psychological spandrel that really does have serious consequences.
Confirmation bias is very much related to selective abstraction. Indeed, if confirmation bias is the ‘intention’, selective abstraction is the ‘mechanism’ by which we lead ourselves astray. It’s possible that confirmation bias and the perceived need to be right (or to be seen to be right) are also related to the drive for status and dominance but it’s no less dangerous for that.
We touched upon the process of confirmation bias in part 2 when we discussed what Karl Popper described as the demarcation problem: the difference between science, pseudoscience and nonsense. Science works because it only accepts what it cannot disprove and the scientific method is all about sincerely attempting to disconfirm hypotheses. In so doing scientists are careful to consider all the available evidence. Pseudoscience and nonsense focus only upon confirmatory evidence via selective abstraction and so fail to make meaningful new discoveries.
Confirmation bias serves to blind us to reality and exacerbates our mistakes.
The ability to plan ahead, to imagine the future, is a rare trait within the animal kingdom. It’s been a real advantage to humans. It’s also based upon the ability to recognise patterns and to predict what they will look like over time. And just like other pattern recognition modules it’s open to error.
When we predict the future we’re making an inference. We take what we already know and imagine how things we already understand will play out over time. Sometimes our inferences are based upon sound evidence but not always. Sometimes they’re based upon beliefs and ideologies that are not remotely evidence-based. These inferences are arbitrary.
The trouble with arbitrary inference is that a single error can have dramatic consequences as each new assumption compounds the problem. Consider the following example…
John and Mary are engaged. They have been together for five years (living together for three years) and plan to wed in a few months. One night Mary tells John that she’s going out with some old friends. John makes several arbitrary inferences…
Mary’s going out with her old friends
That crowd used to go out ‘on the pull’
Mary’s going out to get laid
Mary can’t be trusted
Mary doesn’t love me
Mary wants to break it off
Mary will leave me soon
If Mary dumps me I’ll be humiliated
I should dump Mary first
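The compounding effect can be made concrete with a little arithmetic. Here’s a toy sketch (the `chain_confidence` helper and the 90% figure are illustrative assumptions, not anything from the example above) showing how the odds that an entire inference chain holds up collapse as steps are added:

```python
# Toy illustration: even if each inference in a chain seems fairly
# reliable on its own, the probability that the whole chain is sound
# shrinks multiplicatively with every added step.

def chain_confidence(step_reliability: float, steps: int) -> float:
    """Probability that every inference in the chain holds,
    assuming independent steps of equal reliability."""
    return step_reliability ** steps

# John's chain has nine inferences. If each were 90% reliable:
print(round(chain_confidence(0.9, 9), 2))  # → 0.39
```

Even granting each step a generous 90% plausibility, John’s nine-step chain is more likely wrong than right.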
Clearly this is a fairly dramatic example but these things do happen. The problem isn’t with our tendency to recognise patterns though, it’s with our tendency to accept patterns uncritically. The thing all these psychological spandrels have in common is that they bypass critical thinking.
If we are to overcome our evolved tendencies to mislead ourselves and those around us we must begin by developing the habit of true deliberation and by accepting the fact that our initial assumptions may be wrong.