Blogcollectief Onderzoek Onderwijs

20/20 Blindsight or There is none so blind as (s)he who will not see!

It’s fun to work on a book because in doing so you also learn quite a lot. I have just finished writing – along with two colleagues, Pedro de Bruyckere and Casper Hulshof – a book about urban legends in education and learning (soon to be published by Elsevier / Academic Press). While researching why myths are so persistent, I came across a number of different reasons from different sources. Here is a short list of my favourites:

Patternicity: First used (actually coined) by Michael Shermer, patternicity is the tendency to find meaningful patterns in random noise. It is strongly related to the phenomenon of apophenia, a neologism created by Klaus Conrad to describe the early onset of schizophrenia, in which there is an “unmotivated seeing of connections” accompanied by a “specific experience of an abnormal meaningfulness”. Shermer argues that our brains are “belief engines” which have evolved to be pattern-recognition machines, capable of connecting the dots and creating meaning out of “the patterns that we think we see in nature”. The problem here is that while two or more things are sometimes really related to each other (e.g., when a black panther crosses your path, you could get eaten), sometimes they are not (e.g., when a black house cat crosses your path, bad luck will ensue). When the two things are really related (that is, the pattern really exists), we have learned something that we can use in the future to make “predictions that aid in survival and reproduction”. Good pattern recognizers survived (i.e., they fled the panther and were, thus, not eaten).

Agenticity: Also coined by Shermer, agenticity is the tendency to believe that the world is controlled by invisible, intentional agents: the “tendency to infuse patterns with meaning, intention, and agency”. When the sun ‘disappears’ during a total solar eclipse, getting on your knees and praying, sacrificing a virgin in a volcano, or doing something else right then will appease the gods and cause them to make the sun reappear – which it eventually does. We as humans need this because we need an explanation for things that happen that we don’t understand, and when that is the case, we ascribe the cause or control of that thing to an ‘agent’. This is understandable. Because we are humans, we view the world from within our own human context, where we have the idea that everything that we do is determined by us. In situations where we have no control over what is happening, we ascribe what has happened or is happening to an agent outside of us. Sometimes the agent has a scientific basis (e.g., the movement/orbits of moons and planets, bacteria), but when such explanations are absent, as yet unknown, or beyond our grasp, we ascribe what has happened to other agents (e.g., deities, humours). These agents fill in the gaps in our understanding, allowing us to ‘understand’ a complex and sometimes difficult-to-understand world.

Confirmation bias: We are not objective; we tend to seek and also find confirmatory evidence for what we already believe. Confirmation bias – also sometimes referred to as confirmatory bias – is the tendency to remember (i.e., selectively recall), search for, and/or interpret information in such a way that it confirms our beliefs, hypotheses, or preconceptions. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. It leads us to make inferences in a specific direction, namely towards confirming what we already believe and disproving what we don’t. In doing this we actively seek out, remember, and assign more weight to evidence that confirms, substantiates, or validates our hypothesis, and ignore or assign less weight to evidence that could do the opposite. Climate change deniers, for example, see the last cold winter as ‘proof’ that the atmosphere is not warming up and disregard evidence to the contrary. As such, confirmation bias is actually a form of selection bias in the evidence we see and collect.

Hindsight bias: The New York Yankees catcher and manager Yogi Berra (and/or the quantum physicist and Nobel prize winner Niels Bohr!) once said, “It’s tough to make predictions, especially about the future”. The opposite, however, is fairly easy. Hindsight bias is the tailoring of after-the-fact explanations to what we already know happened. It sums up our proclivity for feeling, after an event has occurred, that we could or should have predicted the event all along. This occurs despite the fact that there was little or even no objective basis for predicting the event prior to its occurrence. After the fact, all of the pieces fall into place, while beforehand we didn’t know what the pieces were or even that there were pieces. What beforehand was seen as a series of coincidences becomes, after the fact, a deterministic chain of events. As such, hindsight bias is related to patternicity in that we connect the dots of occurrences to create a coherent series of events leading to an inevitable end. In case-based research, for example, we know the outcome and may then interpret what has occurred in the case – correctly or incorrectly – in line with the observed outcome. In such cases, not only hindsight bias but also patternicity and confirmation bias can be at play. Together, these form a very poisonous cocktail!

Actor-observer bias: This bias in how we think about and see things seems to be fairly basic to our nature as humans and may play an important role in our difficulty, inability, and/or unwillingness to let go of things that don’t work; that is, in our perpetuation of myths and legends. The actor-observer bias, which is very similar to what we see in attribution theory, refers to our tendency to attribute our own actions and the results thereof to external causes, while attributing other people’s behaviours to internal causes. The bias tends to be strongest when the outcomes are negative. Students, for example, who do poorly on a test find situational factors to justify it, such as saying “The teacher asked a question that was never covered in class”. However, if some other student does poorly on a test, those same students attribute the failure to factors such as laziness and inattentiveness in class. Similarly, when a teacher makes use of a certain teaching approach and it doesn’t work, the teacher often tends to blame the student: the student didn’t listen, didn’t try hard enough, is too dumb to learn it, et cetera. The teacher attributes (and that is why this bias is very strongly related to the attribution error) the failure to the learner’s personal choices, behaviours, disposition, and/or actions.

Groupthink: Groupthink (though often attributed to Irving Janis, the term was actually coined by William Whyte in 1952 in Fortune magazine) leads to a type of bias in which group pressures cause the group (and its members) either not to see or to ignore alternatives. Humans are social in nature; as such, they influence and are influenced by others, and members of a group have a desire for harmony or conformity within the group. Humans also live and function within one or more cultures – groups of people who share beliefs, values, customs, and practices – and this sharing, in turn, influences the members of those groups. If we all do something in a certain way – that is to say, it is popular within a group – then as humans we tend not to question it because “that’s the way it is”. According to Bo Bennett in his book Logically Fallacious, “Using the popularity of a premise or proposition as evidence for its truthfulness…is a fallacy which is very difficult to spot because our ‘common sense’ tells us that if something is popular, it must be good / true / valid”.

Repetition: Related to groupthink is the repetition of a myth, an act which strengthens the myth. Weaver (a researcher at Virginia Tech), studying how ‘a repetitive voice can sound like a chorus’, found that when we repeatedly hear the same information, we come to believe it to be true. Why? Because, as we all learned in our Pedagogy 101 course, repetition causes the brain to remember something more effectively (hence the old adage common to presenters and teachers: tell them what you will present, present it, and then tell them what you presented). Unfortunately, with respect to myths and urban legends, the human brain is tricked into assuming that the memorized ‘fact’ is true. Hearing something repeatedly also deceives our brain into thinking that it has heard the information from many different sources (i.e., the repetitive single voice becomes a chorus of voices).

I’ll close this blog post with two quotes that I feel are apropos:

Man’s mind is so formed that it is far more susceptible to falsehood than to truth. Desiderius Erasmus

Everyone is entitled to his own opinion, but not his own facts. Daniel Patrick Moynihan
